WorldWideScience

Sample records for models standard assessment

  1. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied to enhance university services, since no universal, unified quality standard model exists for assessing the quality criteria of higher education institutions. The analytical hierarchy process is used to identify the…
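
    The analytical hierarchy process mentioned in the abstract derives criterion weights as the principal eigenvector of a pairwise comparison matrix. As an illustration only (the criteria and comparison values below are hypothetical, not taken from the paper), that vector can be approximated by power iteration using just the standard library:

```python
def ahp_priorities(pairwise, iterations=50):
    """Approximate the AHP priority vector (the principal eigenvector of a
    positive pairwise comparison matrix) by power iteration."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w_next = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_next)
        w = [x / total for x in w_next]   # renormalize each sweep
    return w

# Hypothetical criteria: teaching quality, facilities, e-services.
# pairwise[i][j] > 1 means criterion i is preferred over criterion j.
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_priorities(pairwise)
```

Larger weights mark the criteria that dominate the pairwise judgements; a full AHP application would also check the matrix's consistency ratio.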

  2. TECHNICAL PRODUCT RISK ASSESSMENT: STANDARDS, INTEGRATION IN THE ERM MODEL AND UNCERTAINTY MODELING

    Directory of Open Access Journals (Sweden)

    Mirko Djapic

    2016-03-01

    Full Text Available Through its New Approach to technical harmonization and standardization, the European Union has achieved a breakthrough in technical product safety and conformity assessment by integrating product safety requirements into the product development process. This is done by quantifying risk levels in order to determine the scope of the required safety measures and systems. Probability theory is the traditional tool for modeling uncertainty in that risk assessment. Over the last forty years, however, new mathematical theories have been developed that model uncertainty better when data about uncertain events are scarce, which is usually the case during product development. Bayesian networks, based on subjective probability, and evidence networks, based on the Dempster-Shafer theory of belief functions, have proved to be excellent tools for modeling uncertainty when information about the events is incomplete.
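
    The core operation of the evidence networks mentioned above is Dempster's rule of combination, which fuses two independent mass functions and renormalizes away the conflicting mass. A minimal sketch (the frame, focal elements and masses are invented for illustration, not taken from the paper):

```python
from itertools import product

def combine_dempster(m1, m2):
    """Dempster's rule of combination: mass functions map frozenset focal
    elements to masses summing to 1; conflicting mass is renormalized away."""
    combined = {}
    conflict = 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources cannot be combined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two assessors judging whether a failure mode is 'safe' or 'unsafe';
# mass on the full frame {'safe', 'unsafe'} expresses ignorance.
m1 = {frozenset({'safe'}): 0.6, frozenset({'safe', 'unsafe'}): 0.4}
m2 = {frozenset({'unsafe'}): 0.3, frozenset({'safe', 'unsafe'}): 0.7}
m12 = combine_dempster(m1, m2)
```

Unlike a Bayesian prior, the mass left on the full frame lets each source state explicitly how much it does not know.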

  3. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    Science.gov (United States)

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  4. Regional drought assessment using a distributed hydrological model coupled with Standardized Runoff Index

    Directory of Open Access Journals (Sweden)

    H. Shen

    2015-05-01

    Full Text Available Drought assessment is essential for coping with today's frequent droughts. Owing to the large spatio-temporal variations in hydrometeorology across most regions of China, a physically based hydrological model is needed to produce rational spatial and temporal distributions of the hydro-meteorological variables used in drought assessment. In this study, the large-scale distributed hydrological model Variable Infiltration Capacity (VIC) was coupled with a modified Standardized Runoff Index (SRI) for drought assessment in the Weihe River basin, northwest China. The results indicate that the coupled model reasonably reproduces the spatial distribution of drought occurrence. It reflects the spatial heterogeneity of regional drought and improves the physical basis of the SRI. The model also has potential for drought forecasting, early warning and mitigation, provided that accurate meteorological forcing data are available.
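
    The paper's modified SRI is not detailed in the abstract, but the generic idea of a standardized runoff index can be sketched non-parametrically: rank the runoff series, convert the empirical cumulative probabilities to standard normal quantiles, and read strongly negative values as drought. A standard-library illustration (the runoff values are hypothetical):

```python
from statistics import NormalDist

def standardized_runoff_index(runoff):
    """Non-parametric SRI: map each value's Weibull plotting position
    rank/(n+1) through the inverse standard normal CDF. Negative values
    indicate drier-than-normal conditions."""
    n = len(runoff)
    order = sorted(range(n), key=lambda i: runoff[i])
    sri = [0.0] * n
    for rank, i in enumerate(order, start=1):
        p = rank / (n + 1)  # empirical cumulative probability
        sri[i] = NormalDist().inv_cdf(p)
    return sri

monthly_runoff = [12.1, 8.4, 30.2, 5.0, 18.7, 22.3, 9.9, 15.5]  # e.g. mm/month
index = standardized_runoff_index(monthly_runoff)
```

A parametric SRI, as is usual in such studies, would instead fit a distribution (commonly gamma or log-normal) to the simulated runoff before the normal transformation.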

  5. Validated assessment tool paves the way for standardized evaluation of trainees on anastomotic models.

    Science.gov (United States)

    Duran, Cassidy A; Shames, Murray; Bismuth, Jean; Lee, Jason T

    2014-01-01

    Simulation modules allow for the safe practice of certain techniques and are becoming increasingly important in the shift toward education for integrated vascular residents. There is an unquestionable need to standardize the evaluation of trainees on these simulation models to ensure their impact and effectiveness. We sought to validate such an assessment tool for a basic open vascular technique. Vascular fellows, integrated vascular residents, and general surgery residents attending the Society for Clinical Vascular Surgery, Introduction to Academic Vascular Surgery, and Methodist Boot Camp in 2012 were asked to participate in an assessment using multiple anastomotic models and were given 20 minutes to complete an end-to-side anastomosis. Trained vascular faculty evaluated subjects using an assessment tool that included a 25-point checklist and a graded overall global rating scale (GRS) with 8 parameters on a 5-point Likert scale. Self-assessment using the GRS was performed by 20 trainees. Reliability and construct validity were evaluated. Ninety-two trainees were assessed. There was excellent agreement between assessors on 21 of the 25 items, with 2 items found not to be relevant for the bench-top model. Graders agreed that the checklist was prohibitively cumbersome to use. Scores on the global assessments correlated with experience and were higher for the senior trainees, with median global summary scores increasing by postgraduate year. Reliability was confirmed through interrater correlation and internal consistency; internal consistency was 0.92 for the GRS. There was poor correlation between grades given by the expert observers and the self-assessment from the trainee, but good correlation between scores assigned by faculty. Assessment of appropriate hemostasis was poor, which likely reflects the difficulty of evaluating this parameter in the current inanimate model. Performance on an open simulation model evaluated by a standardized global rating scale…
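
    The internal consistency of 0.92 reported for the GRS is a Cronbach's alpha. As an illustration of how that statistic is computed (the ratings below are invented, not study data):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha: `scores` is a list of rows, one row per subject,
    one column per scale item."""
    k = len(scores[0])                                   # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]  # per-item variance
    total_var = pvariance([sum(row) for row in scores])   # variance of totals
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 5-point GRS ratings: 4 trainees scored on 3 parameters.
ratings = [
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 3, 2],
]
alpha = cronbach_alpha(ratings)
```

Values near 1 indicate that the items move together, i.e. the scale is internally consistent.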

  6. The Standardized Professional Encounter: A New Model to Assess Professionalism and Communication Skills.

    Science.gov (United States)

    Lifchez, Scott D; Cooney, Carisa M; Redett, Richard J

    2015-06-01

    Physician-patient communication is vital to patient care, and physician-nurse interactions are equally critical. Conflict between nurses and physicians can greatly impair communication, increasing the risk of treatment errors, yet physicians receive little education during training on recognizing and resolving professional conflicts. We created and implemented the Standardized Professional (S-Pro) Encounter to improve training and provide opportunities to evaluate resident professionalism and communication with health care team colleagues. The standardized patient model is well established for teaching and assessing clinical and communication skills. Using the standardized patient concept, we created a nurse-resident encounter with 2 professionally trained medical portrayers (1 "nurse," 1 "patient"), in which the nurse disagrees with the resident's treatment plan. Residents were surveyed for prior experience with nurse-physician conflict management, and we assessed postencounter for collaborative skills and conflict resolution. All residents (n=18) observed at least 1 physician-nurse conflict in front of patients. Eleven (61%) reported being involved in at least 1 conflict. Twelve residents (67%) had 2 or fewer prior education experiences in interprofessional conflict management. Faculty assessment and S-Pro scores demonstrated high agreement, while resident self-assessment scores demonstrated low agreement with faculty and S-Pro scores. Participants and evaluators found the encounter to be reasonably authentic. There was strong agreement between the faculty and S-Pro assessment of resident performance when using the Boggs scale. The S-Pro Encounter is easily adapted for other clinical situations or training programs, and facilitates the assessment of professionalism and communication skills between residents and other health care professionals.

  7. Tests of Alignment among Assessment, Standards, and Instruction Using Generalized Linear Model Regression

    Science.gov (United States)

    Fulmer, Gavin W.; Polikoff, Morgan S.

    2014-01-01

    An essential component in school accountability efforts is for assessments to be well-aligned with the standards or curriculum they are intended to measure. However, relatively little prior research has explored methods to determine statistical significance of alignment or misalignment. This study explores analyses of alignment as a special case…

  8. Beyond the standard model

    International Nuclear Information System (INIS)

    Wilczek, F.

    1993-01-01

    The standard model of particle physics is highly successful, although it is obviously not a complete or final theory. In this presentation the author argues that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Essentially, this presentation is a record of the author's own judgement of what the central clues for physics beyond the standard model are, and also it is an attempt at some pedagogy. 14 refs., 6 figs

  9. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  10. Scan-To Output Validation: Towards a Standardized Geometric Quality Assessment of Building Information Models Based on Point Clouds

    Science.gov (United States)

    Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.

    2017-11-01

    The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
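
    At its core, the per-object comparison described above measures deviations between sampled model surfaces and the scan, then thresholds them. A brute-force sketch (the coordinates and tolerance are invented; production tools use spatial indices and true cloud-to-mesh distances rather than nearest-point search):

```python
import math

def deviation_analysis(model_pts, cloud_pts, max_dev):
    """For each sampled model surface point, find the nearest scan point.
    Deviations above max_dev flag occluded zones or non-modeled geometry;
    the rest can be binned into LOA-style tolerance bands."""
    report = []
    for mp in model_pts:
        d = min(math.dist(mp, cp) for cp in cloud_pts)
        report.append((mp, d, d <= max_dev))
    return report

# Hypothetical wall samples (metres) compared against a sparse scan.
wall_samples = [(0.0, 0.0, 1.0), (0.5, 0.0, 1.0), (1.0, 0.0, 1.0)]
scan = [(0.01, 0.0, 1.0), (0.52, 0.01, 1.0), (3.0, 0.0, 1.0)]
report = deviation_analysis(wall_samples, scan, max_dev=0.05)
```

Swapping the argument order gives the reverse comparison (cloud to model) that the paper uses to reveal non-modeled objects.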

  11. The Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    Suggested Readings: Aspects of Quantum Chromodynamics/A. Pich, arXiv:hep-ph/0001118; The Standard Model of Electroweak Interactions/A. Pich, arXiv:hep-ph/0502010; The Standard Model of Particle Physics/A. Pich. The Standard Model of Elementary Particle Physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.

  12. Beyond the standard model

    International Nuclear Information System (INIS)

    Pleitez, V.

    1994-01-01

    The search for physical laws beyond the standard model is discussed in a general way, along with some topics in supersymmetric theories. Recent possibilities arising in the leptonic sector are also addressed. Finally, models with SU(3)_c × SU(2)_L × U(1)_Y symmetry are considered as alternative extensions of the standard model of elementary particles. 36 refs., 1 fig., 4 tabs

  13. Phd study of reliability and validity: One step closer to a standardized music therapy assessment model

    DEFF Research Database (Denmark)

    Jacobsen, Stine Lindahl

    The paper will present a PhD study concerning the reliability and validity of the music therapy assessment model "Assessment of Parenting Competences" (APC) in the area of families with emotionally neglected children. The study had a multiple-strategy design with a philosophical base of critical realism and pragmatism. The fixed design was a between- and within-groups design testing the APC's reliability and validity; the two groups were parents of neglected children and parents of non-neglected children. The flexible design used a multiple case study strategy, taking the interplay of turns between parent and child as the case under study, comparing clinical and non-clinical groups and looking for differences in patterns of interaction. The flexible design informed the fixed design and led to further valuable statistical analysis. The presenter will provide an overview…

  14. Beyond the standard model

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1990-04-01

    The unresolved issues of the standard model are reviewed, with emphasis on the gauge hierarchy problem. A possible mechanism for generating a hierarchy in the context of superstring theory is described. 24 refs

  15. Standardized echocardiographic assessment of cardiac function in normal adult zebrafish and heart disease models

    Directory of Open Access Journals (Sweden)

    Louis W. Wang

    2017-01-01

    Full Text Available The zebrafish (Danio rerio) is an increasingly popular model organism in cardiovascular research. Major insights into cardiac developmental processes have been gained by studies of embryonic zebrafish. However, the utility of zebrafish for modeling adult-onset heart disease has been limited by a lack of robust methods for in vivo evaluation of cardiac function. We established a physiological protocol for underwater zebrafish echocardiography using high-frequency ultrasound, and evaluated its reliability in detecting altered cardiac function in two disease models. Serial assessment of cardiac function was performed in wild-type zebrafish aged 3 to 12 months, and the effects of anesthetic agents, age, sex and background strain were evaluated. There was a varying extent of bradycardia and ventricular contractile impairment with different anesthetic drugs and doses, with tricaine 0.75 mmol l−1 having a relatively more favorable profile. When compared with males, female fish were larger and had more measurement variability. Although age-related increments in ventricular chamber size were greater in females than males, there were no sex differences when data were normalized to body size. Systolic ventricular function was similar in both sexes at all time points, but differences in diastolic function were evident from 6 months onwards. Wild-type fish of both sexes showed a reliance on atrial contraction for ventricular diastolic filling. Echocardiographic evaluation of adult zebrafish with diphtheria toxin-induced myocarditis or anemia-induced volume overload accurately identified ventricular dilation and altered contraction, with suites of B-mode, ventricular strain, pulsed-wave Doppler and tissue Doppler indices showing concordant changes indicative of myocardial hypocontractility or hypercontractility, respectively. Repeatability, intra-observer and inter-observer correlations for echocardiographic measurements were high. We demonstrate that…

  16. Beyond the standard model

    International Nuclear Information System (INIS)

    Cuypers, F.

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs

  17. Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e+e− colliders.

  18. Beyond the Standard Model

    International Nuclear Information System (INIS)

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e+e− colliders.

  19. Conference: STANDARD MODEL @ LHC

    CERN Multimedia

    2012-01-01

    HCØ Institute, Universitetsparken 5, DK-2100 Copenhagen Ø, Denmark. Room: Auditorium 2. STANDARD MODEL @ LHC, Niels Bohr International Academy and Discovery Center, 10-13 April 2012. This four-day meeting will bring together both experimental and theoretical aspects of Standard Model phenomenology at the LHC. The very latest results from the LHC experiments will be under discussion. Topics covered will be split into the following categories: QCD (hard, soft and PDFs); vector boson production; Higgs searches; top quark physics; flavour physics.

  20. The Standard Model

    Science.gov (United States)

    Burgess, Cliff; Moore, Guy

    2012-04-01

    List of illustrations; List of tables; Preface; Acknowledgments; Part I. Theoretical Framework: 1. Field theory review; 2. The standard model: general features; 3. Cross sections and lifetimes; Part II. Applications: Leptons: 4. Elementary boson decays; 5. Leptonic weak interactions: decays; 6. Leptonic weak interactions: collisions; 7. Effective Lagrangians; Part III. Applications: Hadrons: 8. Hadrons and QCD; 9. Hadronic interactions; Part IV. Beyond the Standard Model: 10. Neutrino masses; 11. Open questions, proposed solutions; Appendix A. Experimental values for the parameters; Appendix B. Symmetries and group theory review; Appendix C. Lorentz group and the Dirac algebra; Appendix D. ξ-gauge Feynman rules; Appendix E. Metric convention conversion table; Select bibliography; Index.

  1. Beyond the Standard Model

    CERN Document Server

    Csáki, Csaba

    2015-01-01

    We introduce aspects of physics beyond the Standard Model, focusing on supersymmetry, extra dimensions, and a composite Higgs as solutions to the hierarchy problem. Lectures given at the 2013 European School of High Energy Physics, Parádfürdő, Hungary, 5-18 June 2013.

  2. Beyond the Standard Model

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future. Supersymmetry, grand unification, extra dimensions and string theory will be presented.

  3. DOE limited standard: Operations assessments

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-05-01

    The purpose of this standard is to provide DOE Field Element assessors with a guide for conducting operations assessments, and to provide DOE Field Element managers with the criteria of the EM Operations Assessment Program. Sections 6.1 to 6.21 provide examples of how to assess specific areas; the general techniques of operations assessments (Section 5) may be applied to other areas of health and safety (e.g. fire protection, criticality safety, quality assurance, occupational safety, etc.).

  4. Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Lykken, Joseph D.; /Fermilab

    2010-05-01

    'BSM physics' is a phrase used in several ways. It can refer to physical phenomena established experimentally but not accommodated by the Standard Model, in particular dark matter and neutrino oscillations (technically also anything that has to do with gravity, since gravity is not part of the Standard Model). 'Beyond the Standard Model' can also refer to possible deeper explanations of phenomena that are accommodated by the Standard Model but only with ad hoc parameterizations, such as Yukawa couplings and the strong CP angle. More generally, BSM can be taken to refer to any possible extension of the Standard Model, whether or not the extension solves any particular set of puzzles left unresolved in the SM. In this general sense one sees reference to the BSM 'theory space' of all possible SM extensions, this being a parameter space of coupling constants for new interactions, new charges or other quantum numbers, and parameters describing possible new degrees of freedom or new symmetries. Despite decades of model-building it seems unlikely that we have mapped out most of, or even the most interesting parts of, this theory space. Indeed we do not even know what is the dimensionality of this parameter space, or what fraction of it is already ruled out by experiment. Since Nature is only implementing at most one point in this BSM theory space (at least in our neighborhood of space and time), it might seem an impossible task to map back from a finite number of experimental discoveries and measurements to a unique BSM explanation. Fortunately for theorists the inevitable limitations of experiments themselves, in terms of resolutions, rates, and energy scales, means that in practice there are only a finite number of BSM model 'equivalence classes' competing at any given time to explain any given set of results. BSM phenomenology is a two-way street: not only do experimental results test or constrain BSM models, they also suggest

  5. Testing the Standard Model

    CERN Document Server

    Riles, K

    1998-01-01

    The Large Electron-Positron (LEP) collider near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.

  6. Standard Model physics

    CERN Multimedia

    Altarelli, Guido

    1999-01-01

    Introduction: structure of gauge theories; the QED and QCD examples; chiral theories. The electroweak theory: spontaneous symmetry breaking; the Higgs mechanism; gauge boson and fermion masses; Yukawa couplings; charged current couplings; the Cabibbo-Kobayashi-Maskawa matrix and CP violation; neutral current couplings; the Glashow-Iliopoulos-Maiani mechanism; gauge boson and Higgs couplings. Radiative corrections and loops; cancellation of the chiral anomaly. Limits on the Higgs; comparison. Problems of the Standard Model. Outlook.

  7. Standard model and beyond

    International Nuclear Information System (INIS)

    Quigg, C.

    1984-09-01

    The SU(3)_c × SU(2)_L × U(1)_Y gauge theory of interactions among quarks and leptons is briefly described, and some recent notable successes of the theory are mentioned. Some shortcomings in our ability to apply the theory are noted, and the incompleteness of the standard model is exhibited. Experimental hints that Nature may be richer in structure than the minimal theory are discussed. 23 references

  8. A REGIONAL MODELING STRUCTURE FOR ASSESSING COSTS OF IMPLEMENTING MANURE NUTRIENT STANDARDS: APPLICATION TO THE CHESAPEAKE BAY WATERSHED

    OpenAIRE

    Aillery, Marcel P.; Gollehon, Noel R.; Ribaudo, Marc

    2003-01-01

    A Chesapeake Bay Watershed manure management model estimates the minimum regional net cost of land-applying manure at $76 million under a multi-year phosphorus standard, with an assumed manure acceptance rate of 60 percent of cropland. The multi-year standard represents a savings of 17 percent relative to an annual phosphorus standard.

  9. Quasi standard model physics

    International Nuclear Information System (INIS)

    Peccei, R.D.

    1986-01-01

    Possible small extensions of the standard model are considered, which are motivated by the strong CP problem and by the baryon asymmetry of the Universe. Phenomenological arguments are given which suggest that imposing a PQ symmetry to solve the strong CP problem is only tenable if the scale of the PQ breakdown is much above M_W. Furthermore, an attempt is made to connect the scale of the PQ breakdown to that of the breakdown of lepton number. It is argued that in these theories the same intermediate scale may be responsible for the baryon number of the Universe, provided the Kuzmin-Rubakov-Shaposhnikov (B+L) erasing mechanism is operative. (orig.)

  10. Standard-model bundles

    CERN Document Server

    Donagi, Ron; Pantev, Tony; Waldram, Dan; Donagi, Ron; Ovrut, Burt; Pantev, Tony; Waldram, Dan

    2002-01-01

    We describe a family of genus one fibered Calabi-Yau threefolds with fundamental group ${\mathbb Z}/2$. On each Calabi-Yau $Z$ in the family we exhibit a positive dimensional family of Mumford stable bundles whose symmetry group is the Standard Model group $SU(3)\times SU(2)\times U(1)$ and which have $c_{3} = 6$. We also show that for each bundle $V$ in our family, $c_{2}(Z) - c_{2}(V)$ is the class of an effective curve on $Z$. These conditions ensure that $Z$ and $V$ can be used for a phenomenologically relevant compactification of Heterotic M-theory.

  11. Impact assessment of commodity standards

    NARCIS (Netherlands)

    Ruben, Ruerd

    2017-01-01

    Voluntary commodity standards are widely used to enhance the performance of tropical agro-food chains and to support the welfare and sustainability of smallholder farmers. Different methods and approaches are used to assess the effectiveness and impact of these certification schemes at…

  12. The standard model

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1994-03-01

    In these lectures, my aim is to provide a survey of the standard model with emphasis on its renormalizability and electroweak radiative corrections. Since this is a school, I will try to be somewhat pedagogical by providing examples of loop calculations. In that way, I hope to illustrate some of the commonly employed tools of particle physics. With those goals in mind, I have organized my presentations as follows: In Section 2, renormalization is discussed from an applied perspective. The technique of dimensional regularization is described and used to define running couplings and masses. The utility of the renormalization group for computing leading logs is illustrated for the muon anomalous magnetic moment. In Section 3, electroweak radiative corrections are discussed. Standard model predictions are surveyed and used to constrain the top quark mass. The S, T, and U parameters are introduced and employed to probe for "new physics". The effect of Z' bosons on low energy phenomenology is described. In Section 4, a detailed illustration of electroweak radiative corrections is given for atomic parity violation. Finally, in Section 5, I conclude with an outlook for the future.

  13. Structure of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Langacker, Paul [Pennsylvania Univ., PA (United States). Dept. of Physics

    1996-07-01

    This lecture presents the structure of the standard model, covering the standard model Lagrangian; spontaneous symmetry breaking; gauge interactions, including charged currents, quantum electrodynamics, the neutral current and gauge self-interactions; and problems with the standard model, such as the gauge, fermion, Higgs and hierarchy problems, strong CP and the graviton problem.

  14. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM)

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J.

    2015-01-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM) in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was exchange of case report forms data but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data mainly because it is outside the original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. PMID:26188274

  15. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM).

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J

    2015-10-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM) in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was exchange of case report forms data but it is increasingly utilized in other contexts. An ODM extension called Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data mainly because it is outside the original development goal. ODM provides comprehensive support for representation of case report forms (in both the design stage and with patient level data). Inclusion of requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) can further improve future revisions of the standard. Published by Elsevier Inc.
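As a rough illustration of the structured study representation the abstract describes, the sketch below builds a minimal ODM metadata skeleton with Python's standard-library ElementTree. The element names (ODM, Study, MetaDataVersion, FormDef) follow CDISC ODM 1.3, but the OIDs, attribute set, and namespace handling here are simplified assumptions for illustration, not a complete or schema-validated ODM document.

```python
import xml.etree.ElementTree as ET

# Assumed ODM 1.3 namespace; registered as the default namespace for output.
ODM_NS = "http://www.cdisc.org/ns/odm/v1.3"
ET.register_namespace("", ODM_NS)

def build_minimal_odm(study_oid: str, form_name: str) -> ET.Element:
    """Build a bare-bones ODM metadata tree: ODM > Study > MetaDataVersion > FormDef.
    OIDs and attributes are illustrative placeholders."""
    odm = ET.Element(f"{{{ODM_NS}}}ODM",
                     {"FileType": "Snapshot", "FileOID": "example.1"})
    study = ET.SubElement(odm, f"{{{ODM_NS}}}Study", {"OID": study_oid})
    mdv = ET.SubElement(study, f"{{{ODM_NS}}}MetaDataVersion",
                        {"OID": "MDV.1", "Name": "Version 1"})
    ET.SubElement(mdv, f"{{{ODM_NS}}}FormDef",
                  {"OID": "F.1", "Name": form_name, "Repeating": "No"})
    return odm

odm = build_minimal_odm("ST.001", "Baseline Case Report Form")
xml_text = ET.tostring(odm, encoding="unicode")
print(xml_text)
```

A real study definition would add ItemGroupDef/ItemDef elements and, for patient-level data, a ClinicalData section; this only shows the nesting the standard prescribes.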

  16. Using Distractor-Driven Standards-Based Multiple-Choice Assessments and Rasch Modeling to Investigate Hierarchies of Chemistry Misconceptions and Detect Structural Problems with Individual Items

    Science.gov (United States)

    Herrmann-Abell, Cari F.; DeBoer, George E.

    2011-01-01

    Distractor-driven multiple-choice assessment items and Rasch modeling were used as diagnostic tools to investigate students' understanding of middle school chemistry ideas. Ninety-one items were developed according to a procedure that ensured content alignment to the targeted standards and construct validity. The items were administered to 13360…

  17. Beyond Standard Model Physics

    Energy Technology Data Exchange (ETDEWEB)

    Bellantoni, L.

    2009-11-01

There are many recent results from searches for fundamental new physics using the Tevatron, the SLAC B-factory and HERA. This talk quickly reviewed searches for pair-produced stop, for gauge-mediated SUSY breaking, for Higgs bosons in the MSSM and NMSSM models, for leptoquarks, and for v-hadrons. There is a SUSY model which accommodates the recent astrophysical experimental results suggesting that dark matter annihilation is occurring in the center of our galaxy, and a relevant experimental result is presented. Finally, model-independent searches at D0, CDF, and H1 are discussed.

  18. Assessing ballast treatment standards for effect on rate of establishment using a stochastic model of the green crab

    Directory of Open Access Journals (Sweden)

    Cynthia Cooper

    2012-03-01

Full Text Available This paper describes a stochastic model used to characterize the probability/risk of NIS establishment from ships' ballast water discharges. Establishment is defined as the existence of a sufficient number of individuals of a species to provide for a sustained population of the organism. The inherent variability in population dynamics of organisms in their native or established environments is generally difficult to quantify. Much qualitative information is known about organism life cycles and biotic and abiotic environmental pressures on the population, but generally little quantitative data exist to develop a mechanistic model of populations in such complex environments. Moreover, there is little quantitative data to characterize the stochastic fluctuations of population size over time even without accounting for systematic responses to biotic and abiotic pressures. This research applies an approach using life-stage density and fecundity measures reported in research to determine a stochastic model of an organism's population dynamics. The model is illustrated with data from research studies on the green crab that span a range of habitats of the established organism and were collected over some years to represent a range of time-varying biotic and abiotic conditions that are expected to exist in many receiving environments. This model is applied to introductions of NIS at the IMO D-2 and the U.S. ballast water discharge standard levels designated as Phase Two in the United States Coast Guard's Notice of Proposed Rulemaking. Under a representative range of ballast volumes discharged at U.S. ports, the average rate of establishment of green crabs for ballast waters treated to the IMO D-2 concentration standard (less than 10 organisms/m3) is predicted to be reduced to about a third of the average rate from untreated ballast water discharge. The longevity of populations from the untreated ballast water discharges is expected to be reduced by about 90% by
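The reduction in establishment rate under a discharge concentration standard can be sketched with a toy Monte Carlo model. The sketch below is not the paper's model: the per-organism establishment probability, discharge volume, and independence assumption are all illustrative placeholders, used only to show how a concentration standard translates into a lower per-discharge establishment probability.

```python
import random

def establishment_rate(conc_per_m3, volume_m3, p_establish_per_org,
                       trials=10_000, seed=1):
    """Toy Monte Carlo: fraction of discharge events in which at least one
    organism founds a sustained population. Organisms are assumed to
    establish independently with a small probability; all parameter values
    are illustrative assumptions, not estimates from the paper."""
    rng = random.Random(seed)
    n = conc_per_m3 * volume_m3                  # expected organisms discharged
    p_none = (1.0 - p_establish_per_org) ** n    # chance no organism establishes
    hits = 0
    for _ in range(trials):
        if rng.random() > p_none:
            hits += 1
    return hits / trials

# Treated to the IMO D-2 standard (<10 organisms/m3) vs. an assumed
# untreated concentration of 1000 organisms/m3, for a 100 m3 discharge.
treated = establishment_rate(10, 100, 1e-4)
untreated = establishment_rate(1000, 100, 1e-4)
print(treated, untreated)
```

Under these toy numbers the treatment standard cuts the per-discharge establishment probability sharply; the paper's life-stage/fecundity model captures population dynamics this sketch deliberately omits.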

  19. Chapter 1: Standard Model processes

    OpenAIRE

    Becher, Thomas

    2017-01-01

    This chapter documents the production rates and typical distributions for a number of benchmark Standard Model processes, and discusses new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  20. A standardized patient model to teach and assess professionalism and communication skills: the effect of personality type on performance.

    Science.gov (United States)

    Lifchez, Scott D; Redett, Richard J

    2014-01-01

    Teaching and assessing professionalism and interpersonal communication skills can be more difficult for surgical residency programs than teaching medical knowledge or patient care, for which many structured educational curricula and assessment tools exist. Residents often learn these skills indirectly, by observing the behavior of their attendings when communicating with patients and colleagues. The purpose of this study was to assess the results of an educational curriculum we created to teach and assess our residents in professionalism and communication. We assessed resident and faculty prior education in delivering bad news to patients. Residents then participated in a standardized patient (SP) encounter to deliver bad news to a patient's family regarding a severe burn injury. Residents received feedback from the encounter and participated in an education curriculum on communication skills and professionalism. As a part of this curriculum, residents underwent assessment of communication style using the Myers-Briggs type inventory. The residents then participated in a second SP encounter discussing a severe pulmonary embolus with a patient's family. Resident performance on the SP evaluation correlated with an increased comfort in delivering bad news. Comfort in delivering bad news did not correlate with the amount of prior education on the topic for either residents or attendings. Most of our residents demonstrated an intuitive thinking style (NT) on the Myers-Briggs type inventory, very different from population norms. The lack of correlation between comfort in delivering bad news and prior education on the subject may indicate the difficulty in imparting communication and professionalism skills to residents effectively. Understanding communication style differences between our residents and the general population can help us teach professionalism and communication skills more effectively. With the next accreditation system, residency programs would need to

  1. Assessing behind armor blunt trauma (BABT) under NIJ standard-0101.04 conditions using human torso models.

    Science.gov (United States)

    Merkle, Andrew C; Ward, Emily E; O'Connor, James V; Roberts, Jack C

    2008-06-01

    Although soft armor vests serve to prevent penetrating wounds and dissipate impact energy, the potential of nonpenetrating injury to the thorax, termed behind armor blunt trauma, does exist. Currently, the ballistic resistance of personal body armor is determined by impacting a soft armor vest over a clay backing and measuring the resulting clay deformation as specified in National Institute of Justice (NIJ) Standard-0101.04. This research effort evaluated the efficacy of a physical Human Surrogate Torso Model (HSTM) as a device for determining thoracic response when exposed to impact conditions specified in the NIJ Standard. The HSTM was subjected to a series of ballistic impacts over the sternum and stomach. The pressure waves propagating through the torso were measured with sensors installed in the organs. A previously developed Human Torso Finite Element Model (HTFEM) was used to analyze the amount of tissue displacement during impact and compared with the amount of clay deformation predicted by a validated finite element model. All experiments and simulations were conducted at NIJ Standard test conditions. When normalized by the response at the lowest threat level (Level I), the clay deformations for the higher levels are relatively constant and range from 2.3 to 2.7 times that of the base threat level. However, the pressures in the HSTM increase with each test level and range from three to seven times greater than Level I depending on the organ. The results demonstrate the abilities of the HSTM to discriminate between threat levels, impact conditions, and impact locations. The HTFEM and HSTM are capable of realizing pressure and displacement differences because of the level of protection, surrounding tissue, and proximity to the impact point. The results of this research provide insight into the transfer of energy and pressure wave propagation during ballistic impacts using a physical surrogate and computational model of the human torso.
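The normalization step described above (expressing each threat level's response as a multiple of the Level I response) is simple to make concrete. In the sketch below the numeric values are illustrative placeholders, not measurements from the study; only the normalization itself reflects the abstract.

```python
def normalize_to_base(responses: dict) -> dict:
    """Express each threat level's measured response as a ratio to the
    Level I (base threat) response, as in the clay-deformation comparison."""
    base = responses["I"]
    return {level: value / base for level, value in responses.items()}

# Hypothetical clay deformations (mm) by NIJ threat level -- illustrative only.
clay = {"I": 10.0, "II": 23.0, "IIIA": 25.0}
print(normalize_to_base(clay))
```

With numbers like these, the higher levels cluster in the 2.3-2.7x band the abstract reports for clay, whereas the torso-model pressures spread from 3x to 7x, which is what lets the HSTM discriminate between threat levels.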

  2. Research Notes - Openness and Evolvability - Standards Assessment

    Science.gov (United States)

    2016-08-01

…and processes developed to assess system-level Openness and Evolvability. The Research Notes within this report focus on Standards Assessment, e.g.: Is the standard available without an unreasonable financial burden to any vendor that wishes to access it? If a cost is charged at all, the…

  3. GAIA Service and Standard Assessment

    DEFF Research Database (Denmark)

    Dormann, Claire; Øst, Alexander Gorm

A delivery from the ACTS-project GAIA. The report validates the GAIA architecture and standard. It provides results concerning the deployment of distributed brokerage systems over broadband networks.

  4. Framework for Designing The Assessment Models of Readiness SMEs to Adopt Indonesian National Standard (SNI), Case Study: SMEs Batik in Surakarta

    Science.gov (United States)

    Fahma, Fakhrina; Zakaria, Roni; Fajar Gumilang, Royan

    2018-03-01

Since the ASEAN Economic Community (AEC) was launched, the opportunity to expand market share has become very open, but the level of competition is also very high. Standardization is believed to be an important factor in seizing opportunities in the AEC era and under other free trade agreements in the future. Standardization activities in industry can be demonstrated by obtaining SNI (Indonesian National Standard) certification. This is a challenge for SMEs, considering that currently only 20% of SMEs hold SNI certification for either product or process. In this research, a readiness assessment model for obtaining SNI certification is designed for SMEs. The stages of model development use the innovation-adoption approach of Rogers (2003). Variables that affect the readiness of SMEs are obtained from the product certification requirements established by BSN (National Standardization Agency) and LSPro (certification bodies). The model is used to map the readiness of SMEs' products for SNI certification; the level of readiness of an SME is determined by the percentage of compliance with those requirements. Based on the results of this study, five variables are determined as the main aspects for assessing SME readiness. For model validation, trials were conducted on batik SMEs in Laweyan, Surakarta.
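The "percentage of compliance" readiness score described above reduces to a simple calculation. The five requirement names below are hypothetical placeholders (the paper derives its five aspects from BSN/LSPro certification requirements, which are not enumerated in the abstract); only the scoring rule is taken from the text.

```python
def readiness_percentage(requirements_met: dict) -> float:
    """Readiness = share of certification requirements satisfied, in percent."""
    met = sum(1 for ok in requirements_met.values() if ok)
    return 100.0 * met / len(requirements_met)

# Hypothetical assessment of one batik SME against five illustrative aspects.
sme = {
    "raw_material_control": True,
    "production_process": True,
    "product_testing": False,
    "quality_management": True,
    "labeling": False,
}
print(readiness_percentage(sme))  # -> 60.0
```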

  5. Physics beyond the Standard Model

    CERN Document Server

    Valle, José W F

    1991-01-01

We discuss some of the signatures associated with extensions of the Standard Model related to the neutrino and electroweak symmetry breaking sectors, with and without supersymmetry. The topics include a basic discussion of the theory of neutrino mass and the corresponding extensions of the Standard Model that incorporate massive neutrinos; an overview of the present observational status of neutrino mass searches, with emphasis on solar neutrinos, as well as the cosmological data on the amplitude of primordial density fluctuations; the implications of neutrino mass in cosmological nucleosynthesis, non-accelerator, as well as in high energy particle collider experiments. Turning to the electroweak breaking sector, we discuss the physics potential for Higgs boson searches at LEP200, including Majoron extensions of the Standard Model, and the physics of invisibly decaying Higgs bosons. We discuss the minimal supersymmetric Standard Model phenomenology, as well as some of the laboratory signatures that would be as...

  6. Physics Beyond the Standard Model

    CERN Document Server

    Ellis, John

    2009-01-01

The Standard Model is in good shape, apart possibly from g_\mu - 2 and some niggling doubts about the electroweak data. Something like a Higgs boson is required to provide particle masses, but theorists are actively considering alternatives. The problems of flavour, unification and quantum gravity will require physics beyond the Standard Model, and astrophysics and cosmology also provide reasons to expect physics beyond the Standard Model, in particular to provide the dark matter and explain the origin of the matter in the Universe. Personally, I find supersymmetry to be the most attractive option for new physics at the TeV scale. The LHC should establish the origin of particle masses, has good prospects for discovering dark matter, and might also cast light on unification and even quantum gravity. Important roles may also be played by lower-energy experiments, astrophysics and cosmology in the searches for new physics beyond the Standard Model.

  7. Beyond the standard model; Au-dela du modele standard

    Energy Technology Data Exchange (ETDEWEB)

    Cuypers, F. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs.

  8. Programmatic Environmental Assessment for Standard Targetry Replacement

    National Research Council Canada - National Science Library

    2006-01-01

    This Programmatic Environmental Assessment (PEA) evaluates potential direct, indirect, and cumulative impacts of standard targetry replacement and alternatives on environmental and land use resources...

  9. About the standard solar model

    International Nuclear Information System (INIS)

    Cahen, S.

    1986-07-01

A discussion of the still controversial solar helium content is presented, based on a comparison of recent standard solar models. Our latest model yields a helium mass fraction of ∼0.276, with predicted neutrino capture rates of 6.4 SNU for ³⁷Cl and 126 SNU for ⁷¹Ga.

  10. The standard model and colliders

    International Nuclear Information System (INIS)

    Hinchliffe, I.

    1987-03-01

    Some topics in the standard model of strong and electroweak interactions are discussed, as well as how these topics are relevant for the high energy colliders which will become operational in the next few years. The radiative corrections in the Glashow-Weinberg-Salam model are discussed, stressing how these corrections may be measured at LEP and the SLC. CP violation is discussed briefly, followed by a discussion of the Higgs boson and the searches which are relevant to hadron colliders are then discussed. Some of the problems which the standard model does not solve are discussed, and the energy ranges accessible to the new colliders are indicated

  11. Dynamics of the standard model

    CERN Document Server

    Donoghue, John F; Holstein, Barry R

    2014-01-01

    Describing the fundamental theory of particle physics and its applications, this book provides a detailed account of the Standard Model, focusing on techniques that can produce information about real observed phenomena. The book begins with a pedagogic account of the Standard Model, introducing essential techniques such as effective field theory and path integral methods. It then focuses on the use of the Standard Model in the calculation of physical properties of particles. Rigorous methods are emphasized, but other useful models are also described. This second edition has been updated to include recent theoretical and experimental advances, such as the discovery of the Higgs boson. A new chapter is devoted to the theoretical and experimental understanding of neutrinos, and major advances in CP violation and electroweak physics have been given a modern treatment. This book is valuable to graduate students and researchers in particle physics, nuclear physics and related fields.

  12. Establishing the isolated Standard Model

    International Nuclear Information System (INIS)

    Wells, James D.; Zhang, Zhengkang; Zhao, Yue

    2017-02-01

The goal of this article is to initiate a discussion on what it takes to claim "there is no new physics at the weak scale," namely that the Standard Model (SM) is "isolated." The lack of discovery of beyond the SM (BSM) physics suggests that this may be the case. But to truly establish this statement requires proving all "connected" BSM theories are false, which presents a significant challenge. We propose a general approach to quantitatively assess the current status and future prospects of establishing the isolated SM (ISM), which we give a reasonable definition of. We consider broad elements of BSM theories, and show many examples where current experimental results are not sufficient to verify the ISM. In some cases, there is a clear roadmap for the future experimental program, which we outline, while in other cases, further efforts - both theoretical and experimental - are needed in order to robustly claim the establishment of the ISM in the absence of new physics discoveries.

  13. Establishing the isolated standard model

    Science.gov (United States)

    Wells, James D.; Zhang, Zhengkang; Zhao, Yue

    2017-07-01

    The goal of this article is to initiate a discussion on what it takes to claim "there is no new physics at the weak scale," namely that the Standard Model (SM) is "isolated." The lack of discovery of beyond the SM (BSM) physics suggests that this may be the case. But to truly establish this statement requires proving all "connected" BSM theories are false, which presents a significant challenge. We propose a general approach to quantitatively assess the current status and future prospects of establishing the isolated SM (ISM), which we give a reasonable definition of. We consider broad elements of BSM theories, and show many examples where current experimental results are not sufficient to verify the ISM. In some cases, there is a clear roadmap for the future experimental program, which we outline, while in other cases, further efforts—both theoretical and experimental—are needed in order to robustly claim the establishment of the ISM in the absence of new physics discoveries.

  14. The standard model and beyond

    CERN Document Server

    Langacker, Paul

    2017-01-01

    This new edition of The Standard Model and Beyond presents an advanced introduction to the physics and formalism of the standard model and other non-abelian gauge theories. It provides a solid background for understanding supersymmetry, string theory, extra dimensions, dynamical symmetry breaking, and cosmology. In addition to updating all of the experimental and phenomenological results from the first edition, it contains a new chapter on collider physics; expanded discussions of Higgs, neutrino, and dark matter physics; and many new problems. The book first reviews calculational techniques in field theory and the status of quantum electrodynamics. It then focuses on global and local symmetries and the construction of non-abelian gauge theories. The structure and tests of quantum chromodynamics, collider physics, the electroweak interactions and theory, and the physics of neutrino mass and mixing are thoroughly explored. The final chapter discusses the motivations for extending the standard model and examin...

  15. Standard model of knowledge representation

    Science.gov (United States)

    Yin, Wensheng

    2016-09-01

Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between the various knowledge representation methods, a unified knowledge representation model is necessary. According to ontology, system theory, and control theory, a standard model of knowledge representation that reflects the change of the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods: it can express knowledge in multivariate and multidimensional terms, it can express process knowledge, and at the same time it has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.

  16. Extensions of the Standard Model

    CERN Document Server

    Zwirner, Fabio

    1996-01-01

    Rapporteur talk at the International Europhysics Conference on High Energy Physics, Brussels (Belgium), July 27-August 2, 1995. This talk begins with a brief general introduction to the extensions of the Standard Model, reviewing the ideology of effective field theories and its practical implications. The central part deals with candidate extensions near the Fermi scale, focusing on some phenomenological aspects of the Minimal Supersymmetric Standard Model. The final part discusses some possible low-energy implications of further extensions near the Planck scale, namely superstring theories.

  17. Physics beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Valle, J.W.F. [Valencia Univ. (Spain). Dept. de Fisica Teorica]. E-mail: valle@flamenco.uv.es

    1996-07-01

We discuss some of the signatures associated with extensions of the Standard Model related to the neutrino and electroweak symmetry breaking sectors, with and without supersymmetry. The topics include a basic discussion of the theory of neutrino mass and the corresponding extensions of the Standard Model that incorporate massive neutrinos; an overview of the present observational status of neutrino mass searches, with emphasis on solar neutrinos, as well as cosmological data on the amplitude of primordial density fluctuations; the implications of neutrino mass in cosmological nucleosynthesis, non-accelerator, as well as in high energy particle collider experiments. Turning to the electroweak breaking sector, we discuss the physics potential for Higgs boson searches at LEP200, including Majoron extensions of the Standard Model, and the physics of invisibly decaying Higgs bosons. We discuss the minimal supersymmetric Standard Model phenomenology, as well as some of the laboratory signatures that would be associated with models with R parity violation, especially in Z and scalar boson decays. (author)

  18. Custom v. Standardized Risk Models

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-05-01

Full Text Available We discuss when and why custom multi-factor risk models are warranted and give source code for computing some risk factors. Pension/mutual funds do not require customization but standardization. However, using standardized risk models in quant trading with much shorter holding horizons is suboptimal: (1) longer-horizon risk factors (value, growth, etc.) increase noise trades and trading costs; (2) arbitrary risk factors can neutralize alpha; (3) “standardized” industries are artificial and insufficiently granular; (4) normalization of style risk factors is lost for the trading universe; (5) diversifying risk models lowers P&L correlations, reduces turnover and market impact, and increases capacity. We discuss various aspects of custom risk model building.
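One common building block of such factor construction is cross-sectional standardization of a style signal over the trading universe. The sketch below is a generic illustration, not the paper's source code: the 12-month return signal, ticker names, and z-scoring convention are all assumptions made for the example.

```python
import statistics

def momentum_factor(returns_12m: dict) -> dict:
    """Toy style-factor construction: standardize a 12-month return signal
    cross-sectionally to zero mean and unit variance over the universe.
    Illustrative only; real risk models handle winsorization, industry
    neutralization, and much larger universes."""
    values = list(returns_12m.values())
    mu = statistics.fmean(values)
    sigma = statistics.pstdev(values)
    return {ticker: (r - mu) / sigma for ticker, r in returns_12m.items()}

# Hypothetical three-name universe with assumed trailing returns.
z = momentum_factor({"AAA": 0.20, "BBB": 0.05, "CCC": -0.10})
print(z)
```

The point made in the abstract is that which signals to standardize, and over which universe, is exactly where a custom model can differ from a standardized one.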

  19. Standard Model at LHC 2016

    CERN Document Server

    2016-01-01

    The meeting aims to bring together experimentalists and theorists to discuss the phenomenology, observational results and theoretical tools for Standard Model physics at the LHC. The agenda is divided into four working groups: Electroweak physics Higgs physics QCD (hard, soft & PDFs) Top & flavour physics

  20. The standard model and beyond

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1989-05-01

In these lectures, my aim is to present a status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows. I survey the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also commented on. In addition, I have included an appendix on dimensional regularization and a simple example which employs that technique. I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, extra Z' bosons, and compositeness are discussed. An overview of the physics of tau decays is also included. I discuss weak neutral current phenomenology and the extraction of sin²θ_W from experiment. The results presented there are based on a global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, implications for grand unified theories (GUTs), extra Z' gauge bosons, and atomic parity violation. The potential for further experimental progress is also commented on. Finally, I depart from the narrowest version of the standard model and discuss effects of neutrino masses, mixings, and electromagnetic moments. 32 refs., 3 figs., 5 tabs

  1. Beyond the Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future, at LHC and elsewhere. Supersymmetry, grand unification, extra dimensions and a glimpse of string theory will be presented.

  2. Modular modelling with Physiome standards

    Science.gov (United States)

    Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.

    2016-01-01

Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks. But relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. To complement and support the existing tools and frameworks, we develop a set

  3. Calculating Impacts of Energy Standards on Energy Demand in U.S. Buildings under Uncertainty with an Integrated Assessment Model: Technical Background Data

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daly, Don S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Ying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McJeon, Haewon C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moss, Richard H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Patel, Pralit L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Marty J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rice, Jennie S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhou, Yuyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-06

    This report presents data and assumptions employed in an application of PNNL’s Global Change Assessment Model with a newly-developed Monte Carlo analysis capability. The model is used to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The report provides a summary of how residential and commercial buildings are modeled, together with assumptions made for the distributions of state-level population, Gross Domestic Product (GDP) per worker, efficiency and cost of residential and commercial energy equipment by end use, and efficiency and cost of residential and commercial building shells. The cost and performance of equipment and of building shells are reported separately for current building and equipment efficiency standards and for more aggressive standards. The report also details assumptions concerning future improvements brought about by projected trends in technology.
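The Monte Carlo approach described here can be sketched in a few lines: draw inputs from their distributions, push each draw through the demand model, and summarize the resulting output distribution. The demand function and the distributions below are illustrative stand-ins, not PNNL's actual model or data.

```python
# Hedged sketch of Monte Carlo uncertainty propagation in the spirit of
# the report's GCAM analysis. All distributions and the toy demand model
# are invented for illustration.
import random

random.seed(0)

def building_energy_demand(population, gdp_per_worker, efficiency):
    """Toy demand model: service demand grows with population and GDP
    per worker, and falls with equipment/shell efficiency."""
    return population * 0.01 * (gdp_per_worker / 100.0) / efficiency

samples = []
for _ in range(10_000):
    population = random.gauss(7.0e6, 0.5e6)     # state population (persons)
    gdp_per_worker = random.gauss(120.0, 10.0)  # thousand USD per worker
    efficiency = random.uniform(1.0, 1.3)       # standards-scenario multiplier
    samples.append(building_energy_demand(population, gdp_per_worker, efficiency))

samples.sort()
mean = sum(samples) / len(samples)
p05, p95 = samples[500], samples[9499]  # empirical 5th / 95th percentiles
```

Reporting the 5th-95th percentile band alongside the mean is what lets the analysis state demand impacts "under uncertainty" rather than as point estimates.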

  4. D-brane Standard Model

    CERN Document Server

    Antoniadis, Ignatios; Tomaras, T N

    2001-01-01

    The minimal embedding of the Standard Model in type I string theory is described. The SU(3) color and SU(2) weak interactions arise from two different collections of branes. The correct prediction of the weak angle is obtained for a string scale of 6-8 TeV. Two Higgs doublets are necessary and proton stability is guaranteed. It predicts two massive vector bosons with masses at the TeV scale, as well as a new superweak interaction.

  5. The standard model and beyond

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1989-05-01

    The field of elementary particle, or high energy, physics seeks to identify the most elementary constituents of nature and to study the forces that govern their interactions. Increasing the energy of a probe in a laboratory experiment increases its power as an effective microscope for discerning increasingly smaller structures of matter. Thus we have learned that matter is composed of molecules that are in turn composed of atoms, that the atom consists of a nucleus surrounded by a cloud of electrons, and that the atomic nucleus is a collection of protons and neutrons. The more powerful probes provided by high energy particle accelerators have taught us that a nucleon is itself made of objects called quarks. The forces among quarks and electrons are understood within a general theoretical framework called the "standard model" that accounts for all interactions observed in high energy laboratory experiments to date. These are commonly categorized as the "strong," "weak" and "electromagnetic" interactions. In this lecture I will describe the standard model, and point out some of its limitations. Probing for deeper structures in quarks and electrons defines the present frontier of particle physics. I will discuss some speculative ideas about extensions of the standard model and/or yet more fundamental forces that may underlie our present picture. 11 figs., 1 tab

  6. Extensions of the standard model

    International Nuclear Information System (INIS)

    Ramond, P.

    1983-01-01

    In these lectures we focus on several issues that arise in theoretical extensions of the standard model. First we describe the kinds of fermions that can be added to the standard model without affecting known phenomenology. We focus in particular on three types: the vector-like completion of the existing fermions as would be predicted by a Kaluza-Klein type theory, which we find cannot be realistically achieved without some chiral symmetry; fermions which are vector-like by themselves, such as do appear in supersymmetric extensions, and finally anomaly-free chiral sets of fermions. We note that a chiral symmetry, such as the Peccei-Quinn symmetry, can be used to produce a vector-like theory which, at scales less than M_W, appears to be chiral. Next, we turn to the analysis of the second hierarchy problem which arises in Grand Unified extensions of the standard model, and plays a crucial role in proton decay of supersymmetric extensions. We review the known mechanisms for avoiding this problem and present a new one which seems to lead to the (family) triplication of the gauge group. Finally, this being a summer school, we present a list of homework problems. 44 references

  7. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B identifies human rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality and appendices of government and non-governmental human factors standards.

  8. Institutional model for supporting standardization

    International Nuclear Information System (INIS)

    Sanford, M.O.; Jackson, K.J.

    1993-01-01

    Restoring the nuclear option for utilities requires standardized designs. This premise is widely accepted by all parties involved in ALWR development activities. Achieving and maintaining standardization, however, demands new perspectives on the roles and responsibilities for the various commercial organizations involved in nuclear power. Some efforts are needed to define a workable model for a long-term support structure that will allow the benefits of standardization to be realized. The Nuclear Power Oversight Committee (NPOC) has developed a strategic plan that lays out the steps necessary to enable the nuclear industry to be in a position to order a new nuclear power plant by the mid-1990s. One of the key elements of the plan is the "industry commitment to standardization through design certification, combined license, first-of-a-kind engineering, construction, operation, and maintenance of nuclear power plants." This commitment is a result of the recognition by utilities of the substantial advantages of standardization. Among these are economic benefits, licensing benefits from being treated as one of a family, sharing risks across a broader ownership group, sharing operating experiences, enhancing public safety, and a more coherent market force. Utilities controlled the construction of the past generation of nuclear units in a largely autonomous fashion, procuring equipment and designs from a vendor, engineering services from an architect/engineer, and construction from a construction management firm. This, in addition to forcing the utility to assume virtually all of the risks associated with the project, typically resulted in highly customized designs based on preferences of the individual utility. However, the benefits of standardization can be realized only through cooperative choices and decision making by the utilities and through working as partners with reactor vendors, architect/engineers, and construction firms

  9. Temporal assessment of copper speciation, bioavailability and toxicity in UK freshwaters using chemical equilibrium and biotic ligand models: Implications for compliance with copper environmental quality standards.

    Science.gov (United States)

    Lathouri, Maria; Korre, Anna

    2015-12-15

    Although significant progress has been made in understanding how environmental factors modify the speciation, bioavailability and toxicity of metals such as copper in aquatic environments, the current methods used to establish water quality standards do not necessarily consider the different geological and geochemical characteristics of a given site and the factors that affect copper fate, bioavailability potential and toxicity. In addition, the temporal variation in the concentration and bioavailable metal fraction is also important in freshwater systems. The work presented in this paper illustrates the temporal and seasonal variability of a range of water quality parameters, and Cu speciation, bioavailability and toxicity at four freshwater sites in the UK. Rivers Coquet, Cree, Lower Clyde and Eden (Kent) were selected to cover a broad range of different geochemical environments and site characteristics. The monitoring data used covered a period of around six years at almost monthly intervals. Chemical equilibrium modelling was used to study temporal variations in Cu speciation and was combined with acute toxicity modelling to assess Cu bioavailability for two aquatic species, Daphnia magna and Daphnia pulex. The estimated copper bioavailability, toxicity levels and the corresponding ecosystem risks were analysed in relation to key water quality parameters (alkalinity, pH and DOC). Although copper concentrations did not vary much during the sampling period or between the seasons at the different sites, copper bioavailability varied markedly. In addition, through the chronic Cu-BLM, based on the voluntary risk assessment approach, the potential environmental risk in terms of chronic toxicity was assessed. A much higher likelihood of toxicity effects was found during the cold period at all sites. It is suggested that besides the metal (copper) concentration in the surface water environment, the variability and seasonality of other important water quality
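The core of the chemical-equilibrium step can be illustrated with a single-ligand complexation calculation: dissolved organic carbon (DOC) binds copper, so the free Cu2+ fraction falls as DOC rises. The stability constant and site density below are placeholders, not calibrated biotic ligand model values.

```python
# Minimal single-ligand equilibrium sketch (illustrative constants only,
# not a calibrated BLM): fraction of total Cu remaining as free Cu2+
# when DOC binding sites are in large excess over Cu.

def free_copper_fraction(doc_mg_per_l, log_k=5.0, sites_mol_per_mg=1e-6):
    """Free Cu2+ fraction in the linear complexation regime:
    fraction = 1 / (1 + K * [L]), with [L] proportional to DOC."""
    k = 10.0 ** log_k                       # conditional stability constant (L/mol)
    ligand = doc_mg_per_l * sites_mol_per_mg  # mol of binding sites per litre
    return 1.0 / (1.0 + k * ligand)

low_doc = free_copper_fraction(1.0)    # low-DOC water: mostly free Cu2+
high_doc = free_copper_fraction(12.0)  # high-DOC water: mostly complexed
```

Because only the free (or weakly bound) fraction is bioavailable, the same total Cu concentration can carry very different toxicity at different times of year, which is the seasonal effect the abstract reports.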

  10. The standard model and beyond

    CERN Document Server

    Vergados, J D

    2017-01-01

    This book contains a systematic and pedagogical exposition of recent developments in particle physics and cosmology. It starts with two introductory chapters on group theory and the Dirac theory. Then it proceeds with the formulation of the Standard Model (SM) of Particle Physics, particle content and symmetries, fully exploiting the first chapters. It discusses the concept of gauge symmetries and emphasizes their role in particle physics. It then analyses the Higgs mechanism and the spontaneous symmetry breaking (SSB). It explains how the particles (gauge bosons and fermions) after SSB acquire a mass and get admixed. The various forms of charged currents are discussed in detail as well as how the parameters of the SM, which cannot be determined by the theory, are fixed by experiment, including the recent LHC data and the Higgs discovery. Quantum chromodynamics is discussed and various low energy approximations to it are presented. The Feynman diagrams are introduced and applied, in a way understandable by fir...

  11. NUSS safety standards: A critical assessment

    International Nuclear Information System (INIS)

    Minogue, R.B.

    1985-01-01

    The NUSS safety standards are based on systematic review of safety criteria of many countries in a process carefully defined to assure completeness of coverage. They represent an international consensus of accepted safety principles and practices for regulation and for the design, construction, and operation of nuclear power plants. They are a codification of principles and practices already in use by some Member States. Thus, they are not standards which describe methodologies at their present state of evolution as a result of more recent experience and improvements in technological understanding. The NUSS standards assume an underlying body of national standards and a defined technological base. Detailed design and industrial practices vary between countries and the implementation of basic safety standards within countries has taken approaches that conform with national industrial practices. Thus, application of the NUSS standards requires reconciliation with the standards of the country where the reactor will be built as well as with the country from which procurement takes place. Experience in making that reconciliation will undoubtedly suggest areas of needed improvement. After the TMI accident a reassessment of the NUSS programme was made and it was concluded that, given the information at that time and the then level of technology, the basic approach was sound; the NUSS programme should be continued to completion, and the standards should be brought into use. It was also recognized, however, that in areas such as probabilistic risk assessment, human factors methodology, and consideration of detailed accident sequences, more advanced technology was emerging. As these technologies develop, and become more amenable to practical application, it is anticipated that the NUSS standards will need revision. Ideally those future revisions will also flow from experience in their use

  12. Non-commutative standard model: model building

    CERN Document Server

    Chaichian, Masud; Presnajder, P

    2003-01-01

    A non-commutative version of the usual electro-weak theory is constructed. We discuss how to overcome the two major problems: (1) although we can have non-commutative U(n) (which we denote by U_*(n)) gauge theory we cannot have non-commutative SU(n) and (2) the charges in non-commutative QED are quantized to just 0, +-1. We show how the latter problem with charge quantization, as well as with the gauge group, can be resolved by taking the U_*(3) x U_*(2) x U_*(1) gauge group and reducing the extra U(1) factors in an appropriate way. Then we proceed with building the non-commutative version of the standard model by specifying the proper representations for the entire particle content of the theory, the gauge bosons, the fermions and Higgs. We also present the full action for the non-commutative standard model (NCSM). In addition, among several peculiar features of our model, we address the inherent CP violation and new neutrino interactions. (orig.)

  13. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    Science.gov (United States)

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and the containment of computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application on different data. Agreement between MATLAB and SAAM II was warranted by intraclass correlation coefficients ≥0.73; no significant differences between corresponding mean parameter estimates and prediction of HID rate; and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of CV% for the parameter worst estimated by SAAM II, and kept the CV% of all model parameters contained. The MATLAB-based procedure was therefore suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
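The alternating Gauss-Newton / Levenberg-Marquardt idea can be sketched on a toy exponential-decay model (the actual HID model is not reproduced here); when the damping term is small the step is essentially Gauss-Newton, and rejected steps raise the damping toward a gradient-descent-like update. Everything below is an illustrative stand-in written in Python rather than MATLAB.

```python
# Hand-rolled Levenberg-Marquardt on y = a * exp(-b * t), noiseless
# synthetic data, to illustrate the damped Gauss-Newton step the
# abstract describes. Model and data are invented for the sketch.
import math

true_a, true_b = 2.0, 0.3
ts = [i * 0.5 for i in range(20)]
ys = [true_a * math.exp(-true_b * t) for t in ts]

def residuals(a, b):
    return [a * math.exp(-b * t) - y for t, y in zip(ts, ys)]

def jacobian(a, b):
    # dr/da = exp(-b t);  dr/db = -a t exp(-b t)
    return [(math.exp(-b * t), -a * t * math.exp(-b * t)) for t in ts]

def fit(a, b, iters=50):
    lam = 1e-3  # damping; lam -> 0 recovers a pure Gauss-Newton step
    for _ in range(iters):
        r, J = residuals(a, b), jacobian(a, b)
        # Normal equations (J^T J + lam I) delta = -J^T r, 2x2 system
        m00 = sum(j0 * j0 for j0, _ in J) + lam
        m11 = sum(j1 * j1 for _, j1 in J) + lam
        m01 = sum(j0 * j1 for j0, j1 in J)
        v0 = -sum(j0 * ri for (j0, _), ri in zip(J, r))
        v1 = -sum(j1 * ri for (_, j1), ri in zip(J, r))
        det = m00 * m11 - m01 * m01
        da, db = (m11 * v0 - m01 * v1) / det, (m00 * v1 - m01 * v0) / det
        old_cost = sum(ri * ri for ri in r)
        new_cost = sum(ri * ri for ri in residuals(a + da, b + db))
        if new_cost < old_cost:   # accept: step toward Gauss-Newton
            a, b, lam = a + da, b + db, lam * 0.5
        else:                     # reject: increase damping
            lam *= 10.0
    return a, b

a_hat, b_hat = fit(1.0, 1.0)
```

The accept/reject rule is what "assures the full convergence of the process": aggressive Gauss-Newton steps are taken when they reduce the cost, and the damping bails the iteration out when they do not.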

  14. Experiments beyond the standard model

    International Nuclear Information System (INIS)

    Perl, M.L.

    1984-09-01

    This paper is based upon lectures in which I have described and explored the ways in which experimenters can try to find answers, or at least clues toward answers, to some of the fundamental questions of elementary particle physics. All of these experimental techniques and directions have been discussed fully in other papers, for example: searches for heavy charged leptons, tests of quantum chromodynamics, searches for Higgs particles, searches for particles predicted by supersymmetric theories, searches for particles predicted by technicolor theories, searches for proton decay, searches for neutrino oscillations, monopole searches, studies of low transfer momentum hadron physics at very high energies, and elementary particle studies using cosmic rays. Each of these subjects requires several lectures by itself to do justice to the large amount of experimental work and theoretical thought which has been devoted to these subjects. My approach in these tutorial lectures is to describe general ways to experiment beyond the standard model. I will use some of the topics listed to illustrate these general ways. Also, in these lectures I present some dreams and challenges about new techniques in experimental particle physics and accelerator technology, I call these Experimental Needs. 92 references

  15. Vacuum Stability of Standard Model^{++}

    CERN Document Server

    Anchordoqui, Luis A.; Goldberg, Haim; Huang, Xing; Lust, Dieter; Taylor, Tomasz R.; Vlcek, Brian

    2013-01-01

    The latest results of the ATLAS and CMS experiments point to a preferred narrow Higgs mass range (m_h \simeq 124 - 126 GeV) in which the effective potential of the Standard Model (SM) develops a vacuum instability at a scale 10^{9}-10^{11} GeV, with the precise scale depending on the precise value of the top quark mass and the strong coupling constant. Motivated by this experimental situation, we present here a detailed investigation about the stability of the SM^{++} vacuum, which is characterized by a simple extension of the SM obtained by adding to the scalar sector a complex SU(2) singlet that has the quantum numbers of the right-handed neutrino, H", and to the gauge sector a U(1) that is broken by the vacuum expectation value of H". We derive the complete set of renormalization group equations at one loop. We then pursue a numerical study of the system to determine the triviality and vacuum stability bounds, using a scan of 10^4 random sets of points to fix the initial conditions. We show that, if there...
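The instability mechanism the abstract refers to can be shown with a schematic one-loop running of the Higgs quartic coupling: a top-like Yukawa contributes a negative -6y^4 term to the beta function, which can drive the quartic negative at high scales. The coefficients below are schematic (the top Yukawa is frozen, and gauge contributions are dropped), not the full SM or SM^{++} beta functions.

```python
# Toy one-loop running of a quartic coupling lambda(t), t = ln(mu/mu0),
# with a frozen top-like Yukawa y. Schematic coefficients only; this is
# not the full set of renormalization group equations from the paper.
import math

def run_lambda(lam0, y=0.95, t_end=12.0, dt=1e-3):
    """Euler-integrate the beta function; return the scale t at which
    lambda first turns negative, or None if it stays positive."""
    lam, t = lam0, 0.0
    y4 = y ** 4
    while t < t_end:
        # Schematic beta: self-coupling terms vs. destabilising -6 y^4
        beta = (24 * lam * lam + 12 * lam * y * y - 6 * y4) / (16 * math.pi ** 2)
        lam += dt * beta
        t += dt
        if lam < 0:
            return t
    return None

unstable_at = run_lambda(0.12)  # small quartic: turns negative (instability)
stable_at = run_lambda(0.35)    # larger quartic: stays positive over the range
```

Scanning many random initial conditions through such a system, as the paper does with 10^4 points, partitions parameter space into stable, unstable and non-perturbative (triviality) regions.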

  16. Perceptual Objective Listening Quality Assessment (POLQA), The Third Generation ITU-T Standard for End-to-End Speech Quality Measurement : Part II – Perceptual Model

    NARCIS (Netherlands)

    Beerends, J.G.; Schmidmer, C.; Berger, J.; Obermann, M.; Ullman, R.; Pomy, J.; Keyhl, M.

    2013-01-01

    In this and the companion paper Part I, the authors present the Perceptual Objective Listening Quality Assessment (POLQA), the third-generation speech quality measurement algorithm, standardized by the International Telecommunication Union in 2011 as Recommendation P.863. This paper describes the

  17. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested including software as well as the hardware architectural features

  18. An alternative to the standard model

    International Nuclear Information System (INIS)

    Baek, Seungwon; Ko, Pyungwon; Park, Wan-Il

    2014-01-01

    We present an extension of the standard model to a dark sector with an unbroken local dark U(1) X symmetry. Including various singlet portal interactions provided by the standard model Higgs, right-handed neutrinos and kinetic mixing, we show that the model can address most phenomenological issues (inflation, neutrino mass and mixing, baryon number asymmetry, dark matter, direct/indirect dark matter searches, some small-scale puzzles of standard collisionless cold dark matter, vacuum stability of the standard model Higgs potential, dark radiation) and be regarded as an alternative to the standard model. The Higgs signal strength is equal to one as in the standard model for the unbroken U(1) X case with a scalar dark matter, but it could be less than one independent of decay channels if the dark matter is a dark sector fermion or if U(1) X is spontaneously broken, because of a mixing with a new neutral scalar boson in the models

  19. MRI assessment of myelination: an age standardization

    Energy Technology Data Exchange (ETDEWEB)

    Staudt, M. (Kinderklinik Dritter Orden, Passau (Germany)); Schropp, C. (Kinderklinik Dritter Orden, Passau (Germany)); Staudt, F. (Kinderklinik Dritter Orden, Passau (Germany)); Obletter, N. (Radiologische Praxis, Klinikum Ingolstadt (Germany)); Bise, K. (Neuropathologisches Inst., Muenchen Univ. (Germany)); Breit, A. (MR Tomographie, Klinikum Passau (Germany)); Weinmann, H.M. (Kinderklinik Schwabing, Muenchen (Germany))

    1994-04-01

    777 cerebral MRI examinations of children aged 3 days to 14 years were staged for myelination to establish an age standardization. Staging was performed using a system proposed in a previous paper, separately ranking 10 different regions of the brain. Interpretation of the results led to the identification of four clinical diagnoses that are frequently associated with delays in myelination: West syndrome, cerebral palsy, developmental retardation, and congenital anomalies. In addition, it was found that assessment of myelination in children with head injuries was not practical as alterations in MRI signal can simulate earlier stages of myelination. Age limits were therefore calculated from the case material after excluding all children with these conditions. When simplifications of the definition of the stages are applied, these age limits for the various stages of myelination of each of the 10 regions of the brain make the staging system applicable for routine assessment of myelination. (orig.)
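The age-limit calculation described here amounts to taking empirical percentiles of the ages at which each stage is observed, per brain region, after excluding the delay-associated diagnoses. The region names, ages and quantile choices below are fabricated purely to show the shape of the computation.

```python
# Sketch of deriving per-region age limits for a myelination stage from
# staged examinations. Data and quantiles are invented for illustration.

ages_at_stage = {  # region -> ages (months) at which a given stage was observed
    "corpus_callosum": [4, 5, 5, 6, 6, 7, 7, 8, 9, 10],
    "frontal_white_matter": [8, 9, 10, 10, 11, 12, 12, 13, 14, 16],
}

def age_limits(ages, lower_q=0.05, upper_q=0.95):
    """Empirical (lower, upper) age bounds for reaching a stage."""
    s = sorted(ages)
    lo = s[int(lower_q * len(s))]
    hi = s[min(len(s) - 1, int(upper_q * len(s)))]
    return lo, hi

limits = {region: age_limits(a) for region, a in ages_at_stage.items()}
```

A child whose observed stage falls outside the bracket for their age would then be flagged as delayed (or advanced) for that region, which is how such a standardization supports routine assessment.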

  20. Quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert

    2011-01-01

    Semantic IS (Information Systems) standards are essential for achieving interoperability between organizations. However a recent survey suggests that the full benefits of standards are not achieved, due to quality issues. This paper presents a quality model for semantic IS standards, that should

  1. Cultural models of linguistic standardization

    Directory of Open Access Journals (Sweden)

    Dirk Geeraerts

    2016-02-01

    Full Text Available In line with well-known trends in cultural theory (see Burke et al., 2000), Cognitive Linguistics has stressed the idea that we think about social reality in terms of models – ‘cultural models’ or ‘folk theories’: from Holland & Quinn (1987) over Lakoff (1996) and Palmer (1996) to Dirven et al. (2001a, 2001b), Cognitive linguists have demonstrated how the technical apparatus of Cognitive Linguistics can be used to analyze how our conception of social reality is shaped by underlying patterns of thought. But if language is a social and cultural reality, what are the models that shape our conception of language? Specifically, what are the models that shape our thinking about language as a social phenomenon? What are the paradigms that we use to think about language, not primarily in terms of linguistic structure (as in Reddy 1979), but in terms of linguistic variation: models about the way in which language varieties are distributed over a language community and about the way in which such distribution should be evaluated? In this paper, I will argue that two basic models may be identified: a rationalist and a romantic one. I will chart the ways in which they interact, describe how they are transformed in the course of time, and explore how the models can be used in the analysis of actual linguistic variation.

  2. Standard Model, Higgs Boson and What Next?

    Indian Academy of Sciences (India)

    IAS Admin

    RESONANCE | October 2012. GENERAL | ARTICLE. The Standard Model is now known to be the basis of almost ALL of known physics except gravity. It is the dynamical theory of electromagnetism and the strong and weak nuclear forces. The Standard Model has been constructed by generalizing the century-old electrodynamics of.

  3. Modeling in the Common Core State Standards

    Science.gov (United States)

    Tam, Kai Chung

    2011-01-01

    The inclusion of modeling and applications into the mathematics curriculum has proven to be a challenging task over the last fifty years. The Common Core State Standards (CCSS) has made mathematical modeling both one of its Standards for Mathematical Practice and one of its Conceptual Categories. This article discusses the need for mathematical…

  4. Beyond the Standard Model: Working group report

    Indian Academy of Sciences (India)

    tion within the 'Beyond the Standard Model' working group of WHEPP-6. These problems addressed various extensions of the Standard Model (SM) currently under consideration in the particle physics phenomenology community. Smaller subgroups were formed to focus on each of these problems. The progress till the end ...

  5. Competency model and standards for media education

    Directory of Open Access Journals (Sweden)

    Gerhard TULODZIECKI

    2012-12-01

    Full Text Available In Germany, educational standards for key school subjects have been developed as a consequence of the results of international comparative studies like PISA. Subsequently, supporters of interdisciplinary fields such as media education have also started calling for goals in the form of competency models and standards. In this context a competency standard model for media education will be developed with regard to the discussion about media competence and media education. In doing so the development of a competency model and the formulation of standards is described consequently as a decision making process. In this process decisions have to be made on competence areas and competence aspects to structure the model, on criteria to differentiate certain levels of competence, on the number of competence levels, on the abstraction level of standard formulations and on the tasks to test the standards. It is shown that the discussion on media education as well as on competencies and standards provides different possibilities of structuring, emphasizing and designing a competence standard model. Against this background we describe and give reasons for our decisions and our competency standards model. At the same time our contribution is meant to initiate further developments, testing and discussion.

  6. A revisited standard solar model

    International Nuclear Information System (INIS)

    Casse, M.; Cahen, S.; Doom, C.

    1985-09-01

    Recent models of the Sun, including our own, based on canonical physics and featuring modern reaction rates and radiative opacities are presented. They lead to a presolar helium abundance of approximately 0.28 by mass, at variance with the value of 0.25 proposed by Bahcall et al. (1982, 1985), but in better agreement with the value found in the Orion nebula. Most models predict a neutrino counting rate greater than 6 SNU in the chlorine-argon detector, which is at least 3 times higher than the observed rate. The primordial helium abundance derived from the solar one, on the basis of recent models of helium production from the birth of the Galaxy to the birth of the sun, Y_P approximately 0.26, is significantly higher than the value inferred from observations of extragalactic metal-poor nebulae (Y approximately 0.23). This indicates that the stellar production of helium is probably underestimated by the models considered

  7. Beyond the supersymmetric standard model

    International Nuclear Information System (INIS)

    Hall, L.J.

    1988-02-01

    The possibility of baryon number violation at the weak scale and an alternative primordial nucleosynthesis scheme arising from the decay of gravitinos are discussed. The minimal low energy supergravity model is defined and a few of its features are described. Renormalization group scaling and flavor physics are mentioned

  8. Beyond the supersymmetric standard model

    Energy Technology Data Exchange (ETDEWEB)

    Hall, L.J.

    1988-02-01

    The possibility of baryon number violation at the weak scale and an alternative primordial nucleosynthesis scheme arising from the decay of gravitinos are discussed. The minimal low energy supergravity model is defined and a few of its features are described. Renormalization group scaling and flavor physics are mentioned.

  9. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested including software as well as the hardware architectural features

  10. Electroweak baryogenesis and the standard model

    International Nuclear Information System (INIS)

    Huet, P.

    1994-01-01

    Electroweak baryogenesis is addressed within the context of the standard model of particle physics. Although the minimal standard model has the means of fulfilling the three Sakharov conditions, it falls short of explaining the making of the baryon asymmetry of the universe. In particular, it is demonstrated that the phase of the CKM mixing matrix is an insufficient source of CP violation. The shortcomings of the standard model could be bypassed by enlarging the symmetry breaking sector and adding a new source of CP violation

  11. Savannah River Site peer evaluator standards: Operator assessment for restart

    International Nuclear Information System (INIS)

    1990-01-01

    Savannah River Site has implemented a Peer Evaluator program for the assessment of certified Central Control Room Operators, Central Control Room Supervisors and Shift Technical Engineers prior to restart. This program is modeled after the Nuclear Regulatory Commission's (NRC's) Examiner Standard, ES-601, for the requalification of licensed operators in the commercial utility industry. It has been tailored to reflect the unique differences between Savannah River production reactors and commercial power reactors.

  12. Savannah River Site peer evaluator standards: Operator assessment for restart

    Energy Technology Data Exchange (ETDEWEB)

    1990-06-01

    Savannah River Site has implemented a Peer Evaluator program for the assessment of certified Central Control Room Operators, Central Control Room Supervisors and Shift Technical Engineers prior to restart. This program is modeled after the Nuclear Regulatory Commission's (NRC's) Examiner Standard, ES-601, for the requalification of licensed operators in the commercial utility industry. It has been tailored to reflect the unique differences between Savannah River production reactors and commercial power reactors.


  14. The making of the standard model

    NARCIS (Netherlands)

    Hooft, G. 't

    2007-01-01

    The standard model of particle physics is more than a model. It is a detailed theory that encompasses nearly all that is known about the subatomic particles and forces in a concise set of principles and equations. The extensive research that culminated in this model includes numerous small and

  15. Discrete symmetry breaking beyond the standard model

    NARCIS (Netherlands)

    Dekens, Wouter Gerard

    2015-01-01

    The current knowledge of elementary particles and their interactions is summarized in the Standard Model of particle physics. Practically all of this model's predictions that have been tested were confirmed experimentally. Nonetheless, there are phenomena which the model cannot explain. For

  16. Beyond the Standard Model for Montaneros

    CERN Document Server

    Bustamante, M; Ellis, John

    2010-01-01

    These notes cover (i) electroweak symmetry breaking in the Standard Model (SM) and the Higgs boson, (ii) alternatives to the SM Higgs boson, including an introduction to composite Higgs models and Higgsless models that invoke extra dimensions, (iii) the theory and phenomenology of supersymmetry, and (iv) various further beyond-the-SM topics, including Grand Unification, proton decay and neutrino masses, supergravity, superstrings and extra dimensions.

  17. Is the Standard Model about to crater?

    CERN Multimedia

    Lane, Kenneth

    2015-01-01

    The Standard Model is coming under more and more pressure from experiments. New results from the analysis of LHC's Run 1 data show effects that, if confirmed, would be the signature of new interactions at the TeV scale.

  18. The standard model in a nutshell

    CERN Document Server

    Goldberg, Dave

    2017-01-01

    For a theory as genuinely elegant as the Standard Model--the current framework describing elementary particles and their forces--it can sometimes appear to students to be little more than a complicated collection of particles and ranked list of interactions. The Standard Model in a Nutshell provides a comprehensive and uncommonly accessible introduction to one of the most important subjects in modern physics, revealing why, despite initial appearances, the entire framework really is as elegant as physicists say. Dave Goldberg uses a "just-in-time" approach to instruction that enables students to gradually develop a deep understanding of the Standard Model even if this is their first exposure to it. He covers everything from relativity, group theory, and relativistic quantum mechanics to the Higgs boson, unification schemes, and physics beyond the Standard Model. The book also looks at new avenues of research that could answer still-unresolved questions and features numerous worked examples, helpful illustrat...

  19. Beyond the Standard Model (1/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  20. Beyond the Standard Model (5/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  1. Beyond the Standard Model (3/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  2. Beyond the Standard Model (2/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  3. Beyond the Standard Model (4/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  4. From the standard model to dark matter

    International Nuclear Information System (INIS)

    Wilczek, F.

    1995-01-01

    The standard model of particle physics is marvelously successful. However, it is obviously not a complete or final theory. I shall argue here that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Taking these hints seriously, one is led to predict the existence of new types of very weakly interacting matter, stable on cosmological time scales and produced with cosmologically interesting densities--that is, "dark matter". copyright 1995 American Institute of Physics

  5. Standard Model measurements with the ATLAS detector

    Directory of Open Access Journals (Sweden)

    Hassani Samira

    2015-01-01

    Various Standard Model measurements have been performed in proton-proton collisions at a centre-of-mass energy of √s = 7 and 8 TeV using the ATLAS detector at the Large Hadron Collider. A review of a selection of the latest results of electroweak measurements, W/Z production in association with jets, jet physics and soft QCD is given. Measurements are in general found to be well described by the Standard Model predictions.

  6. Working group report: Beyond the standard model

    Indian Academy of Sciences (India)

    The working group on Beyond the Standard Model concentrated on identifying interesting physics issues in models ... In view of the range of current interest in the high energy physics community, this working group was organised ... the computational tools currently relevant for particle phenomenology. Thus in this group, ...

  7. Standard Model Particles from Split Octonions

    Directory of Open Access Journals (Sweden)

    Gogberashvili M.

    2016-01-01

    We model physical signals using elements of the algebra of split octonions over the field of real numbers. Elementary particles correspond to special elements of the algebra that nullify the octonionic norm (zero divisors). It is shown that the standard model particle spectrum naturally follows from the classification of the independent primitive zero divisors of split octonions.

  8. Exploring the Standard Model of Particles

    Science.gov (United States)

    Johansson, K. E.; Watkins, P. M.

    2013-01-01

    With the recent discovery of a new particle at the CERN Large Hadron Collider (LHC), the Higgs boson may at last have been found. This paper provides a brief summary, for non-specialists, of the standard model of particle physics and the importance of the Higgs boson and field in that model. The role of Feynman diagrams in making predictions for…

  9. Noncommutative geometry and the standard model vacuum

    International Nuclear Information System (INIS)

    Barrett, John W.; Dawe Martins, Rachel A.

    2006-01-01

    The space of Dirac operators for the Connes-Chamseddine spectral action for the standard model of particle physics coupled to gravity is studied. The model is extended by including right-handed neutrino states, and the S^0-reality axiom is not assumed. The possibility of allowing more general fluctuations than the inner fluctuations of the vacuum is proposed. The maximal case of all possible fluctuations is studied by considering the equations of motion for the vacuum. While there are interesting nontrivial vacua with Majorana-type mass terms for the leptons, the conclusion is that the equations are too restrictive to allow solutions with the standard model mass matrix.

  10. Upper limb risk assessment according to ISO/CEN standards

    NARCIS (Netherlands)

    Delleman, N.J.

    2000-01-01

    This paper describes the current status, general content, and application of the standards EN 1005, ISO 11226, and ISO 11228-3 concerning upper limb risk assessment. The upper limb risk assessment according to International Organization for Standardization (ISO)/CEN standards is discussed. Risk

  11. The Standard Model and Higgs physics

    Science.gov (United States)

    Torassa, Ezio

    2018-05-01

    The Standard Model is a consistent and computable theory that successfully describes the elementary particle interactions. The strong, electromagnetic and weak interactions have been included in the theory by exploiting the relation between group symmetries and group generators, in order to smartly introduce the force carriers. The group properties lead to constraints between boson masses and couplings. All the measurements performed at the LEP, Tevatron, LHC and other accelerators proved the consistency of the Standard Model. A key element of the theory is the Higgs field, which, together with the spontaneous symmetry breaking, gives mass to the vector bosons and to the fermions. Unlike the case of the vector bosons, the theory does not provide a prediction for the Higgs boson mass. The LEP experiments, while providing very precise measurements of the Standard Model theory, searched for evidence of the Higgs boson until the year 2000. The discovery of the top quark in 1994 by the Tevatron experiments and of the Higgs boson in 2012 by the LHC experiments was considered the completion of the fundamental particle list of the Standard Model theory. Nevertheless, the neutrino oscillations, the dark matter and the baryon asymmetry of the Universe are evidence that we need a new, extended model. In the Standard Model there are also some unattractive theoretical aspects, like the divergent loop corrections to the Higgs boson mass and the very small Yukawa couplings needed to describe the neutrino masses. For all these reasons, the hunt for discrepancies between the Standard Model and data is still going on, with the aim of finally describing the new extended theory.

  12. The Cosmological Standard Model and Its Implications for Beyond the Standard Model of Particle Physics

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    While the cosmological standard model has many notable successes, it assumes 95% of the mass-energy density of the universe is dark and of unknown nature, and there was an early stage of inflationary expansion driven by physics far beyond the range of the particle physics standard model. In the colloquium I will discuss potential particle-physics implications of the standard cosmological model.

  13. Assessment of Safety Standards for Automotive Electronic Control Systems

    Science.gov (United States)

    2016-06-01

    This report summarizes the results of a study that assessed and compared six industry and government safety standards relevant to the safety and reliability of automotive electronic control systems. These standards include ISO 26262 (Road Vehicles - ...

  14. LHC Higgs physics beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Spannowsky, M.

    2007-09-22

    The Large Hadron Collider (LHC) at CERN will be able to perform proton collisions at a much higher center-of-mass energy and luminosity than any other collider. Its main purpose is to detect the Higgs boson, the last unobserved particle of the Standard Model, explaining the riddle of the origin of mass. Studies have shown that, for the whole allowed region of the Higgs mass, processes exist to detect the Higgs at the LHC. However, the Standard Model cannot be a theory of everything and is not able to provide a complete understanding of physics. It is at most an effective theory up to a presently unknown energy scale. Hence, extensions of the Standard Model are necessary which can affect the Higgs-boson signals. We discuss these effects in two popular extensions of the Standard Model: the Minimal Supersymmetric Standard Model (MSSM) and the Standard Model with four generations (SM4G). Constraints on these models come predominantly from flavor physics and electroweak precision measurements. We show that the SM4G is still viable and that a fourth generation has a strong impact on decay and production processes of the Higgs boson. Furthermore, we study the charged Higgs boson in the MSSM, yielding a clear signal for physics beyond the Standard Model. For small tan β in minimal flavor violation (MFV), no processes for the detection of a charged Higgs boson exist at the LHC. However, MFV is motivated only by the experimental agreement of results from flavor physics with Standard Model predictions, not by any basic theoretical consideration. In this thesis, we calculate charged Higgs boson production cross sections beyond the assumption of MFV, where a large number of free parameters is present in the MSSM. We find that the soft-breaking parameters which enhance the charged-Higgs boson production most are bound to large values, e.g. by rare B-meson decays. Although the charged-Higgs boson cross sections beyond MFV turn out to be sizeable, only a detailed

  15. LHC Higgs physics beyond the Standard Model

    International Nuclear Information System (INIS)

    Spannowsky, M.

    2007-01-01

    The Large Hadron Collider (LHC) at CERN will be able to perform proton collisions at a much higher center-of-mass energy and luminosity than any other collider. Its main purpose is to detect the Higgs boson, the last unobserved particle of the Standard Model, explaining the riddle of the origin of mass. Studies have shown that, for the whole allowed region of the Higgs mass, processes exist to detect the Higgs at the LHC. However, the Standard Model cannot be a theory of everything and is not able to provide a complete understanding of physics. It is at most an effective theory up to a presently unknown energy scale. Hence, extensions of the Standard Model are necessary which can affect the Higgs-boson signals. We discuss these effects in two popular extensions of the Standard Model: the Minimal Supersymmetric Standard Model (MSSM) and the Standard Model with four generations (SM4G). Constraints on these models come predominantly from flavor physics and electroweak precision measurements. We show that the SM4G is still viable and that a fourth generation has a strong impact on decay and production processes of the Higgs boson. Furthermore, we study the charged Higgs boson in the MSSM, yielding a clear signal for physics beyond the Standard Model. For small tan β in minimal flavor violation (MFV), no processes for the detection of a charged Higgs boson exist at the LHC. However, MFV is motivated only by the experimental agreement of results from flavor physics with Standard Model predictions, not by any basic theoretical consideration. In this thesis, we calculate charged Higgs boson production cross sections beyond the assumption of MFV, where a large number of free parameters is present in the MSSM. We find that the soft-breaking parameters which enhance the charged-Higgs boson production most are bound to large values, e.g. by rare B-meson decays. Although the charged-Higgs boson cross sections beyond MFV turn out to be sizeable, only a detailed

  16. CP Violation Beyond the Standard Model

    CERN Document Server

    Fleischer, Robert

    1997-01-01

    Recent developments concerning CP violation beyond the Standard Model are reviewed. The central target of this presentation is the $B$ system, as it plays an outstanding role in the extraction of CKM phases. Besides a general discussion of the appearance of new physics in the corresponding CP-violating asymmetries through $B^0_q$--$\bar{B^0_q}$ mixing $(q\in\{d,s\})$, it is emphasized that CP violation in non-leptonic penguin modes, e.g. in $B_d \to \phi K_S$, offers a powerful tool to probe physics beyond the Standard Model. In this respect $B \to \pi K$ modes, which have been observed recently by the CLEO collaboration, may also turn out to be very useful. Their combined branching ratios allow us to constrain the CKM angle $\gamma$ and may indicate the presence of physics beyond the Standard Model.

  17. Industrial diffusion models and technological standardization

    International Nuclear Information System (INIS)

    Carrillo-Hermosilla, J.

    2007-01-01

    Conventional models of technology diffusion have typically focused on the rate at which one new technology is fully adopted. The model described here provides a broader approach, from the perspective of the diffusion of multiple technologies and the related phenomenon of standardization. Moreover, most conventional research has characterized the diffusion process in terms of technology attributes or adopting firms' attributes. As an alternative, we propose here a wide-ranging and consistent taxonomy of the relationships between the circumstances of an industry and the attributes of the technology standardization processes taking place within it. (Author) 100 refs

  18. Standard Model mass spectrum in inflationary universe

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics,60 Garden Street, Cambridge, MA 02138 (United States); Wang, Yi [Department of Physics, The Hong Kong University of Science and Technology,Clear Water Bay, Kowloon, Hong Kong (China); Xianyu, Zhong-Zhi [Center of Mathematical Sciences and Applications, Harvard University,20 Garden Street, Cambridge, MA 02138 (United States)

    2017-04-11

    We work out the Standard Model (SM) mass spectrum during inflation with quantum corrections, and explore its observable consequences in the squeezed limit of non-Gaussianity. Both non-Higgs and Higgs inflation models are studied in detail. We also illustrate how some inflationary loop diagrams can be computed neatly by Wick-rotating the inflation background to Euclidean signature and by dimensional regularization.

  19. Next to new minimal standard model

    Energy Technology Data Exchange (ETDEWEB)

    Haba, Naoyuki [Graduate School of Science and Engineering, Shimane University, Matsue, Shimane 690-8504 (Japan); Department of Physics, Faculty of Science, Hokkaido University, Sapporo, Hokkaido 060-0810 (Japan); Kaneta, Kunio [Department of Physics, Faculty of Science, Hokkaido University, Sapporo, Hokkaido 060-0810 (Japan); Kavli IPMU (WPI), The University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Department of Physics, Graduate School of Science, Osaka University, Toyonaka, Osaka 560-0043 (Japan); Takahashi, Ryo [Department of Physics, Faculty of Science, Hokkaido University, Sapporo, Hokkaido 060-0810 (Japan)

    2014-06-27

    We suggest a minimal extension of the standard model, which can explain current experimental data of the dark matter, small neutrino masses and baryon asymmetry of the universe, inflation, and dark energy, and achieve gauge coupling unification. The gauge coupling unification can explain the charge quantization, and be realized by introducing six new fields. We investigate the vacuum stability, coupling perturbativity, and correct dark matter abundance in this model by use of current experimental data.

  20. Standard Model Effective Potential from Trace Anomalies

    Directory of Open Access Journals (Sweden)

    Renata Jora

    2018-01-01

    By analogy with the low energy QCD effective linear sigma model, we construct a standard model effective potential based entirely on the requirement that the tree level and quantum level trace anomalies must be satisfied. We discuss a particular realization of this potential in connection with the Higgs boson mass and Higgs boson effective couplings to two photons and two gluons. We find that this kind of potential may describe well the known phenomenology of the Higgs boson.

  1. Prospects of experimentally reachable beyond Standard Model ...

    Indian Academy of Sciences (India)

    2016-01-06

    Jan 6, 2016. Prospects of experimentally reachable beyond Standard Model physics in inverse see-saw motivated SO(10) GUT. Ram Lal Awasthi. Special: Supersymmetric Unified Theories and Higgs Physics. Pramana – Journal of Physics, Volume 86, Issue 2, February 2016, pp 223- ...

  2. Why supersymmetry? Physics beyond the standard model

    Indian Academy of Sciences (India)

    The Naturalness Principle as a requirement that the heavy mass scales decouple from the physics of light mass scales is reviewed. In quantum field theories containing elementary scalar fields, such as the Standard Model of electroweak interactions containing the Higgs particle, mass of the scalar field is not a natural ...

  3. Beyond the Standard Model: Working group report

    Indian Academy of Sciences (India)

    Vol. 55, Nos 1 & 2 — journal of physics, July & August 2000, pp. 307–313. Beyond the Standard Model: Working group report. GAUTAM BHATTACHARYYA. ... Consider the possibility that these neutrinos are of Majorana nature ... Then the initial condition of degeneracy stated above.

  4. Asymptotically Safe Standard Model via Vectorlike Fermions

    DEFF Research Database (Denmark)

    Mann, R. B.; Meffe, J. R.; Sannino, F.

    2017-01-01

    We construct asymptotically safe extensions of the standard model by adding gauged vectorlike fermions. Using large number-of-flavor techniques we argue that all gauge couplings, including the hypercharge and, under certain conditions, the Higgs coupling, can achieve an interacting ultraviolet...

  5. The race to break the standard model

    CERN Multimedia

    Brumfiel, Geoff

    2008-01-01

    The Large Hadron Collider is the latest attempt to move fundamental physics past the frustratingly successful "standard model". But it is not the only way to do it... The author surveys the contenders attempting to capture the prize before the collider gets up to speed.(4 pages)

  6. Why supersymmetry? Physics beyond the standard model

    Indian Academy of Sciences (India)

    2016-08-23

    Aug 23, 2016. The Naturalness Principle as a requirement that the heavy mass scales decouple from the physics of light mass scales is reviewed. In quantum field theories containing elementary scalar fields, such as the Standard Model of electroweak interactions containing the Higgs particle, mass of the ...

  7. 76 FR 16250 - Planning Resource Adequacy Assessment Reliability Standard

    Science.gov (United States)

    2011-03-23

    ... planning horizon. The Commission agrees with Borlick's comment, and emphasizes that any type of demand...; [Order No. 747] Planning Resource Adequacy Assessment Reliability Standard. AGENCY: Federal Energy... (Planning Resource Adequacy Analysis, Assessment and... [[Page 16251]]

  8. Primordial nucleosynthesis: Beyond the standard model

    International Nuclear Information System (INIS)

    Malaney, R.A.

    1991-01-01

    Non-standard primordial nucleosynthesis merits continued study for several reasons. First and foremost are the important implications determined from primordial nucleosynthesis regarding the composition of the matter in the universe. Second, the production and the subsequent observation of the primordial isotopes is the most direct experimental link with the early (t ≲ 1 sec) universe. Third, studies of primordial nucleosynthesis allow for important, and otherwise unattainable, constraints on many aspects of particle physics. Finally, there is tentative evidence which suggests that the Standard Big Bang (SBB) model is incorrect in that it cannot reproduce the inferred primordial abundances for a single value of the baryon-to-photon ratio. Reviewed here are some aspects of non-standard primordial nucleosynthesis which mostly overlap with the author's own interests. He begins with a short discussion of the SBB nucleosynthesis theory, highlighting some recent related developments. Next he discusses how recent observations of helium and lithium abundances may indicate looming problems for the SBB model. He then discusses how the QCD phase transition, neutrinos, and cosmic strings can influence primordial nucleosynthesis. He concludes with a short discussion of the multitude of other non-standard nucleosynthesis models found in the literature, and makes some comments on possible progress in the future. 58 refs., 7 figs., 2 tabs

  9. Physical Activity Stories: Assessing the "Meaning Standard" in Physical Education

    Science.gov (United States)

    Johnson, Tyler G.

    2016-01-01

    The presence of the "meaning standard" in both national and state content standards suggests that professionals consider it an important outcome of a quality physical education program. However, only 10 percent of states require an assessment to examine whether students achieve this standard. The purpose of this article is to introduce…

  10. Study on Standard Fatigue Vehicle Load Model

    Science.gov (United States)

    Huang, H. Y.; Zhang, J. P.; Li, Y. H.

    2018-02-01

    Based on measured truck data from three arterial expressways in Guangdong Province, a statistical analysis of truck weight was conducted according to axle number. A standard fatigue vehicle model applicable to areas in the middle and late stages of industrialization was obtained, adopting the equivalent-damage principle, Miner's linear accumulation law, the water discharge method, and damage-ratio theory. Compared with the fatigue vehicle model specified by the current bridge design code, the proposed model has better applicability. It is of reference value for the fatigue design of bridges in China.
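The core of such a derivation is Miner's linear damage accumulation and the damage-equivalence principle the abstract cites. The sketch below shows both in their textbook form; the axle-load spectrum, cycle counts, and the S-N exponent m = 3 are made-up example values, not data from the study.

```python
# Illustrative sketch of Miner's rule and damage-equivalent load, the tools
# named in the abstract for deriving a standard fatigue vehicle.
# All numeric inputs below are invented example values.

def miner_damage(cycles_per_level, cycles_to_failure):
    """Total damage D = sum(n_i / N_i); failure is predicted at D >= 1."""
    return sum(n / N for n, N in zip(cycles_per_level, cycles_to_failure))

def equivalent_load(loads, counts, m=3):
    """Damage-equivalent constant load: Q_e = (sum(n_i * Q_i^m) / sum(n_i))^(1/m).

    m is the slope of the assumed S-N curve (m = 3 is common for welded
    steel details); a constant-amplitude load Q_e applied sum(n_i) times
    causes the same Miner damage as the measured spectrum.
    """
    total = sum(counts)
    return (sum(n * q**m for q, n in zip(loads, counts)) / total) ** (1 / m)

# Example axle-load spectrum (kN) with observed cycle counts.
loads  = [60.0, 100.0, 140.0]
counts = [5000, 2000, 500]
print(equivalent_load(loads, counts))           # damage-equivalent axle load, kN
print(miner_damage([5000, 2000], [1e6, 2e5]))   # accumulated damage fraction
```

In a real derivation, Q_e computed per axle group from the measured spectra would then be mapped onto the axle layout of the standard fatigue vehicle.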

  11. Standardized testing: a case of annual national assessments in ...

    African Journals Online (AJOL)

    Standardized assessments are becoming the norm internationally. Governments have turned to standardized assessments due to mounting pressure from citizens for accountability, quality education, transparency, and increased public confidence in education. In South Africa, the observed learning deficiencies and prolonged ...

  12. PASSING STANDARDIZED ASSESSMENTS WITH FADING PROMPTS

    Directory of Open Access Journals (Sweden)

    Amy Marie GREENE

    2015-11-01

    Introduction: The No Child Left Behind Act of 2001 mandates that all students perform at a level of proficient on state assessments. This includes students with learning and intellectual disabilities, who are inherently performing below grade level. Given that schools are held accountable for meeting these goals and some states are not allowing students to graduate if they do not pass the assessments, this is a large concern for students, parents, teachers, and administration. Method: Forty-five students with a disability in writing or an intellectual disability participated in this quasi-experimental, single-group, pretest-posttest design that evaluated the effectiveness of the Fading Prompts through Graphic Organizers method for students with learning and intellectual disabilities in written expression, as measured according to the Pennsylvania System of School Assessment. Results: Data analyses were conducted through the use of four dichotomies for percent differences, which compared teacher-administered pretests and posttests, pretests and the state-administered PSSA, teacher-administered posttests and the PSSA, and the participants' PSSA and the average state PSSA score. All forty-five students performed at a below-basic level during baseline and a proficient level on the posttest. The learned skills generalized to the PSSA, with forty-three students earning a passing score of proficient, while two students advanced to basic. Conclusion: Based on the outcomes of this study, it is highly recommended that this program be utilized, at least for students with learning and intellectual disabilities, until further research can be done.

  13. Assessing the Genetics Content in the Next Generation Science Standards

    OpenAIRE

    Lontok, Katherine S.; Zhang, Hubert; Dougherty, Michael J.

    2015-01-01

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using...

  14. Superconnections: an interpretation of the standard model

    Directory of Open Access Journals (Sweden)

    Gert Roepstorff

    2000-07-01

    The mathematical framework of superbundles as pioneered by D. Quillen suggests that one consider the Higgs field as a natural constituent of a superconnection. I propose to take as superbundle the exterior algebra obtained from a Hermitian vector bundle of rank n, where n=2 for the electroweak theory and n=5 for the full Standard Model. The present setup is similar to but avoids the use of non-commutative geometry.

  15. Status of the electroweak standard model

    International Nuclear Information System (INIS)

    Haidt, D.

    1990-01-01

    It is the aim of this report to confront the results extracted from the experiments in each sector with the electroweak standard model in its minimal form (QFD), to search for internal inconsistencies and, if not found, to obtain best values for the electroweak couplings together with constraints on the as yet unobserved top quark. The e+e− data of the three TRISTAN experiments, even though partly preliminary, are now systematically included in the fits. (orig./HSI)

  16. Indoorgml - a Standard for Indoor Spatial Modeling

    Science.gov (United States)

    Li, Ki-Joune

    2016-06-01

    With recent progress in mobile devices and indoor positioning technologies, it has become possible to provide location-based services in indoor space as well as outdoor space, either seamlessly across indoor and outdoor spaces or independently for indoor space alone. However, we cannot simply apply spatial models developed for outdoor space to indoor space due to their differences. For example, coordinate reference systems are employed to indicate a specific position in outdoor space, while a location in indoor space is rather specified by a cell identifier such as a room number. Unlike outdoor space, the distance between two points in indoor space is determined not by the length of the straight line but by the constraints imposed by indoor components such as walls, stairs, and doors. For this reason, we need to establish a new framework for indoor space, from the fundamental theoretical basis and indoor spatial data models to information systems that store, manage, and analyse indoor spatial data. In order to provide this framework, an international standard, called IndoorGML, has been developed and published by the OGC (Open Geospatial Consortium). This standard is based on a cellular notion of space, which considers an indoor space as a set of non-overlapping cells. It consists of two types of modules: a core module and extension modules. While the core module consists of four basic conceptual and implementation modeling components (a geometric model for cells, topology between cells, a semantic model of cells, and a multi-layered space model), extension modules may be defined on top of the core module to support an application area. For the first version of the standard, we provide an extension for indoor navigation.
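
    The cellular notion described above can be sketched in a few lines of Python. This is a minimal illustration, not the actual IndoorGML schema; the cell identifiers, semantics, and class names below are assumptions made for the example. Cells carry semantics, topology is an adjacency graph built from doors and passages, and "distance" is computed over that graph rather than as a straight line.

```python
from collections import deque

class Cell:
    """One non-overlapping indoor cell, identified by name, not coordinates."""
    def __init__(self, cell_id, semantic):
        self.cell_id = cell_id          # e.g. a room number
        self.semantic = semantic        # "room", "corridor", "door", ...

class IndoorModel:
    def __init__(self):
        self.cells = {}
        self.adjacency = {}             # cell_id -> set of connected cell_ids

    def add_cell(self, cell_id, semantic):
        self.cells[cell_id] = Cell(cell_id, semantic)
        self.adjacency.setdefault(cell_id, set())

    def connect(self, a, b):
        # Connectivity exists only where a door or passage joins two cells.
        self.adjacency[a].add(b)
        self.adjacency[b].add(a)

    def hop_distance(self, start, goal):
        # BFS over the connectivity graph: distance is constrained by walls
        # and doors, not by Euclidean length.
        visited, queue = {start}, deque([(start, 0)])
        while queue:
            cell, dist = queue.popleft()
            if cell == goal:
                return dist
            for nxt in self.adjacency[cell]:
                if nxt not in visited:
                    visited.add(nxt)
                    queue.append((nxt, dist + 1))
        return None                     # unreachable

m = IndoorModel()
for cid, sem in [("R101", "room"), ("C1", "corridor"), ("R102", "room")]:
    m.add_cell(cid, sem)
m.connect("R101", "C1")
m.connect("C1", "R102")
print(m.hop_distance("R101", "R102"))  # 2: via the corridor, not through the wall
```

    Here the distance between R101 and R102 is two hops via the corridor; the short straight line through their shared wall plays no role, which is exactly the point of the cell-based model.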

  17. Beyond the standard model in many directions

    Energy Technology Data Exchange (ETDEWEB)

    Chris Quigg

    2004-04-28

    These four lectures constitute a gentle introduction to what may lie beyond the standard model of quarks and leptons interacting through SU(3)_c ⊗ SU(2)_L ⊗ U(1)_Y gauge bosons, prepared for an audience of graduate students in experimental particle physics. In the first lecture, I introduce a novel graphical representation of the particles and interactions, the double simplex, to elicit questions that motivate our interest in physics beyond the standard model, without recourse to equations and formalism. Lecture 2 is devoted to a short review of the current status of the standard model, especially the electroweak theory, which serves as the point of departure for our explorations. The third lecture is concerned with unified theories of the strong, weak, and electromagnetic interactions. In the fourth lecture, I survey some attempts to extend and complete the electroweak theory, emphasizing some of the promise and challenges of supersymmetry. A short concluding section looks forward.

  18. Assessing the Genetics Content in the Next Generation Science Standards.

    Directory of Open Access Journals (Sweden)

    Katherine S Lontok

    Full Text Available Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  19. Assessing the Genetics Content in the Next Generation Science Standards.

    Science.gov (United States)

    Lontok, Katherine S; Zhang, Hubert; Dougherty, Michael J

    2015-01-01

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  20. Standard Model backgrounds to supersymmetry searches

    CERN Document Server

    Mangano, Michelangelo L

    2009-01-01

    This work presents a review of the Standard Model sources of backgrounds to the search for supersymmetry signals. Depending on the specific model, typical signals may include jets, leptons, and missing transverse energy due to the escaping lightest supersymmetric particle. We focus on the simplest case of multijets and missing energy, since this allows us to expose most of the issues common to other, more complex cases. The review is not exhaustive, and is aimed at collecting a series of general comments and observations, to serve as a guideline for the process that will lead to a complete experimental determination of the size and features of such SM processes.

  1. Performance Standards: Utility for Different Uses of Assessments

    Directory of Open Access Journals (Sweden)

    Robert L. Linn

    2003-09-01

    Full Text Available Performance standards are arguably one of the most controversial topics in educational measurement. There are uses of assessments such as licensure and certification where performance standards are essential. There are many other uses, however, where performance standards have been mandated or become the preferred method of reporting assessment results where the standards are not essential to the use. Distinctions between essential and nonessential uses of performance standards are discussed. It is argued that the insistence on reporting in terms of performance standards in situations where they are not essential has been more harmful than helpful. Variability in the definitions of proficient academic achievement by states for purposes of the No Child Left Behind Act of 2001 is discussed and it is argued that the variability is so great that characterizing achievement is meaningless. Illustrations of the great uncertainty in standards are provided.

  2. A new accurate 3D measurement tool to assess the range of motion of the tongue in oral cancer patients: a standardized model

    NARCIS (Netherlands)

    van Dijk, Simone; van Alphen, M.J.A.; Jacobi, Irene; Smeele, Ludwig E.; van der Heijden, Ferdinand; Balm, Alfonsus Jacobus Maria; Balm, Alfons J.M.

    2016-01-01

    In oral cancer treatment, function loss such as speech and swallowing deterioration can be severe, mostly due to reduced lingual mobility. Until now, there is no standardized measurement tool for tongue mobility and pre-operative prediction of function loss is based on expert opinion instead of

  3. Experimentally testing the standard cosmological model

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N. (Chicago Univ., IL (USA) Fermi National Accelerator Lab., Batavia, IL (USA))

    1990-11-01

    The standard model of cosmology, the big bang, is now being tested and confirmed to remarkable accuracy. Recent high precision measurements relate to the microwave background and big bang nucleosynthesis. This paper focuses on the latter since that relates more directly to high energy experiments. In particular, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard cosmology scenario. Discussion is presented on the improved light element observational data as well as the improved neutron lifetime data. Alternate nucleosynthesis scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the standard model conclusion that Ω_b ∼ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible < Ω_b. Recent accelerator constraints on non-baryonic matter are discussed, showing that any massive cold dark matter candidate must now have a mass M_x ≳ 20 GeV and an interaction weaker than the Z⁰ coupling to a neutrino. It is also noted that recent hints regarding the solar neutrino experiments coupled with the see-saw model for ν-masses may imply that the ν_τ is a good hot dark matter candidate. 73 refs., 5 figs.

  4. Skewness of the standard model possible implications

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Brene, N.

    1989-09-01

    In this paper we consider combinations of a gauge algebra and a set of rules for the quantization of gauge charges. We show that the combination of the algebra of the standard model and the rule satisfied by the electric charges of the quarks and leptons has an exceptionally high degree of a kind of asymmetry which we call skewness. Assuming that skewness has physical significance and adding two other rather plausible assumptions, we may conclude that space-time must have a non-simply-connected topology at very small distances. Such a topology would allow a kind of symmetry breakdown leading to a more skew combination of gauge algebra and set of quantization rules. (orig.)

  5. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    Albeverio, S.

    1984-01-01

    We give an elementary introduction to non standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Hoeegh-Krohn and T. Lindstroeem. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2 and on the (φ²)²_d-model of interacting quantum fields. (orig.)

  6. Search for the standard model Higgs boson

    Science.gov (United States)

    Buskulic, D.; de Bonis, I.; Decamp, D.; Ghez, P.; Goy, C.; Lees, J.-P.; Minard, M.-N.; Pietrzyk, B.; Ariztizabal, F.; Comas, P.; Crespo, J. M.; Delfino, M.; Efthymiopoulos, I.; Fernandez, E.; Fernandez-Bosman, M.; Gaitan, V.; Garrido, Ll.; Mattison, T.; Pacheco, A.; Padilla, C.; Pascual, A.; Creanza, D.; de Palma, M.; Farilla, A.; Iaselli, G.; Maggi, G.; Natali, S.; Nuzzo, S.; Quattromini, M.; Ranieri, A.; Raso, G.; Romano, F.; Ruggieri, F.; Selvaggi, G.; Silvestris, L.; Tempesta, P.; Zito, G.; Chai, Y.; Hu, H.; Huang, D.; Huang, X.; Lin, J.; Wang, T.; Xie, Y.; Xu, D.; Xu, R.; Zhang, J.; Zhang, L.; Zhao, W.; Blucher, E.; Bonvicini, G.; Boudreau, J.; Casper, D.; Drevermann, H.; Forty, R. W.; Ganis, G.; Gay, C.; Hagelberg, R.; Harvey, J.; Hilgart, J.; Jacobsen, R.; Jost, B.; Knobloch, J.; Lehraus, I.; Lohse, T.; Maggi, M.; Markou, C.; Martinez, M.; Mato, P.; Meinhard, H.; Minten, A.; Miotto, A.; Miguel, R.; Moser, H.-G.; Palazzi, P.; Pater, J. R.; Perlas, J. A.; Pusztaszeri, J.-F.; Ranjard, F.; Redlinger, G.; Rolandi, L.; Rothberg, J.; Ruan, T.; Saich, M.; Schlatter, D.; Schmelling, M.; Sefkow, F.; Tejessy, W.; Tomalin, I. R.; Veenhof, R.; Wachsmuth, H.; Wasserbaech, S.; Wiedenmann, W.; Wildish, T.; Witzeling, W.; Wotschack, J.; Ajaltouni, Z.; Badaud, F.; Bardadin-Otwinowska, M.; El Fellous, R.; Falvard, A.; Gay, P.; Guicheney, C.; Henrard, P.; Jousset, J.; Michel, B.; Montret, J.-C.; Pallin, D.; Perret, P.; Podlyski, F.; Proriol, J.; Prulhière, F.; Saadi, F.; Fearnley, T.; Hansen, J. B.; Hansen, J. D.; Hansen, J. R.; Hansen, P. H.; Møllerud, R.; Nilsson, B. S.; Kyriakis, A.; Simopoulou, E.; Siotis, I.; Vayaki, A.; Zachariadou, K.; Badier, J.; Blondel, A.; Bonneaud, G.; Brient, J. C.; Fouque, G.; Orteu, S.; Rougé, A.; Rumpf, M.; Tanaka, R.; Verderi, M.; Videau, H.; Candlin, D. J.; Parsons, M. 
I.; Veitch, E.; Focardi, E.; Moneta, L.; Parrini, G.; Corden, M.; Georgiopoulos, C.; Ikeda, M.; Levinthal, D.; Antonelli, A.; Baldini, R.; Bencivenni, G.; Bologna, G.; Bossi, F.; Campana, P.; Capon, G.; Cerutti, F.; Chiarella, V.; D'Ettorre-Piazzoli, B.; Felici, G.; Laurelli, P.; Mannocchi, G.; Murtas, F.; Murtas, G. P.; Passalacqua, L.; Pepe-Altarelli, M.; Picchi, P.; Colrain, P.; Ten Have, I.; Lynch, J. G.; Maitland, W.; Morton, W. T.; Raine, C.; Reeves, P.; Scarr, J. M.; Smith, K.; Thompson, A. S.; Turnbull, R. M.; Brandl, B.; Braun, O.; Geweniger, C.; Hanke, P.; Hepp, V.; Kluge, E. E.; Maumary, Y.; Putzer, A.; Rensch, B.; Stahl, A.; Tittel, K.; Wunsch, M.; Beuselinck, R.; Binnie, D. M.; Cameron, W.; Cattaneo, M.; Colling, D. J.; Dornan, P. J.; Greene, A. M.; Hassard, J. F.; Lieske, N. M.; Moutoussi, A.; Nash, J.; Patton, S.; Payne, D. G.; Phillips, M. J.; San Martin, G.; Sedgbeer, J. K.; Wright, A. G.; Girtler, P.; Kuhn, D.; Rudolph, G.; Vogl, R.; Bowdery, C. K.; Brodbeck, T. J.; Finch, A. J.; Foster, F.; Hughes, G.; Jackson, D.; Keemer, N. R.; Nuttall, M.; Patel, A.; Sloan, T.; Snow, S. W.; Whelan, E. P.; Kleinknecht, K.; Raab, J.; Renk, B.; Sander, H.-G.; Schmidt, H.; Steeg, F.; Walther, S. M.; Wanke, R.; Wolf, B.; Bencheikh, A. M.; Benchouk, C.; Bonissent, A.; Carr, J.; Coyle, P.; Drinkard, J.; Etienne, F.; Nicod, D.; Papalexiou, S.; Payre, P.; Roos, L.; Rousseau, D.; Schwemling, P.; Talby, M.; Adlung, S.; Assmann, R.; Bauer, C.; Blum, W.; Brown, D.; Cattaneo, P.; Dehning, B.; Dietl, H.; Dydak, F.; Frank, M.; Halley, A. W.; Jakobs, K.; Lauber, J.; Lütjens, G.; Lutz, G.; Männer, W.; Richter, R.; Schröder, J.; Schwarz, A. S.; Settles, R.; Seywerd, H.; Stierlin, U.; Stiegler, U.; Dennis, R. St.; Wolf, G.; Alemany, R.; Boucrot, J.; Callot, O.; Cordier, A.; Davier, M.; Duflot, L.; Grivaz, J.-F.; Heusse, Ph.; Jaffe, D. E.; Janot, P.; Kim, D. 
W.; Le Diberder, F.; Lefrançois, J.; Lutz, A.-M.; Schune, M.-H.; Veillet, J.-J.; Videau, I.; Zhang, Z.; Abbaneo, D.; Bagliesi, G.; Batignani, G.; Bottigli, U.; Bozzi, C.; Calderini, G.; Carpinelli, M.; Ciocci, M. A.; Dell'Orso, R.; Ferrante, I.; Fidecaro, F.; Foà, L.; Forti, F.; Giassi, A.; Giorgi, M. A.; Gregorio, A.; Ligabue, F.; Lusiani, A.; Manneli, E. B.; Marrocchesi, P. S.; Messineo, A.; Palla, F.; Rizzo, G.; Sanguinetti, G.; Spagnolo, P.; Steinberger, J.; Techini, R.; Tonelli, G.; Triggiani, G.; Vannini, C.; Venturi, A.; Verdini, P. G.; Walsh, J.; Betteridge, A. P.; Gao, Y.; Green, M. G.; March, P. V.; Mir, Ll. M.; Medcalf, T.; Quazi, I. S.; Strong, J. A.; West, L. R.; Botterill, D. R.; Clifft, R. W.; Edgecock, T. R.; Haywood, S.; Norton, P. R.; Thompson, J. C.; Bloch-Devaux, B.; Colas, P.; Duarte, H.; Emery, S.; Kozanecki, W.; Lançon, E.; Lemaire, M. C.; Locci, E.; Marx, B.; Perez, P.; Rander, J.; Renardy, J.-F.; Rosowsky, A.; Roussarie, A.; Schuller, J.-P.; Schwindling, J.; Si Mohand, D.; Vallage, B.; Johnson, R. P.; Litke, A. M.; Taylor, G.; Wear, J.; Ashman, J. G.; Babbage, W.; Booth, C. N.; Buttar, C.; Cartwright, S.; Combley, F.; Dawson, I.; Thompson, L. F.; Barberio, E.; Böhrer, A.; Brandt, S.; Cowan, G.; Grupen, C.; Lutters, G.; Rivera, F.; Schäfer, U.; Smolik, L.; Bosisio, L.; Della Marina, R.; Giannini, G.; Gobbo, B.; Ragusa, F.; Bellantoni, L.; Chen, W.; Conway, J. S.; Feng, Z.; Ferguson, D. P. S.; Gao, Y. S.; Grahl, J.; Harton, J. L.; Hayes, O. J.; Nachtman, J. M.; Pan, Y. B.; Saadi, Y.; Schmitt, M.; Scott, I.; Sharma, V.; Shi, Z. H.; Turk, J. D.; Walsh, A. M.; Weber, F. V.; Sau Lan Wu; Wu, X.; Zheng, M.; Zobernig, G.; Aleph Collaboration

    1993-08-01

    Using a data sample corresponding to about 1 233 000 hadronic Z decays collected by the ALEPH experiment at LEP, the reaction e⁺e⁻ → HZ* has been used to search for the standard model Higgs boson, in association with missing energy when Z* → νν̄, or with a pair of energetic leptons when Z* → e⁺e⁻ or μ⁺μ⁻. No signal was found and, at the 95% confidence level, m_H exceeds 58.4 GeV/c².

  7. Supporting Student Teachers' Professional Learning with Standards-Referenced Assessment

    Science.gov (United States)

    Tang, Sylvia Yee Fan; Cheng, May May Hung; So, Winnie Wing Mui

    2006-01-01

    Professional standards in teaching are developed in many education systems, with professional learning and quality assurance being the central purposes of these standards. This paper presents an initiative in developing a professional development progress map (hereafter, progress map) within a learning-oriented field experience assessment (LOFEA) …

  8. Standardization of figures and assessment procedures for DTM vertical accuracy

    Directory of Open Access Journals (Sweden)

    Vittorio Casella

    2015-07-01

    Full Text Available Digital Terrain Models (DTMs) are widely used in many sectors. They play a key role in hydrological risk prevention, risk mitigation and numeric simulations. This paper deals with two questions: (i) when it is stated that a DTM has a given vertical accuracy, is this assertion univocal? (ii) when DTM vertical accuracy is assessed by means of checkpoints, does their location influence the results? First, the paper illustrates that two vertical accuracy definitions are conceivable: Vertical Accuracy at the Nodes (VAN), the average vertical distance between the model and the terrain evaluated at the DTM's nodes, and Vertical Accuracy at the interpolated Points (VAP), in which the vertical distance is evaluated at generic points. These two quantities are not coincident and, when they are calculated for the same DTM, different numeric values are obtained. Unfortunately, the two quantities are often interchanged, which is misleading. Second, the paper shows a simulated example of a DTM vertical accuracy assessment, highlighting that the checkpoints' location plays a key role: when checkpoints coincide with the DTM nodes, VAN is estimated; when checkpoints are randomly located, VAP is estimated instead. Third, an in-depth theoretical characterization of the two considered quantities is performed, based on symbolic computation, and suitable standardization coefficients are proposed. Finally, our discussion has a well-defined frame: it does not deal with all the items of the DTM vertical accuracy budget, which would require a much longer essay, but only with one, usually called fundamental vertical accuracy.
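
    The VAN/VAP distinction can be illustrated with a small simulation. This is a hedged sketch, not the paper's actual formulas or standardization coefficients; the 1D terrain profile, node spacing, and RMSE figure of merit are assumptions made for the example. With error-free nodes, checkpoints placed at the nodes report perfect accuracy (VAN), while randomly placed checkpoints expose the interpolation error (VAP).

```python
import random

def rmse(errors):
    """Root-mean-square of a list of vertical errors."""
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

def dtm_value(nodes_x, nodes_z, x):
    """Piecewise-linear interpolation of the DTM between its nodes."""
    for i in range(len(nodes_x) - 1):
        if nodes_x[i] <= x <= nodes_x[i + 1]:
            t = (x - nodes_x[i]) / (nodes_x[i + 1] - nodes_x[i])
            return (1 - t) * nodes_z[i] + t * nodes_z[i + 1]
    raise ValueError("x outside DTM extent")

terrain = lambda x: 0.05 * x * x          # "true" terrain (assumed known here)
nodes_x = [0, 1, 2, 3, 4]
nodes_z = [terrain(x) for x in nodes_x]   # error-free nodes for simplicity

# VAN: checkpoints coincide with the DTM nodes.
van = rmse([dtm_value(nodes_x, nodes_z, x) - terrain(x) for x in nodes_x])

# VAP: checkpoints are randomly located within the DTM extent.
random.seed(0)
pts = [random.uniform(0, 4) for _ in range(1000)]
vap = rmse([dtm_value(nodes_x, nodes_z, x) - terrain(x) for x in pts])

print(van, vap)  # VAN is zero here, VAP is not: the two figures differ
```

    Even in this idealized setting the two figures disagree, which is the paper's point: stating "the DTM has vertical accuracy X" is not univocal unless one says where the checkpoints were placed.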

  9. B physics beyond the Standard Model

    International Nuclear Information System (INIS)

    Hewett, J.A.L.

    1997-12-01

    The ability of present and future experiments to test the Standard Model in the B meson sector is described. The authors examine the loop effects of new interactions in flavor changing neutral current B decays and in Z → bb̄, concentrating on supersymmetry and the left-right symmetric model as specific examples of new physics scenarios. The procedure for performing a global fit to the Wilson coefficients which describe b → s transitions is outlined, and the results of such a fit from Monte Carlo generated data are compared to the predictions of the two sample new physics scenarios. A fit to the Zbb̄ couplings from present data is also given

  10. Complex singlet extension of the standard model

    International Nuclear Information System (INIS)

    Barger, V.; Langacker, P.; McCaskey, M.; Ramsey-Musolf, M.; Shaughnessy, G.

    2009-01-01

    We analyze a simple extension of the standard model (SM) obtained by adding a complex singlet to the scalar sector (cxSM). We show that the cxSM can contain one or two viable cold dark matter candidates and analyze the conditions on the parameters of the scalar potential that yield the observed relic density. When the cxSM potential contains a global U(1) symmetry that is both softly and spontaneously broken, it contains both a viable dark matter candidate and the ingredients necessary for a strong first order electroweak phase transition as needed for electroweak baryogenesis. We also study the implications of the model for discovery of a Higgs boson at the Large Hadron Collider

  11. Nanotechnologies: Risk assessment model

    Science.gov (United States)

    Giacobbe, F.; Monica, L.; Geraci, D.

    2009-05-01

    The development and use of nanomaterials has grown widely in recent years. Hence, it is necessary to carry out a careful and targeted risk assessment for the safety of the workers. The objective of this research is a specific assessment model for workplaces where personnel manipulate nanoparticles. This model mainly takes into account the number of exposed workers, the dimensions of the particles, the information found in the safety data sheets and the uncertainties about the danger level arising from exposure to nanomaterials. The evaluation algorithm considers normal work conditions, abnormal conditions (e.g. a broken air filter) and emergency situations (e.g. package cracking). It has been necessary to define several risk conditions in order to quantify the risk by increasing levels ("low", "middle" and "high"). Each level includes appropriate behavioural procedures. In particular, for the high level it is advisable that the user carry out urgent interventions to reduce the risk level (e.g. the use of a vacuum box for manipulation, high-efficiency protective PPE, etc.). The model has been implemented in a research laboratory where titanium dioxide and carbon nanotubes are used. The outcomes of this specific evaluation gave a risk level equal to middle.

  12. Nanotechnologies: Risk assessment model

    International Nuclear Information System (INIS)

    Giacobbe, F; Monica, L; Geraci, D

    2009-01-01

    The development and use of nanomaterials has grown widely in recent years. Hence, it is necessary to carry out a careful and targeted risk assessment for the safety of the workers. The objective of this research is a specific assessment model for workplaces where personnel manipulate nanoparticles. This model mainly takes into account the number of exposed workers, the dimensions of the particles, the information found in the safety data sheets and the uncertainties about the danger level arising from exposure to nanomaterials. The evaluation algorithm considers normal work conditions, abnormal conditions (e.g. a broken air filter) and emergency situations (e.g. package cracking). It has been necessary to define several risk conditions in order to quantify the risk by increasing levels ('low', 'middle' and 'high'). Each level includes appropriate behavioural procedures. In particular, for the high level it is advisable that the user carry out urgent interventions to reduce the risk level (e.g. the use of a vacuum box for manipulation, high-efficiency protective PPE, etc.). The model has been implemented in a research laboratory where titanium dioxide and carbon nanotubes are used. The outcomes of this specific evaluation gave a risk level equal to middle.
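
    The kind of level-based scoring the two records above describe can be sketched as follows. The factors, weights and thresholds below are illustrative assumptions, not the authors' published model; the sketch only shows the general shape of such an algorithm: score a handful of workplace factors, then map the total to "low" / "middle" / "high".

```python
def risk_level(n_exposed_workers, particle_size_nm, sds_info_available,
               condition="normal"):
    """Map workplace factors to a 'low' / 'middle' / 'high' risk level.

    All weights and cut-offs are hypothetical, chosen for illustration.
    """
    score = 0
    score += 2 if n_exposed_workers > 10 else 1
    score += 2 if particle_size_nm < 100 else 1   # nanoscale particles weigh more
    score += 2 if not sds_info_available else 0   # missing safety-data-sheet info
    score += {"normal": 0, "abnormal": 2, "emergency": 4}[condition]
    if score >= 6:
        return "high"      # urgent interventions: vacuum box, high-efficiency PPE
    if score >= 3:
        return "middle"
    return "low"

# A small lab handling carbon nanotubes under normal conditions, SDS available:
print(risk_level(n_exposed_workers=5, particle_size_nm=30,
                 sds_info_available=True))  # middle
```

    Note how abnormal and emergency conditions add to the score, so the same workplace can move up a level when, say, an air filter breaks, mirroring the three operating regimes the abstract distinguishes.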

  13. Standard setting and quality of assessment: A conceptual approach ...

    African Journals Online (AJOL)

    Quality performance standards and the effect of assessment outcomes are important in the educational milieu, as assessment remains the representative ... not be seen as a methodological process of setting pass/fail cut-off points only, but as a powerful catalyst for quality improvements in HPE by promoting excellence in ...

  14. 24 CFR 115.206 - Performance assessments; Performance standards.

    Science.gov (United States)

    2010-04-01

    Title 24, Housing and Urban Development, Section 115.206 — Performance assessments; Performance standards. Housing and Urban Development Regulations Relating to Housing... Certification of Substantially Equivalent Agencies § 115.206 Performance assessments; Performance...

  15. Integrated Assessment Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Edmonds, James A.; Calvin, Katherine V.; Clarke, Leon E.; Janetos, Anthony C.; Kim, Son H.; Wise, Marshall A.; McJeon, Haewon C.

    2012-10-31

    This paper discusses the role of Integrated Assessment Models (IAMs) in climate change research. IAMs are an interdisciplinary research platform, which constitutes a consistent scientific framework in which the large-scale interactions between human and natural Earth systems can be examined. In so doing, IAMs provide insights that would otherwise be unavailable from traditional single-discipline research. By providing a broader view of the issue, IAMs constitute an important tool for decision support. IAMs are also a home for human Earth system research and provide natural Earth system scientists with information about the nature of human intervention in global biogeophysical and geochemical processes.

  16. [Standardization and modeling of surgical processes].

    Science.gov (United States)

    Strauss, G; Schmitz, P

    2016-12-01

    Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents state-of-the-art technology as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for the implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures. This may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization, so that the effectiveness and efficiency of treatment can be improved; however, care must be taken to avoid detrimental consequences, such as loss of skills and placing too much faith in technology, through adapted training concepts.

  17. Consistency test of the standard model

    International Nuclear Information System (INIS)

    Pawlowski, M.; Raczka, R.

    1997-01-01

    If the 'Higgs mass' is not the physical mass of a real particle but rather an effective ultraviolet cutoff, then a process-energy dependence of this cutoff must be admitted. Precision data from at least two energy-scale experimental points are necessary to test this hypothesis. The first set of precision data is provided by the Z-boson peak experiments. We argue that the second set can be given by 10-20 GeV e⁺e⁻ colliders. We pay attention to the special role of tau polarization experiments, which can be sensitive to the 'Higgs mass' for a sample of ∼10⁸ produced tau pairs. We argue that such a study may be regarded as a negative self-consistency test of the Standard Model and of most of its extensions

  18. Symmetry breaking: The standard model and superstrings

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1988-01-01

    The outstanding unresolved issue of the highly successful standard model is the origin of electroweak symmetry breaking and of the mechanism that determines its scale, namely the vacuum expectation value (vev) v that is fixed by experiment at the value v² = 4m_W²/g² = (√2 G_F)⁻¹, i.e. v ≈ 1/4 TeV. In this talk I will discuss aspects of two approaches to this problem. One approach is straightforward and down to earth: the search for experimental signatures, as discussed previously by Pierre Darriulat. This approach covers the energy scales accessible to future and present laboratory experiments: roughly (10⁻⁹ - 10³) GeV. The second approach involves theoretical speculations, such as technicolor and supersymmetry, that attempt to explain the TeV scale. 23 refs., 5 figs

  19. Symmetry breaking: The standard model and superstrings

    Energy Technology Data Exchange (ETDEWEB)

    Gaillard, M.K.

    1988-08-31

    The outstanding unresolved issue of the highly successful standard model is the origin of electroweak symmetry breaking and of the mechanism that determines its scale, namely the vacuum expectation value (vev) v that is fixed by experiment at the value v² = 4m_W²/g² = (√2 G_F)⁻¹, i.e. v ≈ 1/4 TeV. In this talk I will discuss aspects of two approaches to this problem. One approach is straightforward and down to earth: the search for experimental signatures, as discussed previously by Pierre Darriulat. This approach covers the energy scales accessible to future and present laboratory experiments: roughly (10⁻⁹ - 10³) GeV. The second approach involves theoretical speculations, such as technicolor and supersymmetry, that attempt to explain the TeV scale. 23 refs., 5 figs.

  20. Outstanding questions: physics beyond the Standard Model

    CERN Document Server

    Ellis, John

    2012-01-01

    The Standard Model of particle physics agrees very well with experiment, but many important questions remain unanswered, among them are the following. What is the origin of particle masses and are they due to a Higgs boson? How does one understand the number of species of matter particles and how do they mix? What is the origin of the difference between matter and antimatter, and is it related to the origin of the matter in the Universe? What is the nature of the astrophysical dark matter? How does one unify the fundamental interactions? How does one quantize gravity? In this article, I introduce these questions and discuss how they may be addressed by experiments at the Large Hadron Collider, with particular attention to the search for the Higgs boson and supersymmetry.

  1. Standard model fermions and N=8 supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Nicolai, Hermann [Max Planck Institute for Gravitational Physics (Albert Einstein Institute), Am Muehlenberg 1, Potsdam-Golm (Germany)

    2016-07-01

    In a scheme originally proposed by Gell-Mann, and subsequently shown to be realized at the SU(3) x U(1) stationary point of maximal gauged SO(8) supergravity, the 48 spin-1/2 fermions of the theory remaining after the removal of eight Goldstinos can be identified with the 48 quarks and leptons (including right-chiral neutrinos) of the Standard Model, provided one identifies the residual SU(3) with the diagonal subgroup of the color group SU(3)_c and a family symmetry SU(3)_f. However, there remained a systematic mismatch in the electric charges by a spurion charge of ±1/6. We here identify the 'missing' U(1) that rectifies this mismatch, show that it takes a surprisingly simple, though unexpected, form, and show how it is related to the conjectured R symmetry K(E10) of M Theory.

  2. CMS standard model Higgs boson results

    Directory of Open Access Journals (Sweden)

    Garcia-Abia Pablo

    2013-11-01

    Full Text Available In July 2012 CMS announced the discovery of a new boson with properties resembling those of the long-sought Higgs boson. The analysis of the proton-proton collision data recorded by the CMS detector at the LHC, corresponding to integrated luminosities of 5.1 fb−1 at √s = 7 TeV and 19.6 fb−1 at √s = 8 TeV, confirm the Higgs-like nature of the new boson, with a signal strength associated with vector bosons and fermions consistent with the expectations for a standard model (SM) Higgs boson, and spin-parity clearly favouring the scalar nature of the new boson. In this note I review the updated results of the CMS experiment.

  3. The standard model 30 years of glory

    International Nuclear Information System (INIS)

    Lefrancois, J.

    2001-03-01

    In these 3 lectures the author reviews the achievements of the past 30 years, which saw the birth and the detailed confirmation of the standard model. The first lecture is dedicated to quantum chromodynamics (QCD), deep inelastic scattering, neutrino scattering results, R(e⁺e⁻), scaling violation, Drell-Yan reactions and the observation of jets. The second lecture deals with weak interactions and quark and lepton families; the discoveries of the W and Z bosons, of charm, of the tau lepton and of B quarks are detailed. The third lecture focuses on the stunning progress that has been made in accuracy concerning detectors: the typical level of accuracy of previous e⁺e⁻ experiments was about 5-10%, while the accuracy obtained at LEP/SLC is of order 0.1% to 0.5%. (A.C.)

  4. Budget impact model in moderate-to-severe psoriasis vulgaris assessing effects of calcipotriene and betamethasone dipropionate foam on per-patient standard of care costs.

    Science.gov (United States)

    Asche, Carl V; Kim, Minchul; Feldman, Steven R; Zografos, Panagiotis; Lu, Minyi

    2017-09-01

    To develop a budget impact model (BIM) for estimating the financial impact of formulary adoption and uptake of calcipotriene and betamethasone dipropionate (C/BD) foam (0.005%/0.064%) on the costs of biologics for treating moderate-to-severe psoriasis vulgaris in a hypothetical US healthcare plan with 1 million members. This BIM incorporated epidemiologic data, market uptake assumptions, and drug utilization costs, simulating the treatment mix for patients who are candidates for biologics before (Scenario #1) and after (Scenario #2) the introduction of C/BD foam. Predicted outcomes were expressed in terms of the annual cost of treatment (COT) and the COT per member per month (PMPM). At year 1, C/BD foam had the lowest per-patient cost ($9,913) necessary to achieve a Psoriasis Area and Severity Index (PASI)-75 response compared with etanercept ($73,773), adalimumab ($92,871), infliximab ($34,048), ustekinumab ($83,975), secukinumab ($113,858), apremilast ($47,960), and ixekizumab ($62,707). Following addition of C/BD foam to the formulary, the annual COT for moderate-to-severe psoriasis would decrease by $36,112,572 (17.91%, from $201,621,219 to $165,508,647). The COT PMPM is expected to decrease by $3.00 (17.86%, from $16.80 to $13.80). Drug costs were based on Medi-Span reference pricing (January 21, 2016); differences in treatment costs for drug administration, laboratory monitoring, or adverse events were not accounted for. Potentially confounding were the definition of "moderate-to-severe" and the heterogeneous efficacy data. The per-patient cost for PASI-75 response at year 1 was estimated from short-term efficacy data for C/BD foam and apremilast only. The introduction of C/BD foam is expected to decrease the annual COT for moderate-to-severe psoriasis treatable with biologics by $36,112,572 for a hypothetical US healthcare plan with 1 million plan members, and to lower the COT PMPM by $3.00.
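The budget arithmetic reported in this record can be reproduced directly from the figures quoted (1 million plan members, annual cost of treatment before and after formulary adoption); a minimal sketch:

```python
# All figures below are taken from the abstract above.
members = 1_000_000

cot_before = 201_621_219  # annual cost of treatment, Scenario #1 ($)
cot_after = 165_508_647   # annual cost of treatment, Scenario #2 ($)

savings = cot_before - cot_after                 # $36,112,572 as reported
pct_change = 100 * savings / cot_before          # ~17.91% as reported
pmpm_before = cot_before / members / 12          # ~$16.80 per member per month
pmpm_after = cot_after / members / 12            # ~$13.79 (the record rounds to $13.80)
print(savings, round(pct_change, 2), round(pmpm_before, 2), round(pmpm_after, 2))
```

The $3.00 PMPM decrease quoted in the record follows from the difference of the two rounded PMPM figures.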

  5. Standardized reporting for rapid relative effectiveness assessments of pharmaceuticals.

    Science.gov (United States)

    Kleijnen, Sarah; Pasternack, Iris; Van de Casteele, Marc; Rossi, Bernardette; Cangini, Agnese; Di Bidino, Rossella; Jelenc, Marjetka; Abrishami, Payam; Autti-Rämö, Ilona; Seyfried, Hans; Wildbacher, Ingrid; Goettsch, Wim G

    2014-11-01

    Many European countries perform rapid assessments of the relative effectiveness (RE) of pharmaceuticals as part of the reimbursement decision making process. Increased sharing of information on RE across countries may save costs and reduce duplication of work. The objective of this article is to describe the development of a tool for rapid assessment of RE of new pharmaceuticals that enter the market, the HTA Core Model® for Rapid Relative Effectiveness Assessment (REA) of Pharmaceuticals. Eighteen member organisations of the European Network of Health Technology Assessment (EUnetHTA) participated in the development of the model. Different versions of the model were developed and piloted in this collaboration and adjusted accordingly based on feedback on the content and feasibility of the model. The final model deviates from the traditional HTA Core Model® used for assessing other types of technologies. This is due to the limited scope (strong focus on RE), the timing of the assessment (just after market authorisation), and strict timelines (e.g. 90 days) required for performing the assessment. The number of domains and assessment elements was limited and it was decided that the primary information sources should preferably be a submission file provided by the marketing authorisation holder and the European Public Assessment Report. The HTA Core Model® for Rapid REA (version 3.0) was developed to produce standardised transparent RE information of pharmaceuticals. Further piloting can provide input for possible improvements, such as further refining the assessment elements and new methodological guidance on relevant areas.

  6. Environmental assessment. Energy efficiency standards for consumer products

    Energy Technology Data Exchange (ETDEWEB)

    McSwain, Berah

    1980-06-01

    The Energy Policy and Conservation Act of 1975 requires DOE to prescribe energy efficiency standards for 13 consumer products. The Consumer Products Efficiency Standards (CPES) program covers: refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners (cooling and heat pumps), furnaces, dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers. This Environmental Assessment evaluates the potential environmental and socioeconomic impacts expected as a result of setting efficiency standards for all of the consumer products covered by the CPES program. DOE has proposed standards for eight of the products covered by the Program in a Notice of Proposed Rulemaking (NOPR). DOE expects to propose standards for home heating equipment, central air conditioners (heat pumps only), dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers in 1981. No significant adverse environmental or socioeconomic impacts have been found to result from instituting the CPES.

  7. Primordial lithium and the standard model(s)

    International Nuclear Information System (INIS)

    Deliyannis, C.P.; Demarque, P.; Kawaler, S.D.; Krauss, L.M.; Romanelli, P.

    1989-01-01

    We present the results of new theoretical work on surface ⁷Li and ⁶Li evolution in the oldest halo stars along with a new and refined analysis of the predicted primordial lithium abundance resulting from big-bang nucleosynthesis. This allows us to determine the constraints which can be imposed upon cosmology by a consideration of primordial lithium using both standard big-bang and standard stellar-evolution models. Such considerations lead to a constraint on the baryon density today of 0.0044 < Ω_b h² < 0.025 (where the Hubble constant is 100h km sec⁻¹ Mpc⁻¹), and impose limitations on alternative nucleosynthesis scenarios.
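The bounds quoted in this record (0.0044 and 0.025, read as limits on the dimensionless combination Ω_b h²) can be converted into a physical baryon density. A hedged sketch, assuming the standard critical-density prefactor ρ_crit/h² ≈ 1.878e-29 g/cm³ (a textbook value, not from the record):

```python
# Assumed value of the critical density divided by h^2, in g/cm^3.
RHO_CRIT_OVER_H2 = 1.878e-29

low, high = 0.0044, 0.025  # bounds from the abstract, read as Omega_b * h^2
rho_low = low * RHO_CRIT_OVER_H2    # h cancels because Omega_b h^2 is quoted
rho_high = high * RHO_CRIT_OVER_H2
print(f"baryon density today: {rho_low:.2e} to {rho_high:.2e} g/cm^3")
```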

  8. Standardized patient and standardized interdisciplinary team meeting: validation of a new performance-based assessment tool.

    Science.gov (United States)

    Yuasa, Misuzu; Nagoshi, Michael; Oshiro-Wong, Celeste; Tin, Maung; Wen, Aida; Masaki, Kamal

    2014-01-01

    The interdisciplinary team (IDT) approach is critical in the care of elderly adults. Performance-based tools to assess IDT skills have not been well validated. A novel assessment tool, the standardized patient (SP) and standardized interdisciplinary team meeting (SIDTM), consisting of two stations, was developed. First, trainees evaluate a SP hospitalized after a fall. Second, trainees play the role of the physician in a standardized IDT meeting with a standardized registered nurse (SRN) and standardized medical social worker (SMSW) for discharge planning. The SP-SIDTM was administered to 52 fourth-year medical students (MS4s) and six geriatric medicine fellows (GMFs) in 2011/12. The SP, SRN, and SMSW scored trainee performance on dichotomous checklists of clinical tasks and Likert scales of communication skills, which were compared according to level of training using t-tests. Trainees rated the SP-SIDTM experience as moderately difficult, length of time about right, and believability moderate to high. Reliability was high for both cases (Cronbach α = 0.73-0.87). Interobserver correlation between SRN and SMSW checklist scores (correlation coefficient (r) = 0.82, P < .001) and total scores (r = 0.69, P < .001) were high. The overall score on the SP-SIDTM case was significantly higher for GMF (75) than for MS4 (65, P = .002). These observations support the validity of this novel assessment tool. © 2013, Copyright the Authors Journal compilation © 2013, The American Geriatrics Society.
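The internal-consistency figures quoted above (Cronbach α = 0.73-0.87) come from the standard formula α = k/(k−1) · (1 − Σσ²ᵢ/σ²_total). A minimal sketch with hypothetical checklist scores (the data below are illustrative, not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one list per item,
    same respondents in the same order). Uses population variances."""
    k = len(items)
    n = len(items[0])

    def pvar(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_vars = sum(pvar(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / pvar(totals))

# Hypothetical scores for three trainees on two checklist items:
print(cronbach_alpha([[1, 2, 3], [1, 3, 2]]))  # ~0.667 for this toy data
```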

  9. Searches for Beyond Standard Model Physics with ATLAS and CMS

    CERN Document Server

    Rompotis, Nikolaos; The ATLAS collaboration

    2017-01-01

    The exploration of the high energy frontier with ATLAS and CMS experiments provides one of the best opportunities to look for physics beyond the Standard Model. In this talk, I review the motivation, the strategy and some recent results related to beyond Standard Model physics from these experiments. The review will cover beyond Standard Model Higgs boson searches, supersymmetry and searches for exotic particles.

  10. Connected formulas for amplitudes in standard model

    Energy Technology Data Exchange (ETDEWEB)

    He, Song [CAS Key Laboratory of Theoretical Physics,Institute of Theoretical Physics, Chinese Academy of Sciences,Beijing 100190 (China); School of Physical Sciences, University of Chinese Academy of Sciences,No. 19A Yuquan Road, Beijing 100049 (China); Zhang, Yong [Department of Physics, Beijing Normal University,Beijing 100875 (China); CAS Key Laboratory of Theoretical Physics,Institute of Theoretical Physics, Chinese Academy of Sciences,Beijing 100190 (China)

    2017-03-17

    Witten’s twistor string theory has led to new representations of S-matrix in massless QFT as a single object, including Cachazo-He-Yuan formulas in general and connected formulas in four dimensions. As a first step towards more realistic processes of the standard model, we extend the construction to QCD tree amplitudes with massless quarks and those with a Higgs boson. For both cases, we find connected formulas in four dimensions for all multiplicities which are very similar to the one for Yang-Mills amplitudes. The formula for quark-gluon color-ordered amplitudes differs from the pure-gluon case only by a Jacobian factor that depends on flavors and orderings of the quarks. In the formula for Higgs plus multi-parton amplitudes, the massive Higgs boson is effectively described by two additional massless legs which do not appear in the Parke-Taylor factor. The latter also represents the first twistor-string/connected formula for form factors.

  11. Experimental tests of the standard model

    International Nuclear Information System (INIS)

    Nodulman, L.

    1998-01-01

    The title implies an impossibly broad field, as the Standard Model includes the fermion matter states, as well as the forces and fields of SU(3) x SU(2) x U(1). For practical purposes, I will confine myself to electroweak unification, as discussed in the lectures of M. Herrero. Quarks and mixing were discussed in the lectures of R. Aleksan, and leptons and mixing were discussed in the lectures of K. Nakamura. I will essentially assume universality, that is flavor independence, rather than discussing tests of it. I will not pursue tests of QED beyond noting the consistency and precision of measurements of α_EM in various processes including the Lamb shift, the anomalous magnetic moment (g-2) of the electron, and the quantum Hall effect. The fantastic precision and agreement of these predictions and measurements is something that convinces people that there may be something to this science enterprise. Also impressive is the success of the 'Universal Fermi Interaction' description of beta decay processes, or in more modern parlance, weak charged current interactions. With one coupling constant G_F, most precisely determined in muon decay, a huge number of nuclear instabilities are described. The slightly slow rate for neutron beta decay was one of the initial pieces of evidence for Cabibbo mixing, now generalized so that all charged current decays of any flavor are covered.

  12. Experimental tests of the standard model.

    Energy Technology Data Exchange (ETDEWEB)

    Nodulman, L.

    1998-11-11

    The title implies an impossibly broad field, as the Standard Model includes the fermion matter states, as well as the forces and fields of SU(3) x SU(2) x U(1). For practical purposes, I will confine myself to electroweak unification, as discussed in the lectures of M. Herrero. Quarks and mixing were discussed in the lectures of R. Aleksan, and leptons and mixing were discussed in the lectures of K. Nakamura. I will essentially assume universality, that is flavor independence, rather than discussing tests of it. I will not pursue tests of QED beyond noting the consistency and precision of measurements of α_EM in various processes including the Lamb shift, the anomalous magnetic moment (g-2) of the electron, and the quantum Hall effect. The fantastic precision and agreement of these predictions and measurements is something that convinces people that there may be something to this science enterprise. Also impressive is the success of the 'Universal Fermi Interaction' description of beta decay processes, or in more modern parlance, weak charged current interactions. With one coupling constant G_F, most precisely determined in muon decay, a huge number of nuclear instabilities are described. The slightly slow rate for neutron beta decay was one of the initial pieces of evidence for Cabibbo mixing, now generalized so that all charged current decays of any flavor are covered.

  13. Standards of Ombudsman Assessment: A New Normative Concept?

    Directory of Open Access Journals (Sweden)

    Milan Remac

    2013-07-01

    Full Text Available Today, an ombudsman is a traditional component of democratic legal systems. Generally, reports of the ombudsman are not legally binding. Due to this fact, the ombudsman can rely only on his own persuasiveness, on his acceptance by individuals and state institutions, on the understanding of the administration and on the accessibility and transparency of rules that underpin his reports. During investigations, ombudsmen assess whether the administration has acted in accordance with certain legal or extra-legal standards. Depending on the legal system, ombudsmen can investigate whether there is an instance of maladministration in the activities of administrative bodies, whether the administration has acted ‘properly’, whether it has acted in accordance with the law, whether administrative actions have breached the human rights of complainants or whether the actions of the administration were in accordance with anti-corruption rules etc. Regardless of the legislative standard of an ombudsman’s control, the ombudsman should consider and assess the situation described in complaints against certain criteria or against certain normative standards. A distinct set of standards which ombudsmen use during their investigation, or at least a clear statement of their assessment criteria, can increase the transparency of their procedures and the persuasiveness of their reports. Are the normative standards used by different ombudsmen the same? Do they possibly create a new normative concept? And can it possibly lead to a higher acceptance of their reports by the administration?

  14. STANDARDIZING QUALITY ASSESSMENT OF FUSED REMOTELY SENSED IMAGES

    Directory of Open Access Journals (Sweden)

    C. Pohl

    2017-09-01

    Full Text Available The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the chosen criteria. Depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim to provide an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  15. Standardizing Quality Assessment of Fused Remotely Sensed Images

    Science.gov (United States)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the chosen criteria. Depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim to provide an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  16. Xpand chest drain: assessing equivalence to current standard ...

    African Journals Online (AJOL)

    leakage from 'open to air' system or breakage of glass bottle (with associated risk to ... and an air-leak detection system. It is connected to a ... need to add water. Xpand chest drain: assessing equivalence to current standard therapy – a randomised controlled trial. CHARL COOPER, M.B. CH.B. TIMOTHY HARDCASTLE ...

  17. Assessment of Ethical and Other Professional Standards in Private ...

    African Journals Online (AJOL)

    Compliance of 21 private medical laboratories in Osun State with ethical and other professional standards was assessed by the authors' pre and post inspection by the Medical Laboratory Science Council of Nigeria (MLSCN). Laboratory environment, personnel, equipment and adherence to Standard Operating Procedures (SOPs) ...

  18. Evaluation of the efficiency of standard assessment for Category C ...

    African Journals Online (AJOL)

    This article evaluates the application and efficiency of the Ethiopian standard tax assessment, as enshrined in the Income Tax Regulation No. 78/2002, against the tax liability of Category C Taxpayers, commonly known as small business taxpayers, referring to the practice in Eastern Zone Administration of the Tigray ...

  19. Motivational Effects of Standardized Language Assessment on Chinese Young Learners

    Science.gov (United States)

    Zhao, Chuqiao

    2016-01-01

    This review paper examines how standardized language assessment affects Chinese young learners' motivation for second-language learning. By presenting the historical and contemporary contexts of the testing system in China, this paper seeks to demonstrate the interrelationship among cultural, social, familial, and individual factors, which…

  20. Psychosocial Assessment as a Standard of Care in Pediatric Cancer

    NARCIS (Netherlands)

    Kazak, Anne E.; Abrams, Annah N.; Banks, Jaime; Christofferson, Jennifer; DiDonato, Stephen; Grootenhuis, Martha A.; Kabour, Marianne; Madan-Swain, Avi; Patel, Sunita K.; Zadeh, Sima; Kupst, Mary Jo

    2015-01-01

    This paper presents the evidence for a standard of care for psychosocial assessment in pediatric cancer. An interdisciplinary group of investigators utilized EBSCO, PubMed, PsycINFO, Ovid, and Google Scholar search databases, focusing on five areas: youth/family psychosocial adjustment, family

  1. 75 FR 66038 - Planning Resource Adequacy Assessment Reliability Standard

    Science.gov (United States)

    2010-10-27

    ... adequacy of specific loads (customer demand and energy requirements) within a Planning Authority Area... risks regarding the capability to balance resources and demand in a planning timeframe. Acknowledging...] Planning Resource Adequacy Assessment Reliability Standard Issued October 21, 2010. AGENCY: Federal Energy...

  2. Assessment of non-standard HIV antiretroviral therapy regimens at ...

    African Journals Online (AJOL)

    2016-03-06

    Mar 6, 2016 ... guidelines for children and not adults. Discussion. Less than 1% of the 17,000 patients receiving ART for treatment of HIV at Lighthouse Trust in 2012 were being treated with NS-ART, signifying a strong adherence to standardized regimens by clinicians. Assessing the reasons for use of NS-ART is essential ...

  3. Selective experimental review of the Standard Model

    International Nuclear Information System (INIS)

    Bloom, E.D.

    1985-02-01

    Before discussing experimental comparisons with the Standard Model (S-M), it is probably wise to define more completely what is commonly meant by this popular term. This model is a gauge theory of SU(3)_c x SU(2)_L x U(1) with 18 parameters. The parameters are α_s, α_qed, θ_W, M_W (M_Z = M_W/cos θ_W, and thus is not an independent parameter), M_Higgs; the lepton masses, M_e, M_μ, M_τ; the quark masses, M_d, M_s, M_b, and M_u, M_c, M_t; and finally, the quark mixing angles, θ₁, θ₂, θ₃, and the CP violating phase δ. The latter four parameters appear in the quark mixing matrix for the Kobayashi-Maskawa and Maiani forms. Clearly, the present S-M covers an enormous range of physics topics, and the author can only lightly cover a few such topics in this report. The measurement of R_hadron is fundamental as a test of the running coupling constant α_s in QCD. The author will discuss a selection of recent precision measurements of R_hadron, as well as some other techniques for measuring α_s. QCD also requires the self interaction of gluons. The search for the three gluon vertex may be practically realized in the clear identification of gluonic mesons. The author will present a limited review of recent progress in the attempt to untangle such mesons from the plethora of q q̄ states of the same quantum numbers which exist in the same mass range. The electroweak interactions provide some of the strongest evidence supporting the S-M that exists. Given the recent progress in this subfield, and particularly with the discovery of the W and Z bosons at CERN, many recent reviews obviate the need for further discussion in this report. In attempting to validate a theory, one frequently searches for new phenomena which would clearly invalidate it. 49 references, 28 figures

  4. Overuse Injury Assessment Model

    National Research Council Canada - National Science Library

    Stuhmiller, James H; Shen, Weixin; Sih, Bryant

    2005-01-01

    .... Previously, we developed a preliminary model that predicted the stress fracture rate and used biomechanical modeling, nonlinear optimization for muscle force, and bone structural analysis to estimate...

  5. Prediction of Phase Behavior of Spray-Dried Amorphous Solid Dispersions: Assessment of Thermodynamic Models, Standard Screening Methods and a Novel Atomization Screening Device with Regard to Prediction Accuracy

    Directory of Open Access Journals (Sweden)

    Aymeric Ousset

    2018-03-01

    Full Text Available The evaluation of drug–polymer miscibility in the early phase of drug development is essential to ensure successful amorphous solid dispersion (ASD) manufacturing. This work investigates the comparison of thermodynamic models, conventional experimental screening methods (solvent casting, quench cooling), and a novel atomization screening device based on their ability to predict drug–polymer miscibility, solid state properties (Tg value and width), and adequate polymer selection during the development of spray-dried amorphous solid dispersions (SDASDs). Binary ASDs of four drugs and seven polymers were produced at 20:80, 40:60, 60:40, and 80:20 (w/w). Samples were systematically analyzed using modulated differential scanning calorimetry (mDSC) and X-ray powder diffraction (XRPD). Principal component analysis (PCA) was used to qualitatively assess the predictability of screening methods with regards to SDASD development. Poor correlation was found between theoretical models and experimentally-obtained results. Additionally, the limited ability of usual screening methods to predict the miscibility of SDASDs did not guarantee the appropriate selection of lead excipient for the manufacturing of robust SDASDs. Contrary to standard approaches, our novel screening device allowed the selection of optimal polymer and drug loading and established insight into the final properties and performance of SDASDs at an early stage, therefore enabling the optimization of the scaled-up late-stage development.

  6. Prediction of Phase Behavior of Spray-Dried Amorphous Solid Dispersions: Assessment of Thermodynamic Models, Standard Screening Methods and a Novel Atomization Screening Device with Regard to Prediction Accuracy.

    Science.gov (United States)

    Ousset, Aymeric; Chavez, Pierre-François; Meeus, Joke; Robin, Florent; Schubert, Martin Alexander; Somville, Pascal; Dodou, Kalliopi

    2018-03-07

    The evaluation of drug-polymer miscibility in the early phase of drug development is essential to ensure successful amorphous solid dispersion (ASD) manufacturing. This work investigates the comparison of thermodynamic models, conventional experimental screening methods (solvent casting, quench cooling), and a novel atomization screening device based on their ability to predict drug-polymer miscibility, solid state properties (Tg value and width), and adequate polymer selection during the development of spray-dried amorphous solid dispersions (SDASDs). Binary ASDs of four drugs and seven polymers were produced at 20:80, 40:60, 60:40, and 80:20 (w/w). Samples were systematically analyzed using modulated differential scanning calorimetry (mDSC) and X-ray powder diffraction (XRPD). Principal component analysis (PCA) was used to qualitatively assess the predictability of screening methods with regards to SDASD development. Poor correlation was found between theoretical models and experimentally-obtained results. Additionally, the limited ability of usual screening methods to predict the miscibility of SDASDs did not guarantee the appropriate selection of lead excipient for the manufacturing of robust SDASDs. Contrary to standard approaches, our novel screening device allowed the selection of optimal polymer and drug loading and established insight into the final properties and performance of SDASDs at an early stage, therefore enabling the optimization of the scaled-up late-stage development.
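The PCA step described in the two records above can be sketched as follows. The data here are hypothetical stand-ins for the drug-polymer miscibility descriptors, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))   # 8 hypothetical drug-polymer systems, 4 descriptors

Xc = X - X.mean(axis=0)       # center each descriptor column
# PCA via SVD of the centered data matrix.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = Xc @ Vt[:2].T                    # coordinates on the first two PCs
explained = S[:2] ** 2 / (S ** 2).sum()   # fraction of variance per PC
print(scores.shape, explained)
```

Plotting the two score columns against each other gives the kind of qualitative map of screening-method agreement the record describes.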

  7. Standards for psychological assessment of nuclear facility personnel. Technical report

    International Nuclear Information System (INIS)

    Frank, F.D.; Lindley, B.S.; Cohen, R.A.

    1981-07-01

    The subject of this study was the development of standards for the assessment of emotional instability in applicants for nuclear facility positions. The investigation covered all positions associated with a nuclear facility. Conclusions reached in this investigation focused on the ingredients of an integrated selection system including the use of personality tests, situational simulations, and the clinical interview; the need for professional standards to ensure quality control; the need for a uniform selection system as organizations vary considerably in terms of instruments presently used; and the need for an on-the-job behavioral observation program

  8. Assessment of resveratrol, apocynin and taurine on mechanical-metabolic uncoupling and oxidative stress in a mouse model of duchenne muscular dystrophy: A comparison with the gold standard, α-methyl prednisolone.

    Science.gov (United States)

    Capogrosso, Roberta Francesca; Cozzoli, Anna; Mantuano, Paola; Camerino, Giulia Maria; Massari, Ada Maria; Sblendorio, Valeriana Teresa; De Bellis, Michela; Tamma, Roberto; Giustino, Arcangela; Nico, Beatrice; Montagnani, Monica; De Luca, Annamaria

    2016-04-01

    Antioxidants have great potential as adjuvant therapeutics in patients with Duchenne muscular dystrophy (DMD), although systematic comparisons at the pre-clinical level are limited. The present study is a head-to-head assessment, in the exercised mdx mouse model of DMD, of the natural compounds resveratrol and apocynin and of the amino acid taurine, in comparison with the gold standard α-methyl prednisolone (PDN). The rationale was to target the overproduction of reactive oxygen species (ROS) via disease-related pathways that are worsened by mechanical-metabolic impairment, such as inflammation and over-activity of NADPH oxidase (NOX) (taurine and apocynin, respectively), or the failing ROS detoxification mechanisms via sirtuin-1 (SIRT1)-peroxisome proliferator-activated receptor γ coactivator 1α (PGC-1α) (resveratrol). Resveratrol (100 mg/kg i.p. 5 days/week), apocynin (38 mg/kg/day per os), taurine (1 g/kg/day per os), and PDN (1 mg/kg i.p., 5 days/week) were administered for 4-5 weeks to mdx mice in parallel with a standard protocol of treadmill exercise, and the outcome was evaluated with a multidisciplinary approach in vivo and ex vivo on pathology-related end-points and biomarkers of oxidative stress. Resveratrol ≥ taurine > apocynin enhanced in vivo mouse force similarly to PDN. All the compounds reduced the production of superoxide anion, assessed by dihydroethidium staining, with apocynin being as effective as PDN, and ameliorated electrophysiological biomarkers of oxidative stress. Resveratrol also significantly reduced plasma levels of creatine kinase and lactate dehydrogenase. Force of isolated muscles was only slightly ameliorated. However, the three compounds improved the histopathology of gastrocnemius muscle more than PDN. Taurine > apocynin > PDN significantly decreased activated NF-kB positive myofibers. Thus, compounds targeting NOX-ROS or SIRT1/PGC-1α pathways differently modulate clinically relevant DMD-related endpoints according to their mechanism of action. With the

  9. Comparison of cosmological models using standard rulers and candles

    OpenAIRE

    Li, Xiaolei; Cao, Shuo; Zheng, Xiaogang; Li, Song; Biesiada, Marek

    2015-01-01

    In this paper, we used standard rulers and standard candles (separately and jointly) to explore five popular dark energy models under the assumption of spatial flatness of the Universe. As standard rulers, we used a data set comprising 118 galactic-scale strong lensing systems (individual standard rulers if properly calibrated for the mass density profile) combined with BAO diagnostics (statistical standard ruler). Supernovae Ia served as standard candles. Unlike in most of previous statistica...

  10. Neutrinos: in and out of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen; /Fermilab

    2006-07-01

    The particle physics Standard Model has been tremendously successful in predicting the outcome of a large number of experiments. In this model neutrinos are massless. Yet recent evidence points to the fact that neutrinos are massive particles with tiny masses compared to the other particles in the Standard Model. These tiny masses allow the neutrinos to change flavor and oscillate. In this series of lectures, I will review the properties of neutrinos in the Standard Model and then discuss the physics of neutrinos beyond the Standard Model. Topics to be covered include neutrino flavor transformations and oscillations, Majorana versus Dirac neutrino masses, the seesaw mechanism, and leptogenesis.

  11. Standardizing Physiologic Assessment Data to Enable Big Data Analytics.

    Science.gov (United States)

    Matney, Susan A; Settergren, Theresa Tess; Carrington, Jane M; Richesson, Rachel L; Sheide, Amy; Westra, Bonnie L

    2016-07-18

    Disparate data must be represented in a common format to enable comparison across multiple institutions and facilitate big data science. Nursing assessments represent a rich source of information. However, a lack of agreement regarding essential concepts and standardized terminology prevents their use for big data science in the current state. The purpose of this study was to align a minimum set of physiological nursing assessment data elements with national standardized coding systems. Six institutions shared their 100 most common electronic health record nursing assessment data elements. From these, a set of distinct elements was mapped to nationally recognized Logical Observation Identifiers Names and Codes (LOINC®) and Systematized Nomenclature of Medicine-Clinical Terms (SNOMED CT®) standards. We identified 137 observation names (55% new to LOINC), and 348 observation values (20% new to SNOMED CT) organized into 16 panels (72% new to LOINC). This reference set can support the exchange of nursing information, facilitate multi-site research, and provide a framework for nursing data analysis. © The Author(s) 2016.
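
    One way to picture the mapping is a small record type pairing a local element with its standard codes; the codes below are placeholders, not actual LOINC or SNOMED CT assignments from the study:

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class MappedObservation:
        local_name: str    # institution's local element name
        loinc_code: str    # LOINC code naming the observation (placeholder value)
        snomed_value: str  # SNOMED CT code for the observed value (placeholder)

    obs = MappedObservation(
        local_name="Heart rhythm",
        loinc_code="1234-5",       # hypothetical code, for illustration only
        snomed_value="123456789",  # hypothetical code, for illustration only
    )
    print(obs)
    ```

    A shared reference set of such records is what lets assessments from different institutions be compared field by field.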

  12. Relationship Between Faculty and Standardized Patient Assessment Scores of Podiatric Medical Students During a Standardized Performance Assessment Laboratory.

    Science.gov (United States)

    Mahoney, James M; Vardaxis, Vassilios; Anwar, Noreen; Hagenbucher, Jacob

    2016-03-01

    Direct assessment of health professional student performance of clinical skills can be accurately performed in the standardized performance assessment laboratory (SPAL), typically by health professional faculty. However, owing to time and economic considerations, nonmedical individuals have been specially trained to perform the same function (standardized patients [SPs]). This study compared the assessment scores of the history and physical examination components of a SPAL designed for second-year podiatric medical students at Des Moines University (DMU) by a podiatry medical faculty member and SPs. A total of 101 students from the classes of 2015 and 2016 were evaluated in 2013 and 2014 by 11 to 13 SPs from the DMU SPAL program. The video recordings of these 101 students were then evaluated by one faculty member from the College of Podiatric Medicine and Surgery at DMU. The Pearson correlation coefficient for each class showed a strong linear relationship between SP and faculty assessment scores. The associations between SP and faculty assessment scores in the history, physical examination, and combined history and physical examination components for the 2016 class (0.706, 0.925, and 0.911, respectively) were found to be stronger than those for the 2015 class (0.697, 0.791, and 0.791, respectively). This study indicated that there are strong associations between the assessment scores of trained SPs and faculty for the history, physical examination, and combined history and physical examination components of second-year SPAL activity for podiatric medical students.
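
    The Pearson coefficients reported above can be reproduced with a short routine; the score pairs here are invented for illustration:

    ```python
    import math

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length score lists."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    sp_scores = [72, 85, 90, 64, 78]       # hypothetical SP ratings
    faculty_scores = [70, 88, 86, 60, 80]  # hypothetical faculty ratings
    r = pearson(sp_scores, faculty_scores)
    print(round(r, 3))
    ```

    Values near 1, like the study's 0.791-0.925 for the physical examination component, indicate that the two raters order students almost identically.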

  13. Prospects of experimentally reachable beyond Standard Model ...

    Indian Academy of Sciences (India)

    2016-01-06

    Jan 6, 2016 ... The behaviour of the newly discovered particles and their strange interactions during the first half of the 20th century culminated in the introduction of the Standard ... various limitations. For a good summary of its excellencies and compulsions see [1], and for extensive details on the SM and beyond, see [2].

  14. Why supersymmetry? Physics beyond the standard model

    Indian Academy of Sciences (India)

    2016-08-23

    Aug 23, 2016 ... This leads to an estimate of the naturalness breakdown scale for the electroweak theory as: N ∼ 1 TeV. 3. .... For supersymmetric model building, see ref. [10]. Simplest supersymmetric model is ... gent restrictions for supersymmetry model building come from the requirement of sufficient suppression.

  15. Effects of tailored neck-shoulder pain treatment based on a decision model guided by clinical assessments and standardized functional tests. A study protocol of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Björklund Martin

    2012-05-01

    Full Text Available Abstract Background A major problem with rehabilitation interventions for neck pain is that the condition may have multiple causes, thus a single treatment approach is seldom efficient. The present study protocol outlines a single-blinded randomised controlled trial evaluating the effect of tailored treatment for neck-shoulder pain. The treatment is based on a decision model guided by standardized clinical assessment and functional tests with cut-off values. Our main hypothesis is that the tailored treatment has better short-, intermediate- and long-term effects than either non-tailored treatment or treatment-as-usual (TAU) on pain and function. We subsequently hypothesize that tailored and non-tailored treatment both have better effect than TAU. Methods/Design 120 working women aged 20–65 with a minimum of six weeks of nonspecific neck-shoulder pain are allocated by minimisation, with the factors age, duration of pain, pain intensity and disability, into the groups tailored treatment (T), non-tailored treatment (NT) or treatment-as-usual (TAU). Treatment is given to the T and NT groups for 11 weeks (27 sessions evenly distributed). An extensive presentation of the tests and treatment decision model is provided. The main treatment components are manual therapy, cranio-cervical flexion exercise, strength training, EMG-biofeedback training, treatment for cervicogenic headache, and neck motor control training. A decision algorithm based on the baseline assessment determines the treatment components given to each participant of the T- and NT-groups. Primary outcome measures are physical functioning (Neck Disability Index) and average pain intensity last week (Numeric Rating Scale). Secondary outcomes are general improvement (Patient Global Impression of Change scale), symptoms (Profile Fitness Mapping neck questionnaire), capacity to work in the last 6 weeks (quality and quantity) and pressure pain threshold of m. trapezius. Primary and secondary outcomes will
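
    Allocation by minimisation over the trial's balancing factors can be sketched as follows; the factor levels and the marginal-imbalance rule are simplifying assumptions, not the trial's exact algorithm:

    ```python
    import random

    GROUPS = ["T", "NT", "TAU"]

    def minimise(participants, new_factors, rng=random.Random(1)):
        """Pick the group that minimises marginal imbalance across factor levels."""
        def imbalance(group):
            total = 0
            for factor, level in new_factors.items():
                counts = {g: 0 for g in GROUPS}
                for p in participants:
                    if p["factors"][factor] == level:
                        counts[p["group"]] += 1
                counts[group] += 1  # tentative assignment of the new participant
                total += max(counts.values()) - min(counts.values())
            return total
        scores = {g: imbalance(g) for g in GROUPS}
        best = min(scores.values())
        return rng.choice([g for g, s in scores.items() if s == best])

    participants = []
    for i in range(12):
        factors = {"age": i % 2, "duration": (i // 2) % 2,
                   "intensity": i % 3, "disability": (i // 3) % 2}
        participants.append({"group": minimise(participants, factors),
                             "factors": factors})
    print([p["group"] for p in participants])
    ```

    Unlike simple randomisation, this keeps the three arms balanced on each prognostic factor even with only 120 participants.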

  16. Heterogeneous information network model for equipment-standard system

    Science.gov (United States)

    Yin, Liang; Shi, Li-Chen; Zhao, Jun-Yan; Du, Song-Yang; Xie, Wen-Bo; Yuan, Fei; Chen, Duan-Bing

    2018-01-01

    Entity information networks are used to describe structural relationships between entities. Taking advantage of their extensibility and heterogeneity, entity information networks are more and more widely applied to relationship modeling. In recent years, many studies of entity information network modeling have been proposed, but few of them concentrate on equipment-standard systems, which have the properties of multiple layers, dimensions and scales. In order to efficiently deal with complex issues in equipment-standard systems, such as standard revising, standard controlling, and production designing, a heterogeneous information network model for the equipment-standard system is proposed in this paper. Three types of entities and six types of relationships are considered in the proposed model. Correspondingly, several different similarity-measuring methods are used in the modeling process. The experiments show that the heterogeneous information network model established in this paper reflects relationships between entities accurately. Meanwhile, the modeling process has good performance in terms of time consumption.
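
    A toy version of such a network, with typed entities and typed relationships plus one possible similarity measure (shared-neighbour Jaccard, assumed here for illustration), can be written as:

    ```python
    from collections import defaultdict

    # Hypothetical entity and relationship types; the paper's actual three entity
    # types and six relationship types are not spelled out in this record.
    nodes = {"E1": "equipment", "E2": "equipment",
             "S1": "standard", "S2": "standard", "O1": "organisation"}
    edges = [("E1", "cites", "S1"), ("E1", "cites", "S2"),
             ("E2", "cites", "S2"), ("O1", "issues", "S1")]

    neighbours = defaultdict(set)
    for src, _rel, dst in edges:
        neighbours[src].add(dst)
        neighbours[dst].add(src)

    def jaccard(a, b):
        """Similarity of two entities via their shared neighbours."""
        na, nb = neighbours[a], neighbours[b]
        return len(na & nb) / len(na | nb) if (na | nb) else 0.0

    print(jaccard("E1", "E2"))  # E1 and E2 share standard S2
    ```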

  17. Working group report: Beyond the standard model

    Indian Academy of Sciences (India)

    Superstring-inspired phenomenology: This included. – models of low-scale quantum gravity with one or more extra dimensions,. – noncommutative geometry and gauge theories,. – string-inspired grand unification. • Models of supersymmetry-breaking: This included. – Supersymmetry-breaking in minimal supergravity ...

  18. Towards a quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; van Soest, J.

    2012-01-01

    This research focuses on developing a quality model for semantic information system (IS) standards. A lot of semantic IS standards are available in different industries. Often these standards are developed by a dedicated organisation. While these organisations have the goal of increasing

  19. Towards a quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; van Soest, Joris

    2011-01-01

    This research focuses on developing a quality model for semantic Information System (IS) standards. A lot of semantic IS standards are available in different industries. Often these standards are developed by a dedicated organization. While these organizations have the goal of increasing

  20. Overuse Injury Assessment Model

    Science.gov (United States)

    2006-03-01

    bones in celiac disease patients." Am J Gastroenterol 98(2): 382-390. Ferretti, J. L. 1997. "Noninvasive Assessment of Bone Architecture and..." Model factors include: altered gait, pretest anthropometry, diet and nutrition, genetics, endocrine status and hormones, bone disease (pathology), age, and initial bone health state. ...and between subjects can be expected. Consequently, large numbers of subjects are required to obtain statistically significant results. Even with

  1. Standardized training in nurse model travel clinics.

    Science.gov (United States)

    Sofarelli, Theresa A; Ricks, Jane H; Anand, Rahul; Hale, Devon C

    2011-01-01

    International travel plays a significant role in the emergence and redistribution of major human diseases. The importance of travel medicine clinics for preventing morbidity and mortality has been increasingly appreciated, although few studies have thus far examined the management and staff training strategies that result in successful travel-clinic operations. Here, we describe an example of travel-clinic operation and management coordinated through the University of Utah School of Medicine, Division of Infectious Diseases. This program, which involves eight separate clinics distributed statewide, functions both to provide patient consult and care services, as well as medical provider training and continuing medical education (CME). Initial training, the use of standardized forms and protocols, routine chart reviews and monthly continuing education meetings are the distinguishing attributes of this program. An Infectious Disease team consisting of one medical doctor (MD) and a physician assistant (PA) act as consultants to travel nurses who comprise the majority of clinic staff. Eight clinics distributed throughout the state of Utah serve approximately 6,000 travelers a year. Pre-travel medical services are provided by 11 nurses, including 10 registered nurses (RNs) and 1 licensed practical nurse (LPN). This trained nursing staff receives continuing travel medical education and participates in the training of new providers. All nurses have completed a full training program, and 7 of the 11 (64%) clinic nursing staff serve more than 10 patients a week. Quality assurance measures show that approximately 0.5% of charts reviewed contain a vaccine or prescription error that requires patient notification for correction. Using an initial training program, standardized patient intake forms, vaccine and prescription protocols, preprinted prescriptions, and regular CME, highly trained nurses at travel clinics are able to provide standardized pre-travel care to

  2. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  3. Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies.

    Directory of Open Access Journals (Sweden)

    Zachary R Caldwell

    Full Text Available Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, or some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal) and region of study, but was related to the researcher's home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data.

  4. The Impact of Statistical Adjustment on Conditional Standard Errors of Measurement in the Assessment of Physician Communication Skills

    Science.gov (United States)

    Raymond, Mark R.; Clauser, Brian E.; Furman, Gail E.

    2010-01-01

    The use of standardized patients to assess communication skills is now an essential part of assessing a physician's readiness for practice. To improve the reliability of communication scores, it has become increasingly common in recent years to use statistical models to adjust ratings provided by standardized patients. This study employed ordinary…

  5. The thermal evolution of universe: standard model

    International Nuclear Information System (INIS)

    Nascimento, L.C.S. do.

    1975-08-01

    A description of the dynamical evolution of the Universe is given, following a model based on the theory of General Relativity. The model admits the Cosmological Principle, the Principle of Equivalence and the Robertson-Walker metric (of which an original derivation is presented). In this model, the universe is treated as a perfect fluid, ideal and symmetric with respect to the number of particles and antiparticles. The thermodynamic relations deriving from these hypotheses are obtained, and from them the several eras of the thermal evolution of the universe are established. Finally, the problems arising from certain specific predictions of the model are studied, and the predicted abundances of the elements according to nucleosynthesis and the present behavior of the universe are analysed in detail. (author) [pt
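
    The Robertson-Walker metric on which the model rests can be written, in standard notation with curvature parameter k taking the values -1, 0, +1 and scale factor a(t):

    ```latex
    ds^{2} = -c^{2}\,dt^{2} + a^{2}(t)\left[\frac{dr^{2}}{1-kr^{2}}
           + r^{2}\left(d\theta^{2} + \sin^{2}\theta\,d\phi^{2}\right)\right]
    ```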

  6. Toward a Standard Model of Core Collapse Supernovae

    OpenAIRE

    Mezzacappa, A.

    2000-01-01

    In this paper, we discuss the current status of core collapse supernova models and the future developments needed to achieve significant advances in understanding the supernova mechanism and supernova phenomenology, i.e., in developing a supernova standard model.

  7. Positive animal welfare states and reference standards for welfare assessment.

    Science.gov (United States)

    Mellor, D J

    2015-01-01

    Developments in affective neuroscience and behavioural science during the last 10-15 years have together made it increasingly apparent that sentient animals are potentially much more sensitive to their environmental and social circumstances than was previously thought to be the case. It therefore seems likely that both the range and magnitude of welfare trade-offs that occur when animals are managed for human purposes have been underestimated even when minimalistic but arguably well-intentioned attempts have been made to maintain high levels of welfare. In light of these neuroscience-supported behaviour-based insights, the present review considers the extent to which the use of currently available reference standards might draw attention to these previously neglected areas of concern. It is concluded that the natural living orientation cannot provide an all-embracing or definitive welfare benchmark because of its primary focus on behavioural freedom. However assessments of this type, supported by neuroscience insights into behavioural motivation, may now carry greater weight when used to identify management practices that should be avoided, discontinued or substantially modified. Using currently accepted baseline standards as welfare reference points may result in small changes being accorded greater significance than would be the case if they were compared with higher standards, and this could slow the progress towards better levels of welfare. On the other hand, using "what animals want" as a reference standard has the appeal of focusing on the specific resources or conditions the animals would choose themselves and can potentially improve their welfare more quickly than the approach of making small increments above baseline standards. It is concluded that the cautious use of these approaches in different combinations could lead to recommendations that would more effectively promote positive welfare states in hitherto neglected areas of concern.

  8. Integrated Environmental Assessment Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Guardanz, R.; Gimeno, B. S.; Bermejo, V.; Elvira, S.; Martin, F.; Palacios, M.; Rodriguez, E.; Donaire, I. [Ciemat, Madrid (Spain)

    2000-07-01

    This report describes the results of the Spanish participation in the project Coupling CORINAIR data to cost-effect emission reduction strategies based on critical threshold (EU/LIFE97/ENV/FIN/336). The subproject focused on three tasks: developing tools to improve knowledge of the spatial and temporal details of emissions of air pollutants in Spain; exploiting existing experimental information on plant response to air pollutants in temperate ecosystems; and integrating these findings in a modelling framework that can assess with more accuracy the impact of air pollutants on temperate ecosystems. The results obtained during the execution of this project have significantly improved the models of the impact of alternative emission control strategies on ecosystems and crops in the Iberian Peninsula. (Author) 375 refs.

  9. Basic Laparoscopic Skills Assessment Study: Validation and Standard Setting among Canadian Urology Trainees.

    Science.gov (United States)

    Lee, Jason Y; Andonian, Sero; Pace, Kenneth T; Grober, Ethan

    2017-06-01

    As urology training programs move to a competency-based medical education model, iterative assessments with objective standards will be required. To develop a valid set of technical skills standards, we initiated a national skills assessment study focusing initially on laparoscopic skills. Between February 2014 and March 2016, the basic laparoscopic skill of Canadian urology trainees and attending urologists was assessed using 4 standardized tasks from the AUA (American Urological Association) BLUS (Basic Laparoscopic Urological Surgery) curriculum, including peg transfer, pattern cutting, suturing and knot tying, and vascular clip applying. All performances were video recorded and assessed using 3 methods: time and error based scoring, expert global rating scores, and C-SATS (Crowd-Sourced Assessments of Technical Skill Global Rating Scale), a novel crowd-sourced assessment platform. Different methods of standard setting were used to develop pass-fail cut points. Six attending urologists and 99 trainees completed testing. Reported laparoscopic experience and training level correlated with performance, and standard setting methods were used to define pass-fail cut points for all 4 AUA BLUS tasks. The 4 AUA BLUS tasks demonstrated good construct validity evidence for use in assessing basic laparoscopic skill. Performance scores using the novel C-SATS platform correlated well with traditional time-consuming methods of assessment. Various standard setting methods were used to develop pass-fail cut points for educators to use when making formative and summative assessments of basic laparoscopic skill. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
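
    Standard-setting computations of the kind referred to above are simple to state; the two shown here (contrasting groups, and mean-minus-one-SD of a reference group) use invented score distributions, not the study's data:

    ```python
    from statistics import mean, stdev

    novice_scores = [41, 48, 52, 55, 58, 60, 63]  # hypothetical trainee scores
    expert_scores = [68, 72, 75, 78, 80, 84, 88]  # hypothetical attending scores

    # Contrasting-groups method: cut point midway between the two group means.
    contrasting_cut = (mean(novice_scores) + mean(expert_scores)) / 2

    # Reference-group method: expert mean minus one standard deviation.
    reference_cut = mean(expert_scores) - stdev(expert_scores)

    print(round(contrasting_cut, 1), round(reference_cut, 1))
    ```

    Different methods yield different cut points, which is why several were compared before recommending pass-fail standards.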

  10. Assessment of the Japanese Energy Efficiency Standards Program

    Directory of Open Access Journals (Sweden)

    Jun Arakawa

    2015-03-01

    Full Text Available The Japanese energy efficiency standards program for appliances is a unique program that sets and revises mandatory standards based on the products with the highest energy efficiency on the market. This study assessed the cost-effectiveness of the standard settings for the air conditioner, a major residential appliance and typical example in the program. Based on analyses of empirical data, the net costs and effects from 1999 to 2040 were estimated. When applying a discount rate of 3%, the cost of abating CO2 emissions realized through the considered standards was estimated to be -13700 JPY/t-CO2. The sensitivity analysis, however, showed that the cost turns positive at a discount rate of 26% or higher. The authors also revealed that the standards' "excellent" cost-effectiveness largely depends on that of the 1st standard setting, and the CO2 abatement cost through the 2nd standard was estimated to be as high as 26800 JPY/t-CO2. The results imply that the government needs to be careful about the possible economic burden imposed when considering introducing new, additional standards.
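
    The sign flip of the abatement cost with the discount rate can be illustrated with a minimal net-present-value calculation; all figures below are invented, not the study's:

    ```python
    def abatement_cost(extra_cost, annual_saving, annual_abatement_t, years, rate):
        """JPY per tonne CO2; negative means the standard pays for itself."""
        npv_saving = sum(annual_saving / (1 + rate) ** t
                         for t in range(1, years + 1))
        return (extra_cost - npv_saving) / (annual_abatement_t * years)

    # Hypothetical air conditioner: 30000 JPY extra purchase cost, 4000 JPY/year
    # electricity savings, 0.1 t-CO2/year abated over a 15-year lifetime.
    low = abatement_cost(30000, 4000, 0.1, 15, 0.03)   # 3% discount rate
    high = abatement_cost(30000, 4000, 0.1, 15, 0.30)  # 30% discount rate
    print(round(low), round(high))
    ```

    At a low discount rate the discounted savings exceed the extra cost, so the abatement cost is negative; at a high enough rate it turns positive, as the study found at 26%.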

  11. Standard Model-like corrections to Dilatonic Dynamics

    DEFF Research Database (Denmark)

    Antipin, Oleg; Krog, Jens; Mølgaard, Esben

    2013-01-01

    We examine the effects of standard model-like interactions on the near-conformal dynamics of a theory featuring a dilatonic state identified with the standard model-like Higgs. As a template for near-conformal dynamics we use a gauge theory with fermionic matter and elementary mesons possessing ... conformal dynamics could accommodate the observed Higgs-like properties...

  12. Can An Amended Standard Model Account For Cold Dark Matter?

    International Nuclear Information System (INIS)

    Goldhaber, Maurice

    2004-01-01

    It is generally believed that one has to invoke theories beyond the Standard Model to account for cold dark matter particles. However, there may be undiscovered universal interactions that, if added to the Standard Model, would lead to new members of the three generations of elementary fermions that might be candidates for cold dark matter particles.

  13. The Standard Model from LHC to future colliders

    Energy Technology Data Exchange (ETDEWEB)

    Forte, S., E-mail: forte@mi.infn.it [Dipartimento di Fisica, Università di Milano, Via Celoria 16, 20133, Milan (Italy); INFN, Sezione di Milano, Via Celoria 16, 20133, Milan (Italy); Nisati, A. [INFN, Sezione di Roma, Piazzale Aldo Moro 2, 00185, Rome (Italy); Passarino, G. [Dipartimento di Fisica, Università di Torino, Via P. Giuria 1, 10125, Turin (Italy); INFN, Sezione di Torino, Via P. Giuria 1, 10125, Turin (Italy); Tenchini, R. [INFN, Sezione di Pisa, Largo B. Pontecorvo 3, 56127, Pisa (Italy); Calame, C. M. Carloni [Dipartimento di Fisica, Università di Pavia, via Bassi 6, 27100, Pavia (Italy); Chiesa, M. [INFN, Sezione di Pavia, via Bassi 6, 27100, Pavia (Italy); Cobal, M. [Dipartimento di Chimica, Fisica e Ambiente, Università di Udine, Via delle Scienze, 206, 33100, Udine (Italy); INFN, Gruppo Collegato di Udine, Via delle Scienze, 206, 33100, Udine (Italy); Corcella, G. [INFN, Laboratori Nazionali di Frascati, Via E. Fermi 40, 00044, Frascati (Italy); Degrassi, G. [Dipartimento di Matematica e Fisica, Università’ Roma Tre, Via della Vasca Navale 84, 00146, Rome (Italy); INFN, Sezione di Roma Tre, Via della Vasca Navale 84, 00146, Rome (Italy); Ferrera, G. [Dipartimento di Fisica, Università di Milano, Via Celoria 16, 20133, Milan (Italy); INFN, Sezione di Milano, Via Celoria 16, 20133, Milan (Italy); Magnea, L. [Dipartimento di Fisica, Università di Torino, Via P. Giuria 1, 10125, Turin (Italy); INFN, Sezione di Torino, Via P. Giuria 1, 10125, Turin (Italy); Maltoni, F. [Centre for Cosmology, Particle Physics and Phenomenology (CP3), Université Catholique de Louvain, 1348, Louvain-la-Neuve (Belgium); Montagna, G. [Dipartimento di Fisica, Università di Pavia, via Bassi 6, 27100, Pavia (Italy); INFN, Sezione di Pavia, via Bassi 6, 27100, Pavia (Italy); Nason, P. [INFN, Sezione di Milano-Bicocca, Piazza della Scienza 3, 20126, Milan (Italy); Nicrosini, O. [INFN, Sezione di Pavia, via Bassi 6, 27100, Pavia (Italy); Oleari, C. 
[Dipartimento di Fisica, Università di Milano-Bicocca, Piazza della Scienza 3, 20126, Milan (Italy); INFN, Sezione di Milano-Bicocca, Piazza della Scienza 3, 20126, Milan (Italy); Piccinini, F. [INFN, Sezione di Pavia, via Bassi 6, 27100, Pavia (Italy); Riva, F. [Institut de Théorie des Phénoménes Physiques, École Polytechnique Fédérale de Lausanne, 1015, Lausanne (Switzerland); Vicini, A. [Dipartimento di Fisica, Università di Milano, Via Celoria 16, 20133, Milan (Italy); INFN, Sezione di Milano, Via Celoria 16, 20133, Milan (Italy)

    2015-11-25

    This review summarizes the results of the activities which have taken place in 2014 within the Standard Model Working Group of the “What Next” Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issue for Standard Model physics in the LHC era and in view of possible future accelerators.

  14. Neutrinos and Physics Beyond Electroweak and Cosmological Standard Models

    CERN Document Server

    Kirilova, Daniela

    2014-01-01

    This is a short review of the established neutrino characteristics and of those proposed by physics beyond the Standard Electroweak Model and beyond the Standard Cosmological Model. In particular, cosmological effects of, and cosmological constraints on, extra neutrino families, neutrino mass differences and mixing, lepton asymmetry in the neutrino sector, neutrino masses, and light sterile neutrinos are discussed.

  15. Prospects of experimentally reachable beyond Standard Model ...

    Indian Academy of Sciences (India)

    2016-01-06

    Dirac mass M_H = ±M + μ_S/2. As μ_S does not play much of a role in any other prediction, we assume that it fits the neutrino oscillation data, and one can determine it by inverting the inverse see-saw formula and using experimental results for neutrino masses and mixings. The model achieves precision gauge ...

  16. Standardization of A Physiologic Hypoparathyroidism Animal Model.

    Science.gov (United States)

    Jung, Soo Yeon; Kim, Ha Yeong; Park, Hae Sang; Yin, Xiang Yun; Chung, Sung Min; Kim, Han Su

    2016-01-01

    Ideal hypoparathyroidism animal models are a prerequisite to developing new treatment modalities for this disorder. The purpose of this study was to evaluate the feasibility of a model whereby rats were parathyroidectomized (PTX) using a fluorescent-identification method and the ideal calcium content of the diet was determined. Thirty male rats were divided into surgical sham (SHAM, n = 5) and PTX plus 0, 0.5, and 2% calcium diet groups (PTX-FC (n = 5), PTX-NC (n = 10), and PTX-HC (n = 10), respectively). Serum parathyroid hormone levels decreased to non-detectable levels in all PTX groups. All animals in the PTX-FC group died within 4 days after the operation. All animals survived when supplied calcium in the diet. However, serum calcium levels were higher in the PTX-HC than the SHAM group. The PTX-NC group demonstrated the most representative modeling of primary hypoparathyroidism. Serum calcium levels decreased and phosphorus levels increased, and bone volume was increased. All animals survived without further treatment and did not show nephrotoxicity including calcium deposits. These findings demonstrate that PTX animal models produced by using the fluorescent-identification method, and fed a 0.5% calcium diet, are appropriate for hypoparathyroidism treatment studies.

  17. Standardization of A Physiologic Hypoparathyroidism Animal Model.

    Directory of Open Access Journals (Sweden)

    Soo Yeon Jung

    Ideal hypoparathyroidism animal models are a prerequisite to developing new treatment modalities for this disorder. The purpose of this study was to evaluate the feasibility of a model whereby rats were parathyroidectomized (PTX) using a fluorescent-identification method and the ideal calcium content of the diet was determined. Thirty male rats were divided into surgical sham (SHAM, n = 5) and PTX plus 0, 0.5, and 2% calcium diet groups (PTX-FC (n = 5), PTX-NC (n = 10), and PTX-HC (n = 10), respectively). Serum parathyroid hormone levels decreased to non-detectable levels in all PTX groups. All animals in the PTX-FC group died within 4 days after the operation. All animals survived when supplied calcium in the diet. However, serum calcium levels were higher in the PTX-HC than the SHAM group. The PTX-NC group demonstrated the most representative modeling of primary hypoparathyroidism. Serum calcium levels decreased and phosphorus levels increased, and bone volume was increased. All animals survived without further treatment and did not show nephrotoxicity including calcium deposits. These findings demonstrate that PTX animal models produced by using the fluorescent-identification method, and fed a 0.5% calcium diet, are appropriate for hypoparathyroidism treatment studies.

  18. Electroweak symmetry breaking beyond the Standard Model

    Indian Academy of Sciences (India)

    In this paper, two key issues related to electroweak symmetry breaking are addressed. First, how fine-tuned are the different models that trigger this phenomenon? Second, even if a light Higgs boson exists, does it necessarily have to be elementary? After a brief introduction, the fine-tuning aspects of the MSSM, NMSSM, ...

  19. Big bang nucleosynthesis - The standard model and alternatives

    Science.gov (United States)

    Schramm, David N.

    1991-01-01

    The standard homogeneous-isotropic calculation of the big bang cosmological model is reviewed, and alternate models are discussed. The standard model is shown to agree with the light element abundances for He-4, H-2, He-3, and Li-7 that are available. Improved observational data from recent LEP collider and SLC results are discussed. The data agree with the standard model in terms of the number of neutrinos, and provide improved information regarding neutron lifetimes. Alternate models are reviewed which describe different scenarios for decaying matter or quark-hadron induced inhomogeneities. The baryonic density relative to the critical density in the alternate models is similar to that of the standard model when they are made to fit the abundances. This reinforces the conclusion that the baryonic density relative to critical density is about 0.06, and also reinforces the need for both nonbaryonic dark matter and dark baryonic matter.
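The abstract's statement that the light-element abundances constrain the number of neutrinos can be illustrated with a rough, commonly quoted linearization: each additional light neutrino species raises the primordial helium-4 mass fraction by roughly 0.013. The baseline value and sensitivity coefficient below are approximate illustrative figures, not the output of a full nucleosynthesis code.

```python
# Hedged, illustrative linearized sensitivity of the primordial helium-4
# mass fraction Y_p to the number of light neutrino species N_nu.
# y_base ~ 0.247 and dY/dN_nu ~ 0.013 are approximate figures of the kind
# quoted in BBN reviews; they are assumptions here, not computed results.

def helium_mass_fraction(n_nu, y_base=0.247, dY_dNnu=0.013):
    """Rough linear estimate of Y_p around the standard N_nu = 3."""
    return y_base + dY_dNnu * (n_nu - 3)

# Example: a fourth light neutrino species would shift Y_p upward by ~0.013.
shift = helium_mass_fraction(4) - helium_mass_fraction(3)
```

This is only the qualitative point made in the abstract: more neutrino species mean faster expansion during nucleosynthesis and hence more helium-4, which is why abundance data bound the neutrino count.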

  20. Self-assessment: Strategy for higher standards, consistency, and performance

    International Nuclear Information System (INIS)

    Ide, W.E.

    1996-01-01

    In late 1994, Palo Verde operations underwent a transformation from a unitized structure to a single functional unit. It was necessary to build consistency in watchstanding practices and create a shared mission. Because there was a lack of focus on actual plant operations and because personnel were deeply involved with administrative tasks, command and control of evolutions were weak. Improvement was needed. Consistent performance standards have been set for all three operating units. These expectations focus on nuclear, radiological, and industrial safety. Straightforward descriptions of watchstanding and monitoring practices have been provided to all department personnel. The desired professional and leadership qualities for employee conduct have been defined and communicated thoroughly. A healthy and competitive atmosphere developed with the successful implementation of these standards. Overall performance improved. The auxiliary operators demonstrated increased pride and ownership in the performance of their work activities. In addition, their morale improved. Crew teamwork improved as well as the quality of shift briefs. There was a decrease in the noise level and the administrative functions in the control room. The use of self-assessment helped to anchor and define higher and more consistent standards. The proof of Palo Verde's success was evident when an Institute of Nuclear Power Operations finding was turned into a strength within 1 yr

  1. Radiation protection standards: A practical exercise in risk assessment

    International Nuclear Information System (INIS)

    Clarke, Roger H.

    1992-01-01

    Within 12 months of the discovery of x-rays in 1895, it was reported that large doses of radiation were harmful to living human tissues. The first radiation protection standards were set to avoid the early effects of acute irradiation. By the 1950s, evidence was mounting for late somatic effects - mainly a small excess of cancers - in irradiated populations. In the late 1980s, sufficient human epidemiological data had been accumulated to allow a comprehensive assessment of carcinogenic radiation risks following the delivery of moderately high doses. Workers and the public are exposed to lower doses and dose-rates than the groups from whom good data are available, so that risks have had to be estimated for protection purposes. However, in the 1990s, some confirmation of these risk factors has been derived from occupationally exposed populations. If an estimate is made of the risk per unit dose, then in order to set dose limits, an unacceptable level of risk must be established for both workers and the public. There has been and continues to be a debate about the definitions of 'acceptable' and 'tolerable' and the attributing of numerical values to these definitions. This paper discusses the issues involved in the quantification of these terms and their application to setting dose limits on risk grounds. Conclusions are drawn about the present protection standards and the application of the methods to other fields of risk assessment. (author)
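The risk-per-unit-dose arithmetic described above can be sketched as follows; the 5%/Sv risk coefficient and the 20 mSv and 1 mSv annual doses are illustrative round numbers of the kind used in radiation protection, not authoritative regulatory values.

```python
# Hedged sketch of the arithmetic behind risk-based dose limits:
# excess risk = dose x risk coefficient. The coefficient below is an
# assumed nominal figure for illustration only.

RISK_PER_SIEVERT = 0.05  # assumed nominal lifetime risk coefficient (~5%/Sv)

def annual_risk(annual_dose_msv, risk_per_sv=RISK_PER_SIEVERT):
    """Estimated annual excess risk for a given annual dose in mSv."""
    return (annual_dose_msv / 1000.0) * risk_per_sv

worker_risk = annual_risk(20)  # e.g. a 20 mSv/yr occupational dose
public_risk = annual_risk(1)   # e.g. a 1 mSv/yr public dose
```

Setting a limit then amounts to choosing which of these computed risk levels is judged "tolerable" for workers and "acceptable" for the public, which is exactly the debate the abstract describes.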

  2. The Beyond the standard model working group: Summary report

    Energy Technology Data Exchange (ETDEWEB)

    G. Azuelos et al.

    2004-03-18

    dimensions. There, we considered: constraints on Kaluza Klein (KK) excitations of the SM gauge bosons from existing data (part XIII) and the corresponding projected LHC reach (part XIV); techniques for discovering and studying the radion field which is generic in most extra-dimensional scenarios (part XV); the impact of mixing between the radion and the Higgs sector, a fully generic possibility in extra-dimensional models (part XVI); production rates and signatures of universal extra dimensions at hadron colliders (part XVII); black hole production at hadron colliders, which would lead to truly spectacular events (part XVIII). The above contributions represent a tremendous amount of work on the part of the individuals involved and represent the state of the art for many of the currently most important phenomenological research avenues. Of course, much more remains to be done. For example, one should continue to work on assessing the extent to which the discovery reach will be extended if one goes beyond the LHC to the super-high-luminosity LHC (SLHC) or to a very large hadron collider (VLHC) with √s ≈ 40 TeV. Overall, we believe our work shows that the LHC and future hadronic colliders will play a pivotal role in the discovery and study of any kind of new physics beyond the Standard Model. They provide tremendous potential for incredibly exciting new discoveries.

  3. Development of the Test Of Astronomy STandards (TOAST) Assessment Instrument

    Science.gov (United States)

    Slater, Timothy F.; Slater, S. J.

    2008-05-01

    Considerable effort in the astronomy education research (AER) community over the past several years has focused on developing assessment tools in the form of multiple-choice conceptual diagnostics and content knowledge surveys. This has been critically important in advancing the AER discipline so that researchers could establish the initial knowledge state of students as well as to attempt to measure some of the impacts of innovative instructional interventions. Unfortunately, few of the existing instruments were constructed upon a solid list of clearly articulated and widely agreed upon learning objectives. This was not done in oversight, but rather as a result of the relative youth of AER as a discipline. Now that several important science education reform documents exist and are generally accepted by the AER community, we are in a position to develop, validate, and disseminate a new assessment instrument which is tightly aligned to the consensus learning goals stated by the American Astronomical Society - Chair's Conference on ASTRO 101, the American Association for the Advancement of Science's Project 2061 Benchmarks, and the National Research Council's National Science Education Standards. In response, researchers from the Cognition in Astronomy, Physics and Earth sciences Research (CAPER) Team at the University of Wyoming's Science & Math Teaching Center (UWYO SMTC) have designed a criterion-referenced assessment tool, called the Test Of Astronomy STandards (TOAST). Through iterative development, this instrument has a high degree of reliability and validity for instructors and researchers needing information on students’ initial knowledge state at the beginning of a course and can be used, in aggregate, to help measure the impact of course-length duration instructional strategies for courses with learning goals tightly aligned to the consensus goals of our community.

  4. Standard State Space Models of Unawareness (Extended Abstract)

    Directory of Open Access Journals (Sweden)

    Peter Fritz

    2016-06-01

    The impossibility theorem of Dekel, Lipman and Rustichini has been thought to demonstrate that standard state-space models cannot be used to represent unawareness. We first show that Dekel, Lipman and Rustichini do not establish this claim. We then distinguish three notions of awareness, and argue that although one of them may not be adequately modeled using standard state spaces, there is no reason to think that standard state spaces cannot provide models of the other two notions. In fact, standard state space models of these forms of awareness are attractively simple. They allow us to prove completeness and decidability results with ease, to carry over standard techniques from decision theory, and to add propositional quantifiers straightforwardly.

  5. Physics Beyond the Standard Model: Supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Nojiri, M.M.; /KEK, Tsukuba /Tsukuba, Graduate U. Adv. Studies /Tokyo U.; Plehn, T.; /Edinburgh U.; Polesello, G.; /INFN, Pavia; Alexander, John M.; /Edinburgh U.; Allanach, B.C.; /Cambridge U.; Barr, Alan J.; /Oxford U.; Benakli, K.; /Paris U., VI-VII; Boudjema, F.; /Annecy, LAPTH; Freitas, A.; /Zurich U.; Gwenlan, C.; /University Coll. London; Jager, S.; /CERN /LPSC, Grenoble

    2008-02-01

    This collection of studies on new physics at the LHC constitutes the report of the supersymmetry working group at the Workshop 'Physics at TeV Colliders', Les Houches, France, 2007. The studies cover the wide spectrum of phenomenology in the LHC era, from alternative models and signatures to the extraction of relevant observables, the study of the MSSM parameter space, and finally the interplay of LHC observations with additional data expected on a similar time scale. The special feature of this collection is that, while not every study was performed jointly by theoretical and experimental LHC physicists, all of them were inspired by and discussed in this particular environment.

  6. Emission standards versus immission standards for assessing the impact of urban drainage on ephemeral receiving water bodies.

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    In the past, emission standard indicators have been adopted by environmental regulation authorities in order to preserve the quality of a receiving water body. Such indicators are based on the frequency or magnitude of a polluted discharge, which may be continuous or intermittent. In order to properly maintain the quality of receiving waters, the Water Framework Directive, following the basic ideas of the British Urban Pollution Manual, has been established. The Directive has overtaken the emission-standard concept, substituting it with the stream-standard concept, which fixes discharge limits for each polluting substance depending on the self-depurative characteristics of receiving waters. Stream-standard assessment requires the deployment of measurement campaigns that can be very expensive; furthermore, the measurement campaigns are usually not able to provide a link between the receiving water quality and the polluting sources. Therefore, it would be very useful to find a correlation between the quality status of the natural waters and the emission-based indicators. Thus, this study is aimed at finding a possible connection between the receiving water quality indicators drawn up by environmental regulation authorities and emission-based indicators, while considering both continuous (i.e. from the wastewater treatment plants) and intermittent pollution discharges (mainly from combined sewer overflows). Such research has been carried out by means of long-term analysis adopting a holistic modelling approach. The different parts of the integrated urban drainage system were modelled by a parsimonious integrated model. The analysis was applied to an ephemeral river bounding Bologna (Italy). The study concluded that a correlation between receiving water quality and polluting emissions cannot be generally stated. Nevertheless, the study pointed out specific analyses of polluting emissions that highlight a cause-effect link between polluting sources and receiving water quality.
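A minimal sketch of the kind of emission-quality correlation test discussed above, using invented data (annual combined sewer overflow spill counts against a normalized water quality index); the indicator names and values are hypothetical.

```python
# Hypothetical check of whether an emission-based indicator correlates
# with a receiving-water quality indicator. All data are invented.

import statistics

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

spills = [4, 7, 2, 9, 5, 1]                 # emission indicator per year
quality = [0.8, 0.5, 0.9, 0.4, 0.6, 0.95]   # e.g. normalized quality index

r = pearson(spills, quality)  # strongly negative for these made-up data
```

For these invented series the correlation is strongly negative (more spills, worse quality); the study's point is precisely that such a clean relationship cannot be assumed in general and must be tested case by case.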

  7. ATLAS Searches for Beyond the Standard Model Higgs Bosons

    CERN Document Server

    Potter, C T

    2013-01-01

    The present status of ATLAS searches for Higgs bosons in extensions of the Standard Model (SM) is presented. This includes searches for the Higgs bosons of the Two-Higgs-Doublet Model (2HDM), the Minimal Supersymmetric Model (MSSM), the Next-to-Minimal Supersymmetric Model (NMSSM) and models with an invisibly decaying Higgs boson. A review of the phenomenology of the Higgs sectors of these models is given together with the search strategy and the resulting experimental constraints.

  8. An assessment model for quality management

    Science.gov (United States)

    Völcker, Chr.; Cass, A.; Dorling, A.; Zilioli, P.; Secchi, P.

    2002-07-01

    SYNSPACE together with InterSPICE and Alenia Spazio is developing an assessment method to determine the capability of an organisation in the area of quality management. The method, sponsored by the European Space Agency (ESA), is called S9kS (SPiCE-9000 for SPACE). S9kS is based on ISO 9001:2000 with additions from the quality standards issued by the European Committee for Space Standardization (ECSS) and ISO 15504 - Process Assessments. The result is a reference model that supports the expansion of the generic process assessment framework provided by ISO 15504 to non-software areas. In order to be compliant with ISO 15504, requirements from ISO 9001 and ECSS-Q-20 and Q-20-09 have been turned into process definitions in terms of Purpose and Outcomes, supported by a list of detailed indicators such as Practices, Work Products and Work Product Characteristics. In coordination with this project, the capability dimension of ISO 15504 has been revised to be consistent with ISO 9001. As the contributions from ISO 9001 and the space quality assurance standards are separable, the stripped-down version S9k offers organisations in all industries an assessment model based solely on ISO 9001, and is therefore of interest to all organisations which intend to improve their quality management system based on ISO 9001.

  9. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam-like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem also leads to a new insight on the mystery of the observed number of fundamental fermion generations.

  10. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Kevin M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Brennan T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Witt, Adam M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); DeNeale, Scott T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bevelhimer, Mark S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pries, Jason L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burress, Timothy A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kao, Shih-Chieh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mobley, Miles H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Kyutae [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Curd, Shelaine L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tsakiris, Achilleas [Univ. of Tennessee, Knoxville, TN (United States); Mooneyham, Christian [Univ. of Tennessee, Knoxville, TN (United States); Papanicolaou, Thanos [Univ. of Tennessee, Knoxville, TN (United States); Ekici, Kivanc [Univ. of Tennessee, Knoxville, TN (United States); Whisenant, Matthew J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Welch, Tim [US Department of Energy, Washington, DC (United States); Rabon, Daniel [US Department of Energy, Washington, DC (United States)

    2017-08-01

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  11. Assessing NHS trusts' compliance with child health policy standards.

    Science.gov (United States)

    Coles, Leslie; Glasper, Alan; Battrick, Cath; Brown, Sara

    An audit tool to undertake a baseline assessment of NHS trust compliance with contemporary healthcare policies was designed collaboratively by senior children's nurses across one English strategic health authority (SHA). Children's units in hospitals across the SHA were benchmarked against the audit tool standards throughout 2009. The aim was to identify good and suboptimal compliance with best practice policy-driven benchmarks of care, using a 1-5 scale. Each NHS trust within the SHA was contacted to make arrangements with members of the interprofessional team to complete the baseline benchmarking exercise. The audit was conducted over 1 or 2 days. The majority of the evidence sourced comprised documented evidence and verbal affirmation of the individual perceptions of key informants with regard to how the range of clinical areas scored against the best practice benchmarks. Scores of policy compliance in some trusts audited ranged from 1 (non-compliant) to 5 (full compliance). The results demonstrate that many trusts are making good efforts to ensure full compliance with policy guidelines and mandates. However, there are some aspects of policy standards that trusts have yet to fully embrace. This initial benchmarking exercise on behalf of an English SHA has revealed many areas of outstanding and good practice which have the potential to be shared.

  12. Improving pest risk assessment and management through the aid of geospatial information technology standards

    Directory of Open Access Journals (Sweden)

    Trond Rafoss

    2013-09-01

    Delivery of geospatial information over the Internet for the management of risks from invasive alien species is an increasingly important service. The evolution of information technology standards for geospatial data is a key factor in simplifying network publishing and exchange of maps and data. The World Wide Web Consortium (W3C) geolocation specification is a recent addition that may prove useful for pest risk management. In this article we implement the W3C geolocation specification and Open Geospatial Consortium (OGC) mapping standards in a Web browser application for smartphones and tablet computers to improve field surveys for alien invasive species. We report our first season of field experiences using this tool for online mapping of plant disease outbreaks and host plant occurrence. It is expected that the improved field data collection tools will result in increased data availability and thereby new opportunities for risk assessment, because data needs and availability are crucial for species distribution modelling and model-based forecasts of pest establishment potential. Finally, we close with a comment on the future potential of geospatial information standards to enhance the translation from data to decisions regarding pest risks, which should enable earlier detection of emerging risks as well as more robust projections of pest risks in novel areas. The forthcoming standard for processing of geospatial information, the Web Processing Service (WPS), should open new technological capabilities both for automatic initiation and updating of risk assessment models based on new incoming data, and for subsequent early warning.
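As an illustration of the OGC mapping standards mentioned above, the sketch below builds a WMS 1.3.0 GetMap request URL of the kind such a field-mapping client might issue; the endpoint and layer name are hypothetical, while the query parameter names follow the WMS specification.

```python
# Hedged sketch of an OGC WMS 1.3.0 GetMap request URL.
# The base URL and layer name are invented for illustration.

from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    """Build a WMS 1.3.0 GetMap URL for one layer over a bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"

url = wms_getmap_url(
    "https://example.org/wms",       # hypothetical endpoint
    "pest_observations",             # hypothetical layer
    (58.0, 4.0, 71.0, 32.0),         # lat/lon bbox (EPSG:4326 axis order)
)
```

Because the request is just a standardized URL, any WMS-compliant server can render the same layer for a browser client, which is what makes the standards useful for lightweight field tools.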

  13. Phasic Electrodermal Activity During the Standardized Assessment of Concussion (SAC).

    Science.gov (United States)

    Raikes, Adam C; Schaefer, Sydney Y

    2016-07-01

    The long-term effects of concussion on brain function during cognitive tasks are not fully understood and neuroimaging findings are equivocal. Some images show hyperactivation of prefrontal brain regions in previously concussed individuals relative to controls, suggesting increased cognitive resource allocation. Others show prefrontal hypoactivation and hyperactivation in other regions as a presumed compensatory mechanism. Given the relationship between sympathetic arousal and neural activation, physiologic measures of arousal, such as electrodermal activity, may provide additional insight into the brain's functional changes in those with a history of concussion. To quantify differences in electrodermal activity during a commonly used standardized neurocognitive assessment between individuals with or without a history of concussion. Descriptive laboratory study. Research laboratory. Seven asymptomatic individuals with a self-reported history of physician-diagnosed, sport-related concussion (number of previous concussions = 1.43 ± 0.53; time since most recent concussion = 0.75 to 6 years, median = 3 years) and 10 individuals without a history of concussion participated in this study. All participants wore bilateral wrist electrodermal activity sensors during the Standardized Assessment of Concussion. We measured normalized phasic (reactive) electrodermal activity during each test element (orientation, immediate recall, concentration, delayed recall). A significant group-by-test element interaction was present (P = .003). Individuals with a history of concussion had greater phasic activity during delayed recall (P concussed individuals relative to healthy control participants, supporting previous neuroimaging findings of increased prefrontal cortex activity during memory tasks after concussion. Given similar task performance and arousal patterns across the test, our results suggest that previously concussed individuals incur additional cognitive demands in a short

  14. Mathematical models of cytotoxic effects in endpoint tumor cell line assays: critical assessment of the application of a single parametric value as a standard criterion to quantify the dose-response effects and new unexplored proposal formats.

    Science.gov (United States)

    Calhelha, Ricardo C; Martínez, Mireia A; Prieto, M A; Ferreira, Isabel C F R

    2017-10-23

    The development of convenient tools for describing and quantifying the effects of standard and novel therapeutic agents is essential for the research community, to perform more precise evaluations. Although mathematical models and quantification criteria have been exchanged in the last decade between different fields of study, there are relevant methodologies that lack proper mathematical descriptions and standard criteria to quantify their responses. Therefore, part of the relevant information that can be drawn from the experimental results obtained and the quantification of its statistical reliability are lost. Despite its relevance, there is not a standard form for the in vitro endpoint tumor cell lines' assays (TCLA) that enables the evaluation of the cytotoxic dose-response effects of anti-tumor drugs. The analysis of all the specific problems associated with the diverse nature of the available TCLA used is unfeasible. However, since most TCLA share the main objectives and similar operative requirements, we have chosen the sulforhodamine B (SRB) colorimetric assay for cytotoxicity screening of tumor cell lines as an experimental case study. In this work, the common biological and practical non-linear dose-response mathematical models are tested against experimental data and, following several statistical analyses, the model based on the Weibull distribution was confirmed as the convenient approximation to test the cytotoxic effectiveness of anti-tumor compounds. Then, the advantages and disadvantages of all the different parametric criteria derived from the model, which enable the quantification of the dose-response drug-effects, are extensively discussed. Therefore, model and standard criteria for easily performing the comparisons between different compounds are established. The advantages include a simple application, provision of parametric estimations that characterize the response as standard criteria, economization of experimental effort and enabling
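A minimal sketch of a Weibull-type cumulative dose-response model of the kind the abstract refers to; the parameterization and the parameter values below are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative (hedged) Weibull-type dose-response model for cytotoxicity:
#   R(D) = K * (1 - exp(-(D / a) ** b))
# where K is the maximal response, a a scale (potency) parameter and
# b a shape parameter. Parameter values are invented for illustration.

import math

def weibull_response(dose, K=1.0, a=10.0, b=2.0):
    """Fraction of maximal cytotoxic response at a given dose."""
    return K * (1.0 - math.exp(-((dose / a) ** b)))

def ed50(a=10.0, b=2.0):
    """Dose giving half-maximal response (closed form for this model)."""
    return a * math.log(2.0) ** (1.0 / b)
```

Parametric criteria such as the half-maximal dose follow directly from the fitted parameters, which is the kind of standard quantification criterion the abstract argues for.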

  15. The Beyond the Standard Model Working Group: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Rizzo, Thomas G.

    2002-08-08

    Various theoretical aspects of physics beyond the Standard Model at hadron colliders are discussed. Our focus will be on those issues that most immediately impact the projects pursued as part of the BSM group at this meeting.

  16. Workshop on What Comes Beyond the Standard Model?

    CERN Document Server

    Borstnik, N M; Nielsen, Holger Bech; Froggatt, Colin D

    1999-01-01

    These Proceedings collect the results of ten days of discussions on the open questions of the standard electroweak model, as well as a review of the introductory talks connected with the discussions.

  17. Modern elementary particle physics explaining and extending the standard model

    CERN Document Server

    Kane, Gordon

    2017-01-01

    This book is written for students and scientists wanting to learn about the Standard Model of particle physics. Only an introductory course knowledge about quantum theory is needed. The text provides a pedagogical description of the theory, and incorporates the recent Higgs boson and top quark discoveries. With its clear and engaging style, this new edition retains its essential simplicity. Long and detailed calculations are replaced by simple approximate ones. It includes introductions to accelerators, colliders, and detectors, and several main experimental tests of the Standard Model are explained. Descriptions of some well-motivated extensions of the Standard Model prepare the reader for new developments. It emphasizes the concepts of gauge theories and Higgs physics, electroweak unification and symmetry breaking, and how force strengths vary with energy, providing a solid foundation for those working in the field, and for those who simply want to learn about the Standard Model.

  18. Tests of the standard electroweak model in beta decay

    Energy Technology Data Exchange (ETDEWEB)

    Severijns, N.; Beck, M. [Universite Catholique de Louvain (UCL), Louvain-la-Neuve (Belgium); Naviliat-Cuncic, O. [Caen Univ., CNRS-ENSI, 14 (France). Lab. de Physique Corpusculaire

    2006-05-15

    We review the current status of precision measurements in allowed nuclear beta decay, including neutron decay, with emphasis on their potential to look for new physics beyond the standard electroweak model. The experimental results are interpreted in the framework of phenomenological model-independent descriptions of nuclear beta decay as well as in some specific extensions of the standard model. The values of the standard couplings and the constraints on the exotic couplings of the general beta decay Hamiltonian are updated. For the ratio between the axial and the vector couplings we obtain C_A/C_V = -1.26992(69) under the standard model assumptions. Particular attention is devoted to the discussion of the sensitivity and complementarity of different precision experiments in direct beta decay. The prospects and the impact of recent developments of precision tools and of high intensity low energy beams are also addressed.

  19. Standard model status (in search of "new physics")

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1993-03-01

    A perspective on successes and shortcomings of the standard model is given. The complementarity between direct high energy probes of new physics and lower energy searches via precision measurements and rare reactions is described. Several illustrative examples are discussed.

  20. CP violation and electroweak baryogenesis in the Standard Model

    Directory of Open Access Journals (Sweden)

    Brauner Tomáš

    2014-04-01

    One of the major unresolved problems in current physics is understanding the origin of the observed asymmetry between matter and antimatter in the Universe. It has become a common lore to claim that the Standard Model of particle physics cannot produce sufficient asymmetry to explain the observation. Our results suggest that this conclusion can be alleviated in the so-called cold electroweak baryogenesis scenario. On the Standard Model side, we continue the program initiated by Smit eight years ago; one derives the effective CP-violating action for the Standard Model bosons and uses the resulting effective theory in numerical simulations. We address a disagreement between two previous computations performed effectively at zero temperature, and demonstrate that it is very important to include temperature effects properly. Our conclusion is that the cold electroweak baryogenesis scenario within the Standard Model is tightly constrained, yet producing enough baryon asymmetry using just known physics still seems possible.

  1. Overview of the Higgs and Standard Model physics at ATLAS

    CERN Document Server

    Vazquez Schroeder, Tamara; The ATLAS collaboration

    2018-01-01

    This talk presents selected aspects of recent physics results from the ATLAS collaboration in the Standard Model and Higgs sectors, with a focus on the recent evidence for the associated production of the Higgs boson and a top quark pair.

  2. Enhancements to ASHRAE Standard 90.1 Prototype Building Models

    Energy Technology Data Exchange (ETDEWEB)

    Goel, Supriya; Athalye, Rahul A.; Wang, Weimin; Zhang, Jian; Rosenberg, Michael I.; Xie, YuLong; Hart, Philip R.; Mendon, Vrushali V.

    2014-04-16

    This report focuses on enhancements to prototype building models used to determine the energy impact of various versions of ANSI/ASHRAE/IES Standard 90.1. Since the last publication of the prototype building models, PNNL has made numerous enhancements to the original prototype models compliant with the 2004, 2007, and 2010 editions of Standard 90.1. Those enhancements are described here and were made for several reasons: (1) to change or improve prototype design assumptions; (2) to improve the simulation accuracy; (3) to improve the simulation infrastructure; and (4) to add additional detail to the models needed to capture certain energy impacts from Standard 90.1 improvements. These enhancements impact simulated prototype energy use, and consequently impact the savings estimated from edition to edition of Standard 90.1.

  3. Standards for the assessment of salivary glands – an update

    Directory of Open Access Journals (Sweden)

    Piotr Zajkowski

    2016-06-01

    The paper is an update of 2011 Standards for Ultrasound Assessment of Salivary Glands, which were developed by the Polish Ultrasound Society. We have described current ultrasound technical requirements, assessment and measurement techniques as well as guidelines for ultrasound description. We have also discussed an ultrasound image of normal salivary glands as well as the most important pathologies, such as inflammation, sialosis, collagenosis, injuries and proliferative processes, with particular emphasis on lesions indicating high risk of malignancy. In acute bacterial inflammation, the salivary glands appear as hypoechoic, enlarged or normal-sized, with increased parenchymal flow. The echogenicity is significantly increased in viral infections. Degenerative lesions may be seen in chronic inflammations. Hyperechoic deposits with acoustic shadowing can be visualized in lithiasis. Parenchymal fibrosis is a dominant feature of sialosis. Sjögren syndrome produces different pictures of salivary gland parenchymal lesions at different stages of the disease. Pleomorphic adenomas are usually hypoechoic, well-defined and polycyclic in most cases. Warthin tumor usually presents as a hypoechoic, oval-shaped lesion with anechoic cystic spaces. Malignancies are characterized by blurred outlines, irregular shape, usually heterogeneous echogenicity and pathological neovascularization. The accompanying metastatic lesions are another indicator of malignancy, however, final diagnosis should be based on biopsy findings.

  4. Almost-commutative geometries beyond the standard model

    International Nuclear Information System (INIS)

    Stephan, Christoph A

    2006-01-01

    In Iochum et al (2004 J. Math. Phys. 45 5003), Jureit and Stephan (2005 J. Math. Phys. 46 043512), Schuecker T (2005 Preprint hep-th/0501181) and Jureit et al (2005 J. Math. Phys. 46 072303), a conjecture is presented that almost-commutative geometries, with respect to sensible physical constraints, allow only the standard model of particle physics and electro-strong models as Yang-Mills-Higgs theories. In this paper, a counter-example will be given. The corresponding almost-commutative geometry leads to a Yang-Mills-Higgs model which consists of the standard model of particle physics and two new fermions of opposite electro-magnetic charge. This is the second Yang-Mills-Higgs model within noncommutative geometry, after the standard model, which could be compatible with experiments. Combined to a hydrogen-like composite particle, these new particles provide a novel dark matter candidate.

  5. Alignment between South African mathematics assessment standards and the TIMSS assessment frameworks

    Directory of Open Access Journals (Sweden)

    Mdutshekelwa Ndlovu

    2012-12-01

    South Africa’s performance in international benchmark tests is a major cause for concern amongst educators and policymakers, raising questions about the effectiveness of the curriculum reform efforts of the democratic era. The purpose of the study reported in this article was to investigate the degree of alignment between the TIMSS 2003 Grade 8 Mathematics assessment frameworks and the Revised National Curriculum Statements (RNCS) assessment standards for Grade 8 Mathematics, later revised to become the Curriculum and Assessment Policy Statements (CAPS). Such an investigation could help to partly shed light on why South African learners do not perform well and point out discrepancies that need to be attended to. The methodology of document analysis was adopted for the study, with the RNCS and the TIMSS 2003 Grade 8 Mathematics frameworks forming the principal documents. Porter’s moderately complex index of alignment was adopted for its simplicity. The computed index of 0.751 for the alignment between the RNCS assessment standards and the TIMSS assessment objectives was found to be statistically significantly low, at the alpha level of 0.05, according to Fulmer’s critical values for 20 cells and 90 or 120 standard points. The study suggests that inadequate attention has been paid to the alignment of the South African mathematics curriculum to the successive TIMSS assessment frameworks in terms of the cognitive level descriptions. The study recommends that participation in TIMSS should rigorously and critically inform ongoing curriculum reform efforts.
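    Porter's alignment index used in this study has a simple closed form: P = 1 - (Σᵢ |xᵢ - yᵢ|)/2, where xᵢ and yᵢ are the proportions of content points falling in cell i of the two matrices being compared (e.g. curriculum standards vs. assessment objectives). A minimal sketch, with made-up cell counts for illustration (the real study used 20 cells and 90 or 120 standard points):

    ```python
    # Porter's alignment index: P = 1 - (sum_i |x_i - y_i|) / 2,
    # where x_i, y_i are cell proportions of two content matrices.
    # The cell counts below are hypothetical, for illustration only.

    def porter_alignment(counts_a, counts_b):
        """Return Porter's alignment index for two matched cell-count lists."""
        total_a, total_b = sum(counts_a), sum(counts_b)
        discrepancy = sum(
            abs(a / total_a - b / total_b) for a, b in zip(counts_a, counts_b)
        )
        return 1 - discrepancy / 2

    # Two hypothetical 2x2 content-by-cognitive-level matrices, flattened:
    standards = [10, 20, 30, 40]
    assessment = [15, 25, 25, 35]
    print(round(porter_alignment(standards, assessment), 3))  # 0.9
    ```

    The index is 1 for perfectly aligned proportion matrices and approaches 0 as the two distributions diverge, which is why the study's value of 0.751 is compared against Fulmer's critical values rather than an absolute threshold.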

  7. Standard Model Higgs boson searches with the ATLAS detector at ...

    Indian Academy of Sciences (India)

    experimental results on the search for the Standard Model Higgs boson with 1 to 2 fb⁻¹ of proton– ... expectations from Standard Model processes, and the production of a Higgs boson is excluded at 95% Confidence Level for the mass ... lνlν and H → ZZ(∗) → 4l, llνν as they play important roles in setting the overall result.

  8. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  9. Neutrinos from the Early Universe and physics beyond standard models

    Directory of Open Access Journals (Sweden)

    Kirilova Daniela

    2015-01-01

    Neutrino oscillations present the only robust example of experimentally detected physics beyond the standard model. This review discusses the established and several hypothetical beyond-standard-model neutrino characteristics and their cosmological effects and constraints. Particularly, the contemporary cosmological constraints on the number of neutrino families, neutrino mass differences and mixing, lepton asymmetry in the neutrino sector, neutrino masses, and light sterile neutrinos are briefly reviewed.

  10. The Standard Model from LHC to future colliders

    Energy Technology Data Exchange (ETDEWEB)

    Forte, S.; Ferrera, G.; Vicini, A. [Universita di Milano, Dipartimento di Fisica, Milan (Italy); INFN, Sezione di Milano, Milan (Italy); Nisati, A. [INFN, Sezione di Roma, Rome (Italy); Passarino, G.; Magnea, L. [Universita di Torino, Dipartimento di Fisica, Turin (Italy); INFN, Sezione di Torino, Turin (Italy); Tenchini, R. [INFN, Sezione di Pisa, Pisa (Italy); Calame, C.M.C. [Universita di Pavia, Dipartimento di Fisica, Pavia (Italy); Chiesa, M.; Nicrosini, O.; Piccinini, F. [INFN, Sezione di Pavia, Pavia (Italy); Cobal, M. [Universita di Udine, Dipartimento di Chimica, Fisica e Ambiente, Udine (Italy); INFN, Gruppo Collegato di Udine, Udine (Italy); Corcella, G. [INFN, Laboratori Nazionali di Frascati, Frascati (Italy); Degrassi, G. [Universita' Roma Tre, Dipartimento di Matematica e Fisica, Rome (Italy); INFN, Sezione di Roma Tre, Rome (Italy); Maltoni, F. [Universite Catholique de Louvain, Centre for Cosmology, Particle Physics and Phenomenology (CP3), Louvain-la-Neuve (Belgium); Montagna, G. [Universita di Pavia, Dipartimento di Fisica, Pavia (Italy); INFN, Sezione di Pavia, Pavia (Italy); Nason, P. [INFN, Sezione di Milano-Bicocca, Milan (Italy); Oleari, C. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN, Sezione di Milano-Bicocca, Milan (Italy); Riva, F. [Ecole Polytechnique Federale de Lausanne, Institut de Theorie des Phenomenes Physiques, Lausanne (Switzerland)

    2015-11-15

    This review summarizes the results of the activities which have taken place in 2014 within the Standard Model Working Group of the "What Next" Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issues for Standard Model physics in the LHC era and in view of possible future accelerators.

  11. Exploring standardized precipitation evapotranspiration index for drought assessment in Bangladesh.

    Science.gov (United States)

    Miah, Md Giashuddin; Abdullah, Hasan Muhammad; Jeong, Changyoon

    2017-10-09

    Drought is a critical issue, and it has a pressing, negative impact on agriculture, ecosystems, livelihoods, food security, and sustainability. The problem has been studied globally, but its regional or even local dimension is sometimes overlooked. Local-level drought assessment is necessary for developing adaptation and mitigation strategies for that particular region. With this in mind, an attempt was made to create a detailed assessment of drought characteristics at the local scale in Bangladesh. The standardized precipitation evapotranspiration index (SPEI) is a relatively new drought index that mainly considers the rainfall and evapotranspiration data set. Globally, SPEI has become a useful drought index, but its local-scale application is not common. SPEIbase (0.5° grid data) for 110 years (1901-2011) was utilized to overcome the lack of long-term climate data in Bangladesh. Available weather data (1955-2011) from the Bangladesh Meteorology Department (BMD) were analyzed to calculate station-level SPEI using the SPEI calculator. The drivers for climate change-induced droughts were characterized by residual temperature and residual rainfall data from different BMD stations. Grid data (SPEIbase) of 26 stations of BMD were used for drought mapping. The findings revealed that the frequency and intensity of drought are higher in the northwestern part of the country, which makes it vulnerable to both extreme and severe droughts. Based on the results, the SPEI-based drought intensity and frequency analyses were carried out, emphasizing Rangpur (northwest region) as a hot spot, to get an insight into drought assessment in Bangladesh. The findings of this study revealed that SPEI could be a valuable tool to understand the evolution and evaluation of drought induced by climate change in the country. The study also justified the immediate need for drought risk reduction strategies that should lead to relevant policy formulations and agricultural innovations for developing
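    The SPEI referenced in this record is computed from the climatic water balance (precipitation minus potential evapotranspiration), whose fitted distribution is mapped to standard-normal quantiles. A simplified sketch of the idea, replacing the log-logistic fit of the real SPEI with a plain z-score and using invented monthly data:

    ```python
    import statistics

    # Simplified sketch of a standardized drought index. The real SPEI fits
    # a log-logistic distribution to the climatic water balance D = P - PET
    # and maps its CDF to standard-normal quantiles; here the standardization
    # is approximated with a plain z-score. All monthly values are invented.

    precipitation = [120, 95, 60, 30, 10, 5, 8, 20, 45, 80, 100, 110]          # mm
    evapotranspiration = [60, 65, 80, 100, 130, 145, 137, 125, 95, 75, 65, 60]  # mm

    balance = [p - pet for p, pet in zip(precipitation, evapotranspiration)]
    mean, stdev = statistics.mean(balance), statistics.pstdev(balance)

    index = [(d - mean) / stdev for d in balance]
    driest = min(range(len(index)), key=index.__getitem__)
    print(f"driest month: {driest + 1}, index = {index[driest]:.2f}")
    ```

    Negative index values mark drier-than-average months; in the real SPEI, values below roughly -1.5 and -2 are conventionally read as severe and extreme drought.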

  12. The Assessment Cycle: A Model for Learning through Peer Assessment

    Science.gov (United States)

    Reinholz, Daniel

    2016-01-01

    This paper advances a model describing how peer assessment supports self-assessment. Although prior research demonstrates that peer assessment promotes self-assessment, the connection between these two activities is underspecified. This model, the assessment cycle, draws from theories of self-assessment to elaborate how learning takes place…

  13. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models of the kind found to be applicable in PSA analyses.
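    The mixture-model treatment of model uncertainty described in this record can be sketched as a belief-weighted average over candidate reliability models. A minimal illustration with two hypothetical constant-failure-intensity models; the rates and weights are invented and not taken from the report:

    ```python
    import math

    # Sketch of a mixture model for model uncertainty: the predictive failure
    # probability is a weighted average over candidate models, with weights
    # expressing the degree of belief in each model. All numbers are invented.

    models = [
        {"rate": 1e-4, "weight": 0.7},  # "optimistic" failure intensity, per hour
        {"rate": 5e-4, "weight": 0.3},  # "pessimistic" failure intensity, per hour
    ]

    def mixture_failure_prob(t_hours):
        """P(failure by t) = sum_k w_k * (1 - exp(-lambda_k * t))."""
        return sum(
            m["weight"] * (1 - math.exp(-m["rate"] * t_hours)) for m in models
        )

    print(f"P(failure within 1000 h) = {mixture_failure_prob(1000):.4f}")
    ```

    The mixture predictive lies between the predictions of the individual models, and updating the weights against observed data is what propagates the model uncertainty through the risk assessment.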

  14. Assessment and Next Generation Standards: An Interview with Olivia Gude

    Science.gov (United States)

    Sweeny, Robert

    2014-01-01

    This article provides a transcript of an interview with Olivia Gude, member of the National Coalition for Core Arts Standards Writing Team. In the interview, Gude provides an overview of the process for writing the new visual arts standards.

  15. Assessing the standard Molybdenum projector augmented wave VASP potentials

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Ann E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Multi-Scale Science

    2014-07-01

    Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.

  16. Assessing the costs and benefits of US renewable portfolio standards

    International Nuclear Information System (INIS)

    Wiser, Ryan; Mai, Trieu T.; Millstein, Dev; Barbose, Galen; Bird, Lori A.

    2017-01-01

    Renewable portfolio standards (RPS) exist in 29 US states and the District of Columbia. This article summarizes the first national-level, integrated assessment of the future costs and benefits of existing RPS policies; the same metrics are evaluated under a second scenario in which widespread expansion of these policies is assumed to occur. Depending on assumptions about renewable energy technology advancement and natural gas prices, existing RPS policies increase electric system costs by as much as 31 billion dollars, on a present-value basis over 2015-2050. The expanded renewable deployment scenario yields incremental costs that range from 23 billion to 194 billion dollars, depending on the assumptions employed. The monetized value of improved air quality and reduced climate damages exceed these costs. Using central assumptions, existing RPS policies yield 97 billion dollars in air-pollution health benefits and 161 billion dollars in climate damage reductions. Under the expanded RPS case, health benefits total 558 billion dollars and climate benefits equal 599 billion dollars. These scenarios also yield benefits in the form of reduced water use. RPS programs are not likely to represent the most cost effective path towards achieving air quality and climate benefits. Nonetheless, the findings suggest that US RPS programs are, on a national basis, cost effective when considering externalities.

  17. Assessing the costs and benefits of US renewable portfolio standards

    Science.gov (United States)

    Wiser, Ryan; Mai, Trieu; Millstein, Dev; Barbose, Galen; Bird, Lori; Heeter, Jenny; Keyser, David; Krishnan, Venkat; Macknick, Jordan

    2017-09-01

    Renewable portfolio standards (RPS) exist in 29 US states and the District of Columbia. This article summarizes the first national-level, integrated assessment of the future costs and benefits of existing RPS policies; the same metrics are evaluated under a second scenario in which widespread expansion of these policies is assumed to occur. Depending on assumptions about renewable energy technology advancement and natural gas prices, existing RPS policies increase electric system costs by as much as $31 billion, on a present-value basis over 2015-2050. The expanded renewable deployment scenario yields incremental costs that range from $23 billion to $194 billion, depending on the assumptions employed. The monetized value of improved air quality and reduced climate damages exceed these costs. Using central assumptions, existing RPS policies yield $97 billion in air-pollution health benefits and $161 billion in climate damage reductions. Under the expanded RPS case, health benefits total $558 billion and climate benefits equal $599 billion. These scenarios also yield benefits in the form of reduced water use. RPS programs are not likely to represent the most cost effective path towards achieving air quality and climate benefits. Nonetheless, the findings suggest that US RPS programs are, on a national basis, cost effective when considering externalities.
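    The cost and benefit totals quoted in these records are present values over 2015-2050. The discounting arithmetic behind such totals can be sketched as follows; the annual stream and the 3% discount rate are hypothetical and not taken from the study:

    ```python
    # Sketch of present-value accounting: discount an annual cost (or benefit)
    # stream over 2015-2050 back to a 2015 base year. The $1B/year stream and
    # the 3% discount rate are hypothetical.

    def present_value(annual_values, rate, start_year, base_year):
        """Discount annual values (one per year from start_year) to base_year."""
        return sum(
            value / (1 + rate) ** (start_year + i - base_year)
            for i, value in enumerate(annual_values)
        )

    annual_cost = [1.0] * 36  # $1 billion per year over 2015-2050
    pv = present_value(annual_cost, rate=0.03, start_year=2015, base_year=2015)
    print(f"present value: ${pv:.1f} billion")
    ```

    The choice of discount rate matters a great deal for comparisons like the ones above: climate benefits accrue late in the period, so a higher rate shrinks them relative to near-term compliance costs.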

  18. Precision tests of the standard model at LEP

    International Nuclear Information System (INIS)

    Mele, Barbara; Universita La Sapienza, Rome

    1994-01-01

    Recent LEP results on electroweak precision measurements are reviewed. Line-shape and asymmetry analyses on the Z⁰ peak are described. Then, the consistency of the Standard Model predictions with experimental data and the consequent limits on the top mass are discussed. Finally, the possibility of extracting information and constraints on new theoretical models from present data is examined.

  19. Open standard CMO for parametric modelling based on semantic web

    NARCIS (Netherlands)

    Bonsma, P.; Bonsma, I.; Zayakova, T.; Van Delft, A.; Sebastian, R.; Böhms, M.

    2015-01-01

    The Open Standard Concept Modelling Ontology (CMO) with Extensions makes it possible to store parametric modelling semantics and parametric geometry in a Semantic Web environment. The parametric and geometrical part of CMO with Extensions is developed within the EU project Proficient. The nature of

  20. Standard model for safety analysis report of fuel reprocessing plants

    International Nuclear Information System (INIS)

    1979-12-01

    A standard model for a safety analysis report of fuel reprocessing plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force.

  1. Standard model for safety analysis report of fuel fabrication plants

    International Nuclear Information System (INIS)

    1980-09-01

    A standard model for a safety analysis report of fuel fabrication plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force.

  2. Informatics in radiology: an information model of the DICOM standard.

    Science.gov (United States)

    Kahn, Charles E; Langlotz, Curtis P; Channin, David S; Rubin, Daniel L

    2011-01-01

    The Digital Imaging and Communications in Medicine (DICOM) Standard is a key foundational technology for radiology. However, its complexity creates challenges for information system developers because the current DICOM specification requires human interpretation and is subject to nonstandard implementation. To address this problem, a formally sound and computationally accessible information model of the DICOM Standard was created. The DICOM Standard was modeled as an ontology, a machine-accessible and human-interpretable representation that may be viewed and manipulated by information-modeling tools. The DICOM Ontology includes a real-world model and a DICOM entity model. The real-world model describes patients, studies, images, and other features of medical imaging. The DICOM entity model describes connections between real-world entities and the classes that model the corresponding DICOM information entities. The DICOM Ontology was created to support the Cancer Biomedical Informatics Grid (caBIG) initiative, and it may be extended to encompass the entire DICOM Standard and serve as a foundation of medical imaging systems for research and patient care. RSNA, 2010
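    The real-world model described in this record (patients, studies, images) follows the familiar DICOM information hierarchy. A minimal stdlib-only sketch of that hierarchy as plain dataclasses; the class and field names are illustrative and not drawn from the DICOM Ontology itself:

    ```python
    # Sketch of the DICOM "real-world model" hierarchy
    # Patient -> Study -> Series -> Image, as plain dataclasses.
    # Names are illustrative, not taken from the DICOM Ontology.
    from dataclasses import dataclass, field

    @dataclass
    class Image:
        sop_instance_uid: str

    @dataclass
    class Series:
        series_instance_uid: str
        modality: str
        images: list = field(default_factory=list)

    @dataclass
    class Study:
        study_instance_uid: str
        series: list = field(default_factory=list)

    @dataclass
    class Patient:
        patient_id: str
        studies: list = field(default_factory=list)

    # Build a tiny example hierarchy and count images per patient.
    patient = Patient("P001", [Study("1.2.3", [Series("1.2.3.4", "CT",
                      [Image("1.2.3.4.5"), Image("1.2.3.4.6")])])])
    n_images = sum(len(s.images) for st in patient.studies for s in st.series)
    print(n_images)  # 2
    ```

    An ontology version of the same model would additionally make the relationships between these classes machine-queryable, which is what lets tools validate nonstandard DICOM implementations against the specification.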

  3. Physics beyond the standard model and cosmological connections ...

    Indian Academy of Sciences (India)

    between collider physics and cosmology and how collider searches for dark matter candidates in supersymmetry and other models can lead us to a determination of dark matter parameters and how this precision information may influence cosmology. This paper presents a summary of the work on beyond standard model.

  4. Conformal Extensions of the Standard Model with Veltman Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Mojaza, Matin; Sannino, Francesco

    2014-01-01

    the Higgs is predicted to have the experimental value of the mass equal to 126 GeV. This model also predicts the existence of one more standard model singlet scalar boson with a mass of 541 GeV and the Higgs self-coupling to emerge radiatively. We study several other PNC examples that generally predict a somewhat smaller mass of the Higgs to the perturbative order we have investigated them. Our results can be a useful guide when building extensions of the standard model featuring fundamental scalars.

  5. ATLAS Z Excess in Minimal Supersymmetric Standard Model

    International Nuclear Information System (INIS)

    Lu, Xiaochuan; Terada, Takahiro

    2015-06-01

    Recently the ATLAS collaboration reported a 3 sigma excess in the search for the events containing a dilepton pair from a Z boson and large missing transverse energy. Although the excess is not sufficiently significant yet, it is quite tempting to explain this excess by a well-motivated model beyond the standard model. In this paper we study a possibility of the minimal supersymmetric standard model (MSSM) for this excess. Especially, we focus on the MSSM spectrum where the sfermions are heavier than the gauginos and Higgsinos. We show that the excess can be explained by the reasonable MSSM mass spectrum.

  6. Search for Higgs Bosons Beyond the Standard Model

    CERN Document Server

    Mankel, Rainer

    2015-01-01

    While the existence of a Higgs boson with a mass near 125 GeV has been clearly established, the detailed structure of the entire Higgs sector is yet unclear. Besides the standard model interpretation, various possibilities for extended Higgs sectors are being considered. Such options include the minimal and next-to-minimal supersymmetric extensions (MSSM and NMSSM) of the standard model, more generic Two-Higgs-Doublet models (2HDM), as well as truly exotic Higgs bosons decaying e.g. into totally invisible final states. The talk presents recent results from the CMS experiment.

  7. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions.

  8. Search for Higgs bosons beyond the Standard Model

    Directory of Open Access Journals (Sweden)

    Mankel Rainer

    2015-01-01

    While the existence of a Higgs boson with a mass near 125 GeV has been clearly established, the detailed structure of the entire Higgs sector is yet unclear. Beyond the standard model interpretation, various scenarios for extended Higgs sectors are being considered. Such options include the minimal and next-to-minimal supersymmetric extensions (MSSM and NMSSM) of the standard model, more generic Two-Higgs-Doublet models (2HDM), as well as truly exotic Higgs bosons decaying e.g. into totally invisible final states. This article presents recent results from the CMS experiment.

  9. Precision calculations in supersymmetric extensions of the Standard Model

    International Nuclear Information System (INIS)

    Slavich, P.

    2013-01-01

    This dissertation is organized as follows: in the next chapter I will summarize the structure of the supersymmetric extensions of the Standard Model (SM), namely the MSSM (Minimal Supersymmetric Standard Model) and the NMSSM (Next-to-Minimal Supersymmetric Standard Model); I will provide a brief overview of different patterns of supersymmetry (SUSY) breaking and discuss some issues in the renormalization of the input parameters that are common to all calculations of higher-order corrections in SUSY models. In chapter 3 I will review and describe computations on the production of MSSM Higgs bosons in gluon fusion. In chapter 4 I will review results on the radiative corrections to the Higgs boson masses in the NMSSM. In chapter 5 I will review the calculation of BR(B → X_sγ) in the MSSM with Minimal Flavor Violation (MFV). Finally, in chapter 6 I will briefly summarize the outlook of my future research.

  10. The Effective Standard Model after LHC Run I

    CERN Document Server

    Ellis, John; You, Tevong

    2015-01-01

    We treat the Standard Model as the low-energy limit of an effective field theory that incorporates higher-dimensional operators to capture the effects of decoupled new physics. We consider the constraints imposed on the coefficients of dimension-6 operators by electroweak precision tests (EWPTs), applying a framework for the effects of dimension-6 operators on electroweak precision tests that is more general than the standard $S,T$ formalism, and use measurements of Higgs couplings and the kinematics of associated Higgs production at the Tevatron and LHC, as well as triple-gauge couplings at the LHC. We highlight the complementarity between EWPTs, Tevatron and LHC measurements in obtaining model-independent limits on the effective Standard Model after LHC Run~1. We illustrate the combined constraints with the example of the two-Higgs doublet model.

  11. The effective Standard Model after LHC Run I

    International Nuclear Information System (INIS)

    Ellis, John; Sanz, Verónica; You, Tevong

    2015-01-01

    We treat the Standard Model as the low-energy limit of an effective field theory that incorporates higher-dimensional operators to capture the effects of decoupled new physics. We consider the constraints imposed on the coefficients of dimension-6 operators by electroweak precision tests (EWPTs), applying a framework for the effects of dimension-6 operators on electroweak precision tests that is more general than the standard S,T formalism, and use measurements of Higgs couplings and the kinematics of associated Higgs production at the Tevatron and LHC, as well as triple-gauge couplings at the LHC. We highlight the complementarity between EWPTs, Tevatron and LHC measurements in obtaining model-independent limits on the effective Standard Model after LHC Run 1. We illustrate the combined constraints with the example of the two-Higgs doublet model.

  12. When standards become business models: Reinterpreting "failure" in the standardization paradigm

    NARCIS (Netherlands)

    Hawkins, R.; Ballon, P.

    2007-01-01

    Purpose - This paper aims to explore the question: 'What is the relationship between standards and business models?' and illustrate the conceptual linkage with reference to developments in the mobile communications industry. Design/methodology/approach - A succinct overview of literature on

  13. Development of BMD-1 model standard pulse current generator

    International Nuclear Information System (INIS)

    Lai Bingquan

    1998-12-01

    The BMD-1 Model Standard Pulse Current Generator is a pulse current calibration instrument. It is used to calibrate current probes, current-probe amplifiers and other current measurement instruments. The generator uses a near-ideal current switch to transform a standard direct current into a standard pulse current. It provides a variable output current ranging from 1 mA to 1 A, with a current accuracy of ±(0.25% + 2 μA). Three output modes are provided: DC, single pulse, and repetitive pulses at frequencies from 10 Hz to 1 MHz, with pulse widths variable from 0.5 to 50 μs.

  14. Empirical generalization assessment of neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1995-01-01

    This paper addresses the assessment of generalization performance of neural network models by use of empirical techniques. We suggest using the cross-validation scheme combined with a resampling technique to obtain an estimate of the generalization performance distribution of a specific model. This enables the formulation of a bulk of new generalization performance measures. Numerical results demonstrate the viability of the approach compared to the standard technique of using algebraic estimates like the FPE. Moreover, we consider the problem of comparing the generalization performance of different competing models. Since all models are trained on the same data, a key issue is to take this dependency into account. The optimal split of the data set of size N into a cross-validation set of size Nγ and a training set of size N(1-γ) is discussed. Asymptotically (for large data sets), γopt → 1.
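
    The resampled cross-validation scheme of this abstract can be illustrated in a few lines. The following is a minimal sketch, not the authors' exact procedure: a toy linear model is refit on repeated random train/validation splits with validation fraction γ, and the collected validation errors form an empirical estimate of the generalization performance distribution. All data and parameter choices are hypothetical.

```python
import random
import statistics

def fit_linear(xs, ys):
    # Ordinary least squares for y = a*x + b, in closed form.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def cv_generalization_distribution(xs, ys, gamma=0.5, repeats=200, seed=0):
    """Repeatedly resample a train/validation split and record the
    validation MSE, giving an empirical distribution rather than a
    single point estimate of generalization error."""
    rng = random.Random(seed)
    idx = list(range(len(xs)))
    n_val = int(gamma * len(xs))
    mses = []
    for _ in range(repeats):
        rng.shuffle(idx)
        val, train = idx[:n_val], idx[n_val:]
        a, b = fit_linear([xs[i] for i in train], [ys[i] for i in train])
        mses.append(statistics.mean((ys[i] - (a * xs[i] + b)) ** 2 for i in val))
    return mses

# Noisy linear data; the spread of the MSEs, not only their mean, is the point.
data_rng = random.Random(1)
xs = [i / 10 for i in range(100)]
ys = [2 * x + 1 + data_rng.gauss(0, 0.1) for x in xs]
dist = cv_generalization_distribution(xs, ys)
```

    From `dist` one can read off quantiles or compare the error distributions of two competing models directly, which is the kind of measure the abstract refers to.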

  15. Higgs Phenomenology in the Standard Model and Beyond

    CERN Document Server

    Field, Bryan Jonathan; Dawson, Sally

    2005-01-01

    The way in which the electroweak symmetry is broken in nature is currently unknown. The electroweak symmetry is theoretically broken in the Standard Model by the Higgs mechanism, which generates masses for the particle content and introduces a single scalar to the particle spectrum, the Higgs boson. This particle has not yet been observed, and the value of its mass is a free parameter in the Standard Model. The observation of one (or more) Higgs bosons would confirm our understanding of the Standard Model. In this thesis, we study the phenomenology of the Standard Model Higgs boson and compare its production observables to those of the pseudoscalar Higgs boson and the lightest scalar Higgs boson of the Minimal Supersymmetric Standard Model. We study the production at both the Fermilab Tevatron and the future CERN Large Hadron Collider (LHC). In the first part of the thesis, we present the results of our calculations in the framework of perturbative QCD. In the second part, we present our resummed calculations.

  16. Standardized outcome assessment in brain injury rehabilitation for younger adults.

    Science.gov (United States)

    Turner-Stokes, L

    2002-05-10

    To explore possible candidates for a common outcome measure for brain injury rehabilitation in younger adults. Patients recovering from brain injury pass through several different stages of rehabilitation, illustrated by the 'Slinky model'. Outcome measures used to assess progress must not only meet scientific criteria for validity and reliability; they must also be practical to use in a clinical setting and relevant to the rehabilitation goals at each stage. Within most major rehabilitation settings, the commonest goals focus on reducing disability or dependency. Among the most widely used measures in the UK are the Barthel Index, the Functional Independence Measure (FIM) and the extended Functional Assessment Measure (FIM + FAM). The relationship between these instruments is discussed. No single outcome measure is suitable for all brain injury rehabilitation, but by taking these most widely used measures and understanding the relationship between them, we already have a potential common language of disability measurement between the majority of rehabilitation centres in the UK and beyond. These instruments, however, have clear floor and ceiling effects, and further work is needed to agree on common measures for rehabilitation intervention that falls outside the sensitivity range of these three scales.

  17. Physics beyond the Standard Model and Collider Phenomenology

    CERN Document Server

    Burikham, P

    2005-01-01

    We briefly review the Standard Model of particle physics, focusing on the gauge hierarchy problem and the naturalness problem regarding the stabilization of the light Higgs mass. We list the alternative models which address the hierarchy problem in addition to conventional Supersymmetric models and Composite models. They include extra dimensional models and Little Higgs models. We investigate the production of a heavy $W_{H}$ at the linear $e^{+}e^{-}$ collider at high centre-of-mass energies of 3 and 5 TeV using the Littlest Higgs model, where the global group is $SU(5)/SO(5)$. In certain regions of the parameter space, the heavy boson induced signals could be distinguishable from the Standard Model background. Based on tree-level open-string scattering amplitudes in the low string-scale scenario, we derive the massless fermion scattering amplitudes. The amplitudes are required to reproduce those of the Standard Model at tree level in the low energy limit. We then obtain four-fermion contact interactions by ex...

  18. Genetic Programming and Standardization in Water Temperature Modelling

    Directory of Open Access Journals (Sweden)

    Maritza Arganis

    2009-01-01

    Full Text Available An application of Genetic Programming (an evolutionary computational tool), without and with data standardization, is presented with the aim of modeling the behavior of the water temperature in a river in terms of meteorological variables that are easily measured, to explore their explanatory power and to emphasize the utility of the standardization of variables in order to reduce the effect of those with large variance. Recorded data corresponding to the water temperature behavior at the Ebro River, Spain, are used as the analysis case, showing a performance improvement in the developed model when data are standardized. This improvement is reflected in a reduction of the mean square error. Finally, the models obtained in this document were applied to estimate the water temperature in 2004, in order to provide evidence about their applicability to forecasting purposes.
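
    The effect the abstract attributes to standardization can be shown with a plain z-score transform; variables with large variance then no longer dominate the error measure an evolved expression is scored on. The variable names and values below are hypothetical, not the Ebro River data.

```python
import statistics

def standardize(values):
    """Z-score standardization: subtract the mean and divide by the
    (population) standard deviation, so every variable enters the
    model on a comparable scale."""
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values)
    return [(v - mu) / sigma for v in values]

# Hypothetical meteorological inputs with very different variances,
# e.g. air temperature (deg C) and solar radiation (W/m2).
air_temp = [14.2, 18.5, 21.3, 25.8, 30.1]
radiation = [120.0, 450.0, 800.0, 1020.0, 640.0]

z_temp = standardize(air_temp)
z_rad = standardize(radiation)
# Both standardized series now have mean 0 and unit variance.
```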

  19. Beyond the Standard Model Higgs searches at the LHC

    CERN Document Server

    Meridiani, P

    2015-01-01

    The Run I at the LHC marks the birth of the "Higgs physics", a path which will be followed at its full extent in the future runs of the LHC. Indeed there are two complementary paths to be followed to new physics in the Higgs sector: precision measurements of the Higgs properties (couplings, mass, spin and parity), where new physics can manifest as deviation from the Standard Model, or direct search for processes not foreseen in the Standard Model (Higgs decays not foreseen in the Standard Model, additional scalars which would indicate an extended Higgs sector). The current status of these studies at the LHC is presented, focussing in particular on the direct searches for rare or invisible Higgs decays or for an extended Higgs sector. The results are based on the analysis of the proton-proton collisions at 7 and 8 TeV center-of-mass energy at the LHC by the ATLAS and CMS collaborations.

  20. CP violation in the standard model and beyond

    International Nuclear Information System (INIS)

    Buras, A.J.

    1984-01-01

    The present status of CP violation in the standard six quark model is reviewed and a combined analysis with B-meson decays is presented. The theoretical uncertainties in the analysis are discussed, and the resulting KM weak mixing angles, the phase δ and the ratio ε'/ε are given as functions of τ_B, Γ(b → u)/Γ(b → c), m_t and the B parameter. For certain ranges of the values of these parameters the standard model is not capable of reproducing the experimental values of the ε' and ε parameters. Anticipating possible difficulties, we discuss various alternatives to the standard explanation of CP violation, such as horizontal interactions, left-right symmetric models and supersymmetry. CP violation outside the kaon system is also briefly discussed. (orig.)

  1. Search for the Standard Model Higgs Boson at LEP

    CERN Document Server

    CERN. Geneva

    2002-01-01

    The four LEP collaborations, ALEPH, DELPHI, L3 and OPAL, have collected 2465 pb-1 of e+e- collision data at energies between 189 and 209 GeV, of which 542 pb-1 were collected above 206 GeV. Searches for the Standard Model Higgs boson have been performed by each of the LEP collaborations. Their data have been combined and examined for their consistency with the Standard Model background and various Standard Model Higgs boson mass hypotheses. A lower bound of 114.1 GeV has been obtained at the 95% confidence level for the mass of the Higgs boson. The likelihood analysis shows a preference for a Higgs boson with a mass of 115.6 GeV. At this mass, the probability for the background to generate the observed effect is 3.5%.

  2. Lattice Gauge Theories Within and Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Gelzer, Zechariah John [Iowa U.

    2017-01-01

    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving $B$~mesons. Consequently, $B$-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of $B$~mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of $B$~mesons that are mediated by both charged currents ($B \\to \\pi \\ell \

  3. Applying OGC Standards to Develop a Land Surveying Measurement Model

    Directory of Open Access Journals (Sweden)

    Ioannis Sofos

    2017-02-01

    Full Text Available The Open Geospatial Consortium (OGC is committed to developing quality open standards for the global geospatial community, thus enhancing the interoperability of geographic information. In the domain of sensor networks, the Sensor Web Enablement (SWE initiative has been developed to define the necessary context by introducing modeling standards, like ‘Observation & Measurement’ (O&M and services to provide interaction like ‘Sensor Observation Service’ (SOS. Land surveying measurements on the other hand comprise a domain where observation information structures and services have not been aligned to the OGC observation model. In this paper, an OGC-compatible, aligned to the ‘Observation and Measurements’ standard, model for land surveying observations has been developed and discussed. Furthermore, a case study instantiates the above model, and an SOS implementation has been developed based on the 52° North SOS platform. Finally, a visualization schema is used to produce ‘Web Map Service (WMS’ observation maps. Even though there are elements that differentiate this work from classic ‘O&M’ modeling cases, the proposed model and flows are developed in order to provide the benefits of standardizing land surveying measurement data (cost reducing by reusability, higher precision level, data fusion of multiple sources, raw observation spatiotemporal repository access, development of Measurement-Based GIS (MBGIS to the geoinformation community.

  4. Evaluation of habitat suitability index models for assessing biotic resources

    Science.gov (United States)

    John C. Rennie; Joseph D. Clark; James M. Sweeney

    2000-01-01

    Existing habitat suitability index (HSI) models are evaluated for assessing the biotic resources on Champion International Corporation (CIC) lands with data from a standard and an expanded timber inventory. Forty HSI models for 34 species that occur in the Southern Appalachians have been identified from the literature. All of the variables for 14 models are provided (...

  5. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which comprised 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients.

  6. Constraining new physics with collider measurements of Standard Model signatures

    Energy Technology Data Exchange (ETDEWEB)

    Butterworth, Jonathan M. [Department of Physics and Astronomy, University College London,Gower St., London, WC1E 6BT (United Kingdom); Grellscheid, David [IPPP, Department of Physics, Durham University,Durham, DH1 3LE (United Kingdom); Krämer, Michael; Sarrazin, Björn [Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University,Sommerfeldstr. 16, 52056 Aachen (Germany); Yallup, David [Department of Physics and Astronomy, University College London,Gower St., London, WC1E 6BT (United Kingdom)

    2017-03-14

    A new method providing general consistency constraints for Beyond-the-Standard-Model (BSM) theories, using measurements at particle colliders, is presented. The method, ‘Constraints On New Theories Using Rivet’, CONTUR, exploits the fact that particle-level differential measurements made in fiducial regions of phase-space have a high degree of model-independence. These measurements can therefore be compared to BSM physics implemented in Monte Carlo generators in a very generic way, allowing a wider array of final states to be considered than is typically the case. The CONTUR approach should be seen as complementary to the discovery potential of direct searches, being designed to eliminate inconsistent BSM proposals in a context where many (but perhaps not all) measurements are consistent with the Standard Model. We demonstrate, using a competitive simplified dark matter model, the power of this approach. The CONTUR method is highly scaleable to other models and future measurements.

  7. Tying Together the Common Core of Standards, Instruction, and Assessments

    Science.gov (United States)

    Phillips, Vicki; Wong, Carina

    2010-01-01

    Clear, high standards will enable us to develop an education system that ensures that high school graduates are ready for college. The Bill & Melinda Gates Foundation has been working with other organizations to develop a Common Core of Standards. The partners working with the foundation are developing tools that will show teachers what is…

  8. Assessment of non-standard HIV antiretroviral therapy regimens at ...

    African Journals Online (AJOL)

    2016-03-06

    Aim: Lighthouse Trust in Lilongwe, Malawi serves approximately 25,000 patients with HIV antiretroviral therapy (ART) regimens standardized according to national treatment guidelines. However, as a referral centre for complex cases, Lighthouse Trust occasionally treats patients with non-standard ART.

  9. Social Moderation, Assessment and Assuring Standards for Accounting Graduates

    Science.gov (United States)

    Watty, Kim; Freeman, Mark; Howieson, Bryan; Hancock, Phil; O'Connell, Brendan; de Lange, Paul; Abraham, Anne

    2014-01-01

    Evidencing student achievement of standards is a growing imperative worldwide. Key stakeholders (including current and prospective students, government, regulators and employers) want confidence that threshold learning standards in an accounting degree have been assured. Australia's new higher education regulatory environment requires that student…

  10. SEDIMENT TOXICITY ASSESSMENT: COMPARISON OF STANDARD AND NEW TESTING DESIGNS

    Science.gov (United States)

    Standard methods of sediment toxicity testing are fairly well accepted; however, as with all else, evolution of these methods is inevitable. We compared a standard ASTM 10-day amphipod toxicity testing method with smaller, 48- and 96-h test methods using very toxic and reference ...

  11. Primordial alchemy: a test of the standard model

    International Nuclear Information System (INIS)

    Steigman, G.

    1987-01-01

    Big Bang Nucleosynthesis provides the only probe of the early evolution of the Universe constrained by observational data. The standard, hot, big bang model predicts the synthesis of the light elements (D, ³He, ⁴He, ⁷Li) in astrophysically interesting abundances during the first few minutes in the evolution of the Universe. A quantitative comparison of the predicted abundances with those observed astronomically confirms the consistency of the standard model and yields valuable constraints on the parameters of cosmology and elementary particle physics. The current status of the comparison between theory and observation will be reviewed and the opportunities for future advances outlined.

  12. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  13. Standardizing Benchmark Dose Calculations to Improve Science-Based Decisions in Human Health Assessments

    Science.gov (United States)

    Wignall, Jessica A.; Shapiro, Andrew J.; Wright, Fred A.; Woodruff, Tracey J.; Chiu, Weihsueh A.; Guyton, Kathryn Z.

    2014-01-01

    Background: Benchmark dose (BMD) modeling computes the dose associated with a prespecified response level. While offering advantages over traditional points of departure (PODs), such as no-observed-adverse-effect-levels (NOAELs), BMD methods have lacked consistency and transparency in application, interpretation, and reporting in human health assessments of chemicals. Objectives: We aimed to apply a standardized process for conducting BMD modeling to reduce inconsistencies in model fitting and selection. Methods: We evaluated 880 dose–response data sets for 352 environmental chemicals with existing human health assessments. We calculated benchmark doses and their lower limits [10% extra risk, or change in the mean equal to 1 SD (BMD/L10/1SD)] for each chemical in a standardized way with prespecified criteria for model fit acceptance. We identified study design features associated with acceptable model fits. Results: We derived values for 255 (72%) of the chemicals. Batch-calculated BMD/L10/1SD values were significantly and highly correlated (R2 of 0.95 and 0.83, respectively, n = 42) with PODs previously used in human health assessments, with values similar to reported NOAELs. Specifically, the median ratio of BMDs10/1SD:NOAELs was 1.96, and the median ratio of BMDLs10/1SD:NOAELs was 0.89. We also observed a significant trend of increasing model viability with increasing number of dose groups. Conclusions: BMD/L10/1SD values can be calculated in a standardized way for use in health assessments on a large number of chemicals and critical effects. This facilitates the exploration of health effects across multiple studies of a given chemical or, when chemicals need to be compared, providing greater transparency and efficiency than current approaches. Citation: Wignall JA, Shapiro AJ, Wright FA, Woodruff TJ, Chiu WA, Guyton KZ, Rusyn I. 2014. Standardizing benchmark dose calculations to improve science-based decisions in human health assessments. Environ Health
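
    As a rough illustration of the benchmark dose concept in this abstract (the dose associated with a prespecified response level), the sketch below assumes a hypothetical logistic dose-response fit and solves for the dose giving 10% extra risk by bisection. The model form and all parameter values are invented for illustration and are not taken from the assessments described above.

```python
import math

def response(dose, a=-2.0, b=0.5):
    """Hypothetical fitted dose-response curve P(d) = 1/(1+exp(-(a+b*d)));
    the parameters are illustrative, not from any real assessment."""
    return 1.0 / (1.0 + math.exp(-(a + b * dose)))

def extra_risk(dose):
    # Extra risk relative to background: (P(d) - P(0)) / (1 - P(0)).
    p0 = response(0.0)
    return (response(dose) - p0) / (1.0 - p0)

def benchmark_dose(bmr=0.10, lo=0.0, hi=100.0, tol=1e-9):
    """Bisection for the dose whose extra risk equals the benchmark
    response (BMR); bmr=0.10 corresponds to the BMD10 of the abstract."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if extra_risk(mid) < bmr:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

bmd10 = benchmark_dose()
```

    A BMDL (the lower confidence limit on the BMD) would additionally require the uncertainty of the fitted parameters, which this sketch omits.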

  14. The Wada Test: contributions to standardization of the stimulus for language and memory assessment

    Directory of Open Access Journals (Sweden)

    Mäder Maria Joana

    2004-01-01

    Full Text Available The Wada Test (WT) is part of the presurgical evaluation for refractory epilepsy. The WT is not standardized, and the protocols differ in important ways, including the type of stimulus material presented for memory testing, the timing of presentations and the methods of assessment. The aim of this study was to contribute to establishing parameters for a WT for the Brazilian population by investigating the performance of 100 normal subjects, without medication. Two parallel models were used, based on the Montreal Procedure adapted from Gail Risse's (MEG-MN, USA) protocol. The proportions of correct responses of normal subjects submitted to the two parallel WT models were investigated and the two models were compared. The results showed that the two models are similar, but significant differences among stimulus types were observed. The results suggest that the stimulus type may influence the results of the WT and should be considered when constructing models and comparing different protocols.

  15. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  16. Standard fire behavior fuel models: a comprehensive set for use with Rothermel's surface fire spread model

    Science.gov (United States)

    Joe H. Scott; Robert E. Burgan

    2005-01-01

    This report describes a new set of standard fire behavior fuel models for use with Rothermel's surface fire spread model and the relationship of the new set to the original set of 13 fire behavior fuel models. To assist with transition to using the new fuel models, a fuel model selection guide, fuel model crosswalk, and set of fuel model photos are provided.

  17. A CDO option market model on standardized CDS index tranches

    DEFF Research Database (Denmark)

    Dorn, Jochen

    We provide a market model which implies a dynamic for standardized CDS index tranche spreads. This model is useful for pricing options on tranches with future issue dates as well as for modeling emerging options on structured credit derivatives. With the upcoming regulation of the CDS market in perspective, the model presented here is also an attempt to face the effects on pricing approaches provoked by an eventual Clearing Chamber. It also becomes possible to calibrate Index Tranche Options with bespoke tenors/tranche subordination to market data obtained from more liquid Index Tranche Options.

  18. Search for the standard model Higgs boson in $l\

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dikai [Pierre and Marie Curie Univ., Paris (France)

    2013-01-01

    Humans have always attempted to understand the mystery of Nature, and more recently physicists have established theories to describe the observed phenomena. The most recent theory is a gauge quantum field theory framework, called the Standard Model (SM), which proposes a model comprised of elementary matter particles and interaction particles which are fundamental force carriers in the most unified way. The Standard Model contains the internal symmetries of the unitary product group SU(3)_C × SU(2)_L × U(1)_Y and describes the electromagnetic, weak and strong interactions; the model also describes how quarks interact with each other through all of these three interactions, how leptons interact with each other through electromagnetic and weak forces, and how force carriers mediate the fundamental interactions.

  19. Numerical Models of Sewage Dispersion and Statistical Bathing Water Standards

    DEFF Research Database (Denmark)

    Petersen, Ole; Larsen, Torben

    1991-01-01

    As bathing water standards are usually founded on statistical methods, the numerical models used in outfall design should reflect this. A statistical approach is presented in which stochastic variations in source strength and bacterial disappearance are incorporated into a numerical dilution model. It is demonstrated for a specific outfall how the method can be used to estimate the bathing water quality. The ambition of the paper is to demonstrate how stochastic variations can be included in the analysis of water quality in a simple manner.
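The statistical approach this abstract describes can be sketched as a small Monte Carlo simulation: treat source strength and bacterial die-off (T90) as random variables, push them through a fixed dilution, and read compliance off the resulting concentration distribution. All parameter values below are illustrative assumptions, not figures from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_concentration(n=10000, dilution=500.0):
    """Monte Carlo sketch of bacterial concentration at a bathing site.

    Source strength (counts per 100 ml at the outfall) is log-normally
    distributed; bacterial disappearance is a first-order die-off with a
    variable T90 (time for 90% reduction). All parameters are illustrative.
    """
    source = rng.lognormal(mean=np.log(1e6), sigma=0.5, size=n)  # counts/100 ml
    t90 = rng.uniform(4.0, 20.0, size=n)                         # hours
    travel_time = rng.uniform(1.0, 6.0, size=n)                  # hours
    decay = 10.0 ** (-travel_time / t90)                         # die-off factor
    return source * decay / dilution

conc = simulate_concentration()
# A statistical bathing water standard might require, e.g., that the
# 95th percentile of concentrations stays below a limit value.
p95 = np.percentile(conc, 95)
exceedance = (conc > 2000.0).mean()  # fraction of samples above a limit
print(f"95th percentile: {p95:.0f} per 100 ml, exceedance fraction: {exceedance:.3f}")
```

The stochastic inputs (source strength, T90, travel time) can be swapped for site-specific distributions without changing the structure of the calculation.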

  20. Supersymmetric standard model from the heterotic string (II)

    Energy Technology Data Exchange (ETDEWEB)

    Buchmueller, W. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Hamaguchi, K. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)]|[Tokyo Univ. (Japan). Dept. of Physics; Lebedev, O.; Ratz, M. [Bonn Univ. (Germany). Physikalisches Inst.

    2006-06-15

    We describe in detail a Z{sub 6} orbifold compactification of the heterotic E{sub 8} x E{sub 8} string which leads to the (supersymmetric) standard model gauge group and matter content. The quarks and leptons appear as three 16-plets of SO(10), two of which are localized at fixed points with local SO(10) symmetry. The model has supersymmetric vacua without exotics at low energies and is consistent with gauge coupling unification. Supersymmetry can be broken via gaugino condensation in the hidden sector. The model has large vacuum degeneracy. Certain vacua with approximate B-L symmetry have attractive phenomenological features. The top quark Yukawa coupling arises from gauge interactions and is of the order of the gauge couplings. The other Yukawa couplings are suppressed by powers of standard model singlet fields, similarly to the Froggatt-Nielsen mechanism. (Orig.)

  1. Non-perturbative effective interactions in the standard model

    CERN Document Server

    Arbuzov, Boris A

    2014-01-01

    This monograph is devoted to nonperturbative dynamics in the Standard Model (SM), the basic theory of all fundamental interactions in nature except gravity. The Standard Model is divided into two parts: quantum chromodynamics (QCD) and the electroweak theory (EWT), both well-defined renormalizable theories in which perturbation theory is valid. However, for an adequate description of the real physics, nonperturbative effects are inevitable. This book describes how these nonperturbative effects may be obtained in the framework of spontaneous generation of effective interactions. A well-known example of such an effective interaction is provided by the famous Nambu--Jona-Lasinio effective interaction. A spontaneous generation of this interaction in the framework of QCD is also described and applied as a method for other effective interactions in QCD and EWT. The method is based on N.N. Bogoliubov's conception of compensation equations. As a result we then describe the principal features of the Standard...

  2. Astrophysical neutrinos flavored with beyond the Standard Model physics

    International Nuclear Information System (INIS)

    Rasmussen, Rasmus W.; Ackermann, Markus; Winter, Walter; Lechner, Lukas; Kowalski, Marek; Humboldt-Universitaet, Berlin

    2017-07-01

    We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond-the-Standard-Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation, such as significant sterile neutrino production at the source, effective operators modifying the neutrino propagation at high energies, dark matter interactions in neutrino propagation, or non-standard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will make it possible to efficiently test and discriminate between models. More detailed information can be obtained from additional observables such as the energy dependence of the effect, the fraction of electron antineutrinos at the Glashow resonance, or the number of tau neutrino events.

  3. ATLAS Standard Model Measurements Using Jet Grooming and Substructure

    CERN Document Server

    Ucchielli, Giulia; The ATLAS collaboration

    2017-01-01

    Boosted topologies allow Standard Model processes to be explored in kinematical regimes never tested before. In such challenging LHC environments, standard reconstruction techniques quickly hit a wall. Targeting hadronic final states means properly reconstructing the energy and multiplicity of the jets in the event. In order to identify the decay products of boosted objects, i.e. W bosons, $t\bar{t}$ pairs or Higgs bosons produced in association with $t\bar{t}$ pairs, the ATLAS experiment is currently exploiting several algorithms using jet grooming and jet substructure. This contribution will mainly cover the following ATLAS measurements: the $t\bar{t}$ differential production cross section and the jet mass using the soft-drop procedure. Standard Model measurements offer the perfect field to test the performance of new jet tagging techniques, which will become even more important in the search for new physics in highly boosted topologies.

  4. Environmental assessment for the Consumer Products Efficiency Standards program

    Energy Technology Data Exchange (ETDEWEB)

    1980-05-23

    The Energy Policy and Conservation Act of 1975, as amended by the National Energy Conservation Policy Act of 1978, requires the DOE to prescribe energy efficiency standards for thirteen consumer products. The Consumer Products Efficiency Standards (CPES) program covers the following products: refrigerators and refrigerator-freezers; freezers; clothes dryers; water heaters; room air conditioners; home heating equipment (not including furnaces); kitchen ranges and ovens; central air conditioners (cooling and heat pumps); furnaces; dishwashers; television sets; clothes washers; and humidifiers and dehumidifiers. DOE is proposing two sets of standards for all thirteen consumer products: intermediate standards to become effective in 1981 for the first nine products and in 1982 for the remaining four products, and final standards to become effective in 1986 and 1987, respectively. The final standards are more restrictive than the intermediate standards and will provide manufacturers with the maximum time permitted under the Act to plan and develop extensive new lines of efficient consumer products. The final standards proposed by DOE require the maximum improvements in efficiency which are technologically feasible and economically justified, as required by Section 325(c) of EPCA. The thirteen consumer products account for approximately 90% of all the energy consumed in the nation's residences, or more than 20% of the nation's energy needs. Increases in the energy efficiency of these consumer products can help to narrow the gap between the nation's increasing demand for energy and decreasing supplies of domestic oil and natural gas. Improvements in the efficiency of consumer products can thus help to solve the nation's energy crisis.

  5. Assessment of the Impacts of Standards and Labeling Programs inMexico (four products).

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Itha; Pulido, Henry; McNeil, Michael A.; Turiel, Isaac; della Cava, Mirka

    2007-06-12

    This study analyzes impacts from energy efficiency standards and labeling in Mexico from 1994 through 2005 for four major products: household refrigerators, room air conditioners, three-phase (squirrel cage) induction motors, and clothes washers. It is a retrospective analysis, seeking to assess verified impacts on product efficiency in the Mexican market in the first ten years after standards were implemented. Such an analysis allows the Mexican government to compare actual to originally forecast program benefits. In addition, it provides an extremely valuable benchmark for other countries considering standards, and to the energy policy community as a whole. The methodology for evaluation begins with historical test data taken for a large number of models of each product type between 1994 and 2005. The pre-standard efficiency of models in 1994 is taken as a baseline throughout the analysis. Model efficiency data were provided by an independent certification laboratory (ANCE), which tested products as part of the certification and enforcement mechanism defined by the standards program. Using this data, together with economic and market data provided by both government and private sector sources, the analysis considers several types of national level program impacts. These include: Energy savings; Environmental (emissions) impacts, and Net financial impacts to consumers, manufacturers and utilities. Energy savings impacts are calculated using the same methodology as the original projections, allowing a comparison. Other impacts are calculated using a robust and sophisticated methodology developed by the Instituto de Investigaciones Electricas (IIE) and Lawrence Berkeley National Laboratory (LBNL), in a collaboration supported by the Collaborative Labeling and Standards Program (CLASP).
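The energy-savings arithmetic behind such a retrospective evaluation is straightforward: for each year, multiply shipments by the gap between the pre-standard baseline unit energy consumption (UEC) and the measured post-standard UEC. The figures below are invented placeholders for illustration, not data from the Mexican program.

```python
# Annual energy savings from an efficiency standard for one product class.
# All figures are illustrative assumptions, not data from the study.
baseline_uec = 650.0                    # kWh/yr per unit, pre-standard baseline
post_uec = {1995: 520.0, 1996: 480.0}   # kWh/yr per unit after standards took effect
shipments = {1995: 1.2e6, 1996: 1.3e6}  # units sold per year

# First-year savings of each shipment cohort, summed over cohorts, in GWh.
savings_gwh = sum(
    shipments[year] * (baseline_uec - post_uec[year]) for year in shipments
) / 1e6
print(f"first-year savings: {savings_gwh:.0f} GWh")  # → 377 GWh
```

A full accounting would track each cohort over the product lifetime and layer on emission factors and consumer cost data, but the core of the calculation is this baseline-versus-actual comparison.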

  6. Search for Higgs boson in beyond standard model scenarios at ...

    Indian Academy of Sciences (India)

    The principal physics motivation of the LHC experiments is to search for the Higgs boson and to probe the physics of TeV energy scale. Potential of discovery for Higgs bosons in various scenarios beyond standard model have been estimated for both CMS and ATLAS experiments through detailed detector simulations.

  7. Searches for phenomena beyond the Standard Model at the Large ...

    Indian Academy of Sciences (India)

    Keywords. LHC; ATLAS; CMS; BSM; supersymmetry; exotic. Abstract. The LHC has delivered several fb-1 of data in spring and summer 2011, opening new windows of opportunity for discovering phenomena beyond the Standard Model. A summary of the searches conducted by the ATLAS and CMS experiments based on about 1 fb-1 of data is presented.

  8. 15th International Workshop "What Comes Beyond the Standard Models"

    CERN Document Server

    Nielsen, Holger Bech; Lukman, Dragan

    2013-01-01

    The contribution contains the preface to the Proceedings of the 15th Workshop What Comes Beyond the Standard Models, Bled, July 9-19, 2012, published in Bled Workshops in Physics, Vol. 13, No. 2, DMFA-Zaloznistvo, Ljubljana, Dec. 2012, and links to the published contributions.

  9. 14th Workshop on What Comes Beyond the Standard Models

    CERN Document Server

    Nielsen, Holger Bech; Lukman, Dragan; 14th Bled Workshop 2011

    2013-01-01

    The contribution contains the preface to the Proceedings of the 14th Workshop What Comes Beyond the Standard Models, Bled, July 11-21, 2011, published in Bled Workshops in Physics, Vol. 12, No. 2, DMFA-Zaloznistvo, Ljubljana, Dec. 2011, and links to the published contributions.

  10. Charged and neutral minimal supersymmetric standard model Higgs ...

    Indian Academy of Sciences (India)

    Charged and neutral minimal supersymmetric standard model Higgs boson decays and measurement of tan β at the compact linear collider. E. Coniavitis and A. Ferrari, Department of Nuclear and Particle Physics, Uppsala University, 75121 Uppsala, Sweden. E-mail: ferrari@tsl.uu.se. pp. 759-763.

  11. Land administration domain model is an ISO standard now

    NARCIS (Netherlands)

    Lemmen, C.H.J.; Van Oosterom, P.J.M.; Uitermark, H.T.; De Zeeuw, K.

    2013-01-01

    A group of land administration professionals initiated the development of a data model that facilitates the quick and efficient set-up of land registrations. Just like social issues benefit from proper land administration, land administration systems themselves benefit from proper data standards. In

  12. The Dawn of physics beyond the standard model

    CERN Multimedia

    Kane, Gordon

    2003-01-01

    "The Standard Model of particle physics is at a pivotal moment in its history: it is both at the height of its success and on the verge of being surpassed [...] A new era in particle physics could soon be heralded by the detection of supersymmetric particles at the Tevatron collider at Fermi National Accelerator Laboratory in Batavia, Ill." (8 pages)

  13. Real gauge singlet scalar extension of the Standard Model: A ...

    Indian Academy of Sciences (India)

    2013-03-05

    Abstract. The simplest extension of the Standard Model (SM) is considered, in which a real SM gauge singlet scalar S with an additional discrete symmetry Z2 is introduced. This additional scalar can be a viable candidate for cold dark matter (CDM), since the stability of S is achieved by the application of the Z2 ...

  14. The hierarchy problem and Physics Beyond the Standard Model

    Indian Academy of Sciences (India)

    Fine-tuning has to be done order by order in perturbation theory. The hierarchy problem: what guarantees the stability of v against quantum fluctuations? ⇒ Physics Beyond the Standard Model. Experimental side: dark matter, neutrino mass, matter-antimatter asymmetry, ... Gautam Bhattacharyya, IASc Annual Meeting, IISER, ...

  15. B decays in the standard model and beyond

    International Nuclear Information System (INIS)

    London, D.

    1993-01-01

    This paper is a brief review of a set of B decays in and beyond the standard model. The author discusses only right-handed B decays, certain rare B decays, B_c decays, B_s^0-B̄_s^0 mixing, and T violation.

  16. 2006: Particle Physics in the Standard Model and beyond

    Indian Academy of Sciences (India)

    October 2006, pp. 561-577. Guido Altarelli, Department of Physics, Theory Division, ... ... that the gauge symmetry is unbroken in the vertices of the theory: all currents and charges ... Here, when talking of divergences, we are not worried about ...

  17. Standard Model Higgs boson searches with the ATLAS detector at ...

    Indian Academy of Sciences (India)

    The investigation of the mechanism responsible for electroweak symmetry breaking is one of the most important tasks of the scientific program of the Large Hadron Collider. The experimental results on the search for the Standard Model Higgs boson with 1 to 2 fb-1 of proton-proton collision data at √s = 7 TeV recorded by the ...

  18. Challenging the Standard Model with the muon g− 2

    Indian Academy of Sciences (India)

    Abstract. The discrepancy between experiment and the Standard Model prediction of the muon g−2 is reviewed. The possibility of bridging it by hypothetical increases in the hadronic cross-section used to determine the leading hadronic contribution to the latter is analysed.

  19. Searches for phenomena beyond the Standard Model at the Large

    Indian Academy of Sciences (India)

    The LHC has delivered several fb-1 of data in spring and summer 2011, opening new windows of opportunity for discovering phenomena beyond the Standard Model. A summary of the searches conducted by the ATLAS and CMS experiments based on about 1 fb-1 of data is presented.

  20. Search for Higgs boson in beyond standard model scenarios

    Indian Academy of Sciences (India)

    The principal physics motivation of the LHC experiments is to search for the Higgs boson and to probe the physics of TeV energy scale. Potential of discovery for Higgs bosons in various scenarios beyond standard model have been estimated for both CMS and ATLAS experiments through detailed detector simulations.

  1. Standard Model Higgs boson searches with the ATLAS detector

    Indian Academy of Sciences (India)

    The investigation of the mechanism responsible for electroweak symmetry breaking is one of the most important tasks of the scientific program of the Large Hadron Collider. The experimental results on the search for the Standard Model Higgs boson with 1 to 2 fb-1 of proton-proton collision data at √s = 7 TeV recorded by the ...

  2. Mathematical Modeling, Sense Making, and the Common Core State Standards

    Science.gov (United States)

    Schoenfeld, Alan H.

    2013-01-01

    On October 14, 2013 the Mathematics Education Department at Teachers College hosted a full-day conference focused on the Common Core Standards Mathematical Modeling requirements to be implemented in September 2014 and in honor of Professor Henry Pollak's 25 years of service to the school. This article is adapted from my talk at this conference…

  3. Model food standards regulation. S3. Irradiation of food

    International Nuclear Information System (INIS)

    1987-01-01

    This revised Model Food Standards Regulation S3 for the irradiation of food replaces the regulation adopted in June 1982. It specifies the types of ionizing radiations which may be used, lists the foods which may be processed and describes the requirements for an approved facility. It lists the records which are required to be kept and requirements for labelling of irradiated food

  4. Challenging the Standard Model with the muon g − 2

    Indian Academy of Sciences (India)

    The discrepancy between experiment and the Standard Model prediction of the muon g−2 is reviewed. The possibility of bridging it by hypothetical increases in the hadronic cross-section used to determine the leading hadronic contribution to the latter is analysed. Keywords. Muon anomalous magnetic moment; Standard Model Higgs boson. PACS Nos 13.40.Em; 14.60.Ef; 12.15.Lk; 14.80.Bn.

  5. Standardized binomial models for risk or prevalence ratios and differences.

    Science.gov (United States)

    Richardson, David B; Kinlaw, Alan C; MacLehose, Richard F; Cole, Stephen R

    2015-10-01

    Epidemiologists often analyse binary outcomes in cohort and cross-sectional studies using multivariable logistic regression models, yielding estimates of adjusted odds ratios. It is widely known that the odds ratio closely approximates the risk or prevalence ratio when the outcome is rare, and it does not do so when the outcome is common. Consequently, investigators may decide to directly estimate the risk or prevalence ratio using a log binomial regression model. We describe the use of a marginal structural binomial regression model to estimate standardized risk or prevalence ratios and differences. We illustrate the proposed approach using data from a cohort study of coronary heart disease status in Evans County, Georgia, USA. The approach reduces problems with model convergence typical of log binomial regression by shifting all explanatory variables except the exposures of primary interest from the linear predictor of the outcome regression model to a model for the standardization weights. The approach also facilitates evaluation of departures from additivity in the joint effects of two exposures. Epidemiologists should consider reporting standardized risk or prevalence ratios and differences in cohort and cross-sectional studies. These are readily obtained using the SAS, Stata and R statistical software packages. The proposed approach estimates the exposure effect in the total population. © The Author 2015; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
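The weighting idea in this abstract can be sketched with plain NumPy: fit a logistic model for the exposure given the confounders, turn its fitted probabilities into stabilized standardization weights, and compare weighted risks between exposure groups. This is an illustrative sketch on simulated data, not the authors' SAS/Stata/R implementation; all data-generating parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                               # confounder
x = rng.binomial(1, 1 / (1 + np.exp(-z)))            # exposure depends on z
p_y = 1 / (1 + np.exp(-(-2.0 + 0.7 * x + 0.5 * z)))  # outcome risk
y = rng.binomial(1, p_y)

def fit_logistic(X, t, iters=25):
    """Tiny Newton-Raphson logistic regression (design matrix X, binary t)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        W = p * (1 - p)
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (t - p))
    return beta

# Model for the standardization weights: exposure given confounders.
X = np.column_stack([np.ones(n), z])
beta = fit_logistic(X, x)
ps = 1 / (1 + np.exp(-X @ beta))                     # P(x = 1 | z)
w = np.where(x == 1, x.mean() / ps, (1 - x.mean()) / (1 - ps))  # stabilized weights

# Standardized risks in the total population, then ratio and difference.
r1 = np.average(y[x == 1], weights=w[x == 1])
r0 = np.average(y[x == 0], weights=w[x == 0])
risk_ratio, risk_difference = r1 / r0, r1 - r0
print(f"standardized RR = {risk_ratio:.2f}, RD = {risk_difference:.3f}")
```

The point the abstract makes is visible in the structure: the confounder z appears only in the weight model, so the outcome comparison itself involves just the exposure, sidestepping the convergence problems of a multivariable log-binomial fit.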

  6. Reconsidering the risk assessment concept: Standardizing the impact description as a building block for vulnerability assessment

    Directory of Open Access Journals (Sweden)

    K. Hollenstein

    2005-01-01

    Full Text Available Risk assessments for natural hazards are becoming more widely used and accepted. Using an extended definition of risk, it becomes obvious that effective procedures for vulnerability assessments are vital for the success of the risk concept. However, there are large gaps in knowledge about vulnerability. To alleviate the situation, a conceptual extension of the scope of existing and new models is suggested. The basis of the suggested concept is a standardization of the output of hazard assessments. This is achieved by defining states of the target objects that depend on the impact and at the same time affect the object's performance characteristics. The possible state variables can be related to a limited set of impact descriptors termed the generic impact description interface. The concept suggests that both hazard and vulnerability assessment models are developed according to the specification of this interface, thus facilitating modularized risk assessments. Potential problems related to the application of the concept include acceptance issues and the limited accuracy of transforming the outputs of existing models. Potential applications and simple examples for adapting existing models are briefly discussed.

  7. Characterization of a Standardized Ex-vivo Porcine Model to Assess Short Term Intraocular Pressure Changes and Trabecular Meshwork Vitality After Pars Plana Vitrectomy with Different Silicone Oil and BSS Tamponades.

    Science.gov (United States)

    Ebner, Martina; Mariacher, Siegfried; Hurst, José; Szurman, Peter; Schnichels, Sven; Spitzer, Martin S; Januschowski, Kai

    2017-08-01

    The aim of this study was to characterize a standardized porcine ex-vivo testing system for intraocular pressure (IOP) monitoring after vitrectomy with different endotamponades. Twenty-four pig eyes, six per endotamponade group, were obtained immediately postmortem. After pars plana vitrectomy, vitreous substitutes (silicone oil 1000 mPas, 2000 mPas, 5000 mPas, and Balanced Salt Solution (BSS)) were instilled and IOP was observed over 24 hours. Infusion pumps with Dulbecco's Modified Eagle Medium (DMEM) simulated a constant aqueous humor circulation. A histological examination of the trabecular meshwork with DAPI and TUNEL staining was performed to detect the amount of apoptotic cells. The TUNEL assay showed a mean cell death rate of 3.78% (SD ± 1.46%) for silicone oil endotamponades compared to 5.05% (SD ± 2.18%) in the BSS group. One-way ANOVA (p = 0.425) showed no significant difference between the groups. Mean IOP in silicone oil endotamponades was 9.50 mmHg (SD ± 1.68 mmHg) at baseline, 13.23 mmHg (SD ± 0.79 mmHg) after 1 hour, 18.46 mmHg (SD ± 2.13 mmHg) after 12 hours and 15.51 mmHg (SD ± 2.82 mmHg) 24 hours after instillation. A comparison of all silicone oil groups (one-way ANOVA, Bonferroni post-hoc test, p = 0.269 to 1.000) did not reveal significant differences in mean IOP. The standardized ex-vivo porcine model represents an effective alternative to in-vivo testing in animals. Maintaining the trabecular and uveoscleral outflow pathways enables a pseudo in-vivo analysis.

  8. Variation in Students' Conceptions of Self-Assessment and Standards

    Directory of Open Access Journals (Sweden)

    Heng Kiat Kelvin Tan

    2011-01-01

    Full Text Available This paper reports the results of a phenomenographic study on the different ways that secondary students understood and utilized student self-assessment and how various ego types could affect the accuracy of self-assessment. The study sought to contribute to the growing literature which recognizes the critical role that students play in assessment processes, and in particular the different roles that they assume in student self-assessment. The results of the study provide insights into how different students experience self-assessment by articulating the variation in the perception and purposes of assessing one's own learning. This variation is depicted as a hierarchy of logically related students' conceptions of self-assessment.

  9. Using the Many-Facet Rasch Model to Evaluate Standard-Setting Judgments: Setting Performance Standards for Advanced Placement® Examinations

    Science.gov (United States)

    Kaliski, Pamela; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna; Plake, Barbara; Reshetar, Rosemary

    2012-01-01

    The Many-Facet Rasch (MFR) Model is traditionally used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR Model by examining the quality of ratings obtained from a…

  10. Assessment of galactic cosmic ray models

    Science.gov (United States)

    Mrigakshi, Alankrita Isha; Matthiä, Daniel; Berger, Thomas; Reitz, Günther; Wimmer-Schweingruber, Robert F.

    2012-08-01

    Among the several factors involved in developing a manned space mission concept, the astronauts' health is a major concern that needs to be considered carefully. Galactic cosmic rays (GCRs), which mainly consist of high-energy nuclei ranging from hydrogen to iron and beyond, pose a major radiation health risk in long-term space missions. It is therefore necessary to assess the radiation exposure of astronauts in order to estimate their radiation risks. This can be done either by performing direct measurements or by computer-based simulations from which the dose can be derived. A necessary prerequisite for an accurate estimation of the exposure using simulations is a reliable description of the GCR spectra. The aim of this work is to compare GCR models and to test their applicability for the exposure assessment of astronauts. To achieve this, commonly used models capable of describing both light and heavy GCR particle spectra were evaluated by investigating the model spectra for various particles over several decades. The updated Badhwar-O'Neill model published in 2010, CREME2009 (which uses the International Standard model for GCR), CREME96 and the Burger-Usoskin model were examined. Hydrogen, helium, oxygen and iron nuclei spectra calculated by the different models are compared with measurements from various high-altitude balloon and space-borne experiments. During certain epochs in the last decade, there are large discrepancies between the GCR energy spectra described by the models and the measurements. All the models exhibit weaknesses in describing the increased GCR flux that was observed in 2009-2010.

  11. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage over land use. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the first is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values up to a factor of 4.5 for tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than a half of the amount predicted by the standard SDC methods.
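Computationally, a stage-damage curve is just an interpolated depth-to-damage-fraction lookup per land-use class, scaled by a maximum damage value. The curves and prices below are invented for illustration; they are not the calibrated Italian curves from the study.

```python
import numpy as np

# Illustrative stage-damage curves: water depth (m) -> damage fraction,
# one curve per land-use class. Values are assumptions for the sketch.
DEPTHS = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
SDC = {
    "residential": np.array([0.0, 0.25, 0.40, 0.60, 0.75]),
    "industrial":  np.array([0.0, 0.15, 0.30, 0.55, 0.70]),
}
MAX_DAMAGE = {"residential": 600.0, "industrial": 450.0}  # EUR per m2 (assumed)

def flood_damage(land_use, depth_m, area_m2):
    """Potential direct damage (EUR) for one flooded cell of a given land use."""
    fraction = np.interp(depth_m, DEPTHS, SDC[land_use])  # linear interpolation
    return fraction * MAX_DAMAGE[land_use] * area_m2

d = flood_damage("residential", 0.75, 100.0)
print(f"damage: {d:.0f} EUR")  # 0.325 * 600 * 100 → 19500 EUR
```

Calibrating the fraction values and maximum damages against ex-post compensation records, as the study does, changes the numbers but not this structure; production losses and crop seasonality would be added as separate terms on top.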

  12. Innovation Process Planning Model in the Bpmn Standard

    Directory of Open Access Journals (Sweden)

    Jurczyk-Bunkowska Magdalena

    2013-12-01

    Full Text Available The aim of the article is to show the relations in the innovation process planning model. The relations described here provide a stable and reliable way to increase competitiveness through professionally directed development of the company. When initiating the process, the manager needs to specify the intended effect, which is then achieved through a system of indirect goals. The original model proposed here shows the standard dependences between the plans of the fragments of the innovation process which together lead to its final goal. The relations are expressed in the article using the Business Process Model and Notation (BPMN) standard. This enables the specification of interrelations between the decision levels at which subsequent fragments of the innovation process are planned, allowing better coordination of the process and reducing the time needed to achieve its effect. The model has been compiled on the basis of practices followed in Polish companies. It is not, however, a reflection of these practices, but rather an idealised standard of proceedings aimed at improving the effectiveness of innovation management at the operational level. The model could form the basis for systems supporting decision making, knowledge management or communication in innovation processes.

  13. Assessment of technologies to meet a low carbon fuel standard.

    Science.gov (United States)

    Yeh, Sonia; Lutsey, Nicholas P; Parker, Nathan C

    2009-09-15

    California's low carbon fuel standard (LCFS) was designed to incentivize a diverse array of available strategies for reducing transportation greenhouse gas (GHG) emissions. It provides strong incentives for fuels with lower GHG emissions, while explicitly requiring a 10% reduction in California's transportation fuel GHG intensity by 2020. This paper investigates the potential for cost-effective GHG reductions from electrification and expanded use of biofuels. The analysis indicates that fuel providers could meet the standard using a portfolio approach that employs both biofuels and electricity, which would reduce the risks and uncertainties associated with the progress of cellulosic and battery technologies, feedstock prices, land availability, and the sustainability of the various compliance approaches. Our analysis is based on the details of California's development of an LCFS; however, this research approach could be generalizable to a national U.S. standard and to similar programs in Europe and Canada.
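Compliance with an intensity standard like the LCFS reduces to a weighted-average calculation: the energy-share-weighted carbon intensity (CI) of the fuel portfolio must fall below the baseline minus the required percentage reduction. The shares and intensity values below are illustrative assumptions, not numbers from the paper.

```python
# Illustrative LCFS-style compliance check. All figures are assumed.
baseline_ci = 95.0              # gasoline baseline CI, gCO2e/MJ (assumed)
target_ci = baseline_ci * 0.90  # 10% reduction target

# Portfolio: (fuel, energy share, carbon intensity in gCO2e/MJ)
portfolio = [
    ("gasoline",    0.80, 95.0),
    ("cellulosic",  0.12, 30.0),
    ("electricity", 0.08, 40.0),
]

avg_ci = sum(share * ci for _, share, ci in portfolio)
compliant = avg_ci <= target_ci
print(f"portfolio CI = {avg_ci:.1f} gCO2e/MJ, "
      f"target = {target_ci:.1f}, compliant: {compliant}")
```

The portfolio logic the authors emphasize shows up directly here: a modest share of low-CI biofuel or electricity pulls the weighted average well below the target, so no single technology has to carry the full reduction.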

  14. Transformative Shifts in Art History Teaching: The Impact of Standards-Based Assessment

    Science.gov (United States)

    Ormond, Barbara

    2011-01-01

    This article examines pedagogical shifts in art history teaching that have developed as a response to the implementation of a standards-based assessment regime. The specific characteristics of art history standards-based assessment in the context of New Zealand secondary schools are explained to demonstrate how an exacting form of assessment has…

  15. Comparing Panelists' Understanding of Standard Setting across Multiple Levels of an Alternate Science Assessment

    Science.gov (United States)

    Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi

    2013-01-01

    Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…

  16. Overview of the Standard Model Measurements with the ATLAS Detector

    CERN Document Server

    Liu, Yanwen; The ATLAS collaboration

    2017-01-01

    The ATLAS Collaboration is engaged in precision measurements of fundamental Standard Model parameters, such as the W boson mass, the weak mixing angle and the strong coupling constant. In addition, the production cross-sections of a large variety of final states involving highly energetic jets, photons, and single and multiple vector bosons are measured multi-differentially at several centre-of-mass energies. This allows perturbative QCD calculations to be tested to the highest precision. These measurements also test models beyond the SM, e.g. those leading to anomalous gauge couplings. In this talk, we give a broad overview of the Standard Model measurement campaign of the ATLAS collaboration, with selected topics discussed in more detail.

  17. Neutron electric dipole moment and extension of the standard model

    International Nuclear Information System (INIS)

    Oshimo, Noriyuki

    2001-01-01

    A nonvanishing value for the electric dipole moment (EDM) of the neutron is a prominent signature for CP violation. The EDM induced by the Kobayashi-Maskawa mechanism of the standard model (SM) has a small magnitude and its detection will be very difficult. However, since the baryon asymmetry of the universe cannot be accounted for by the SM, there should exist some other source of CP violation, which may generate a large magnitude for the EDM. One of the most promising candidates for physics beyond the SM is the supersymmetric standard model, which contains such sources of CP violation. This model suggests that the EDM has a magnitude not much smaller than the present experimental bounds. Progress in measuring the EDM provides very interesting information about extensions of the SM. (author)

  18. Standardizing measurement, sampling and reporting for public exposure assessments

    Energy Technology Data Exchange (ETDEWEB)

    Rochedo, Elaine R.R. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/No. CEP 22780-160 Rio de Janeiro, RJ (Brazil)], E-mail: elaine@ird.gov.br

    2008-11-15

    UNSCEAR assesses worldwide public exposure from natural and man-made sources of ionizing radiation based on information submitted to UNSCEAR by United Nations Member States and from peer reviewed scientific literature. These assessments are used as a basis for radiation protection programs of international and national regulatory and research organizations. Although UNSCEAR describes its assessment methodologies, the data are based on various monitoring approaches. In order to reduce uncertainties and improve confidence in public exposure assessments, it would be necessary to harmonize the methodologies used for sampling, measuring and reporting of environmental results.

  19. Irrigation in dose assessments models

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, Ulla; Barkefors, Catarina [Studsvik RadWaste AB, Nykoeping (Sweden)

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. In many safety assessments the exposed people are assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given of how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. Some numerical results are presented showing that surface contamination in general gives the

  20. Irrigation in dose assessments models

    International Nuclear Information System (INIS)

    Bergstroem, Ulla; Barkefors, Catarina

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. In many safety assessments the exposed people are assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given of how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. Some numerical results are presented showing that surface contamination in general gives the

  1. Xpand chest drain: assessing equivalence to current standard ...

    African Journals Online (AJOL)

    This device incorporates a one-way valve with a fluid reservoir and permits the detection of an air leak, as well as intrapleural pressure differences. Aim. To demonstrate equivalence of the Xpand chest drain to standard underwater bottle drainage. Methods. In a non-blinded randomised control trial 67 patients with ...

  2. Assessing Learning Environment for Achieving Standard in Primary ...

    African Journals Online (AJOL)

    The paper is a descriptive survey which sought to identify how the provision of adequate learning environment would affect standard in primary education and subsequently empower both staff and students for capacity development through counselling. Fifteen (15) public primary schools out of 96 identified in Oru-East, ...

  3. Development and Application of Assessment Standards to Advanced Written Assignments

    Science.gov (United States)

    Miihkinen, Antti; Virtanen, Tuija

    2018-01-01

    This study describes the results of a project that focused on developing an assessment rubric to be used as the assessment criteria for the written thesis of accounting majors and the quality of the coursework during the seminar. We used descriptive analysis and the survey method to collect information for the development work and to examine the…

  4. Tests of the standard model and searches for new physics

    Energy Technology Data Exchange (ETDEWEB)

    Langacker, Paul [Pennsylvania Univ., PA (United States). Dept. of Physics

    1996-07-01

    Earlier chapters of this volume have described in detail the standard model and its renormalization, the various types of precision experiments, and their implications. This chapter is devoted to a global analysis of the Z-pole, M_W, and neutral current data, which contains more information than any one class of experiments. The subsequent sections summarize some of the relevant data and theoretical formulas, the status of the standard model tests and parameter determinations, the possible classes of new physics, and the implications of the precision experiments. In particular, the following are described: the model-independent analysis of neutral current couplings, which establishes the standard model to first approximation; the implications of supersymmetry; supersymmetric grand unification; and a number of specific types of new physics, including heavy Z′ bosons, new sources of SU(2) breaking, new contributions to the gauge boson self-energies, Zbb̄ vertex corrections, certain types of new four-fermion operators and leptoquarks, and exotic fermions.

  5. Standard guide for use of modeling for passive gamma measurements

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This guide addresses the use of models with passive gamma-ray measurement systems. Mathematical models based on physical principles can be used to assist in calibration of gamma-ray measurement systems and in analysis of measurement data. Some nondestructive assay (NDA) measurement programs involve the assay of a wide variety of item geometries and matrix combinations for which the development of physical standards is not practical. In these situations, modeling may provide a cost-effective means of meeting users' data quality objectives. 1.2 A scientific knowledge of radiation sources and detectors, calibration procedures, geometry and error analysis is needed for users of this standard. This guide assumes that the user has, at a minimum, a basic understanding of these principles and good NDA practices (see Guide C1592), as defined for an NDA professional in Guide C1490. The user of this standard must have at least a basic understanding of the software used for modeling. Instructions or further train...
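The kind of calibration modeling the guide describes can be sketched for the simplest case: a point source viewed through an attenuating matrix. Everything below (the function name, the far-field geometry, and the numerical values) is an illustrative assumption for a sketch, not material from the ASTM guide itself.

```python
import math

def expected_count_rate(activity_bq, branching, efficiency,
                        mu_cm2_per_g, areal_density_g_cm2,
                        distance_cm, detector_area_cm2):
    """Point-source estimate of a passive gamma count rate (counts/s).

    activity_bq         -- source activity (decays/s)
    branching           -- gamma emission probability per decay
    efficiency          -- intrinsic detector efficiency at this energy
    mu_cm2_per_g        -- mass attenuation coefficient of the matrix
    areal_density_g_cm2 -- matrix thickness times density
    distance_cm         -- source-to-detector distance
    detector_area_cm2   -- detector face area
    """
    emission_rate = activity_bq * branching
    # Fraction of photons surviving the matrix (narrow-beam attenuation).
    transmission = math.exp(-mu_cm2_per_g * areal_density_g_cm2)
    # Solid-angle fraction subtended by the detector (far-field approximation).
    geometry = detector_area_cm2 / (4.0 * math.pi * distance_cm ** 2)
    return emission_rate * transmission * geometry * efficiency

# Illustrative numbers only: ~1 MBq source, a 662 keV line, 10 g/cm2 of matrix.
rate = expected_count_rate(1.0e6, 0.85, 0.2, 0.0775, 10.0, 30.0, 20.0)
```

In practice such a forward model is run over the item's geometry and matrix combinations to generate calibration factors where no physical standard exists.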

  6. E-health stakeholders experiences with clinical modelling and standardizations.

    Science.gov (United States)

    Gøeg, Kirstine Rosenbeck; Elberg, Pia Britt; Højen, Anne Randorff

    2015-01-01

    Stakeholders in e-health such as governance officials, health IT implementers and vendors have to co-operate to achieve the goal of a future-proof interoperable e-health infrastructure. Co-operation requires knowledge of the responsibilities and competences of the stakeholder groups. To increase awareness of clinical modeling and standardization we conducted a workshop for Danish and a few Norwegian e-health stakeholders and had them discuss their views on different aspects of clinical modeling, using a theoretical model as a point of departure. Based on the model, we traced stakeholders' experiences. Our results showed that stakeholders tended to be more familiar with e-health requirements than with design methods, clinical information models and clinical terminology as they are described in the scientific literature. The workshop made it possible for stakeholders to discuss their roles and expectations of each other.

  7. 49 CFR 1572.5 - Standards for security threat assessments.

    Science.gov (United States)

    2010-10-01

    ... assessment includes biometric identification and a biometric credential. (2) To apply for a comparability... process and provide biometric information to obtain a TWIC, if the applicant seeks unescorted access to a...

  8. Assessing cultural validity in standardized tests in stem education

    Science.gov (United States)

    Gassant, Lunes

    This quantitative ex post facto study examined how race and gender, as elements of culture, influence the development of common misconceptions among STEM students. Primary data came from a standardized test: the Digital Logic Concept Inventory (DLCI) developed by Drs. Geoffrey L. Herman, Michael C. Louis, and Craig Zilles of the University of Illinois at Urbana-Champaign. The sample consisted of a cohort of 82 STEM students recruited from three universities in Northern Louisiana. Microsoft Excel and the Statistical Package for the Social Sciences (SPSS) were used for data computation. Two key concepts, several sub-concepts, and 19 misconceptions were tested through 11 items in the DLCI. Statistical analyses based on both Classical Test Theory (Spearman, 1904) and Item Response Theory (Lord, 1952) yielded similar results: some misconceptions in the DLCI can reliably be predicted by the race or gender of the test taker. The research is significant because it has shown that some misconceptions in a STEM discipline attracted students with similar ethnic backgrounds differently, thus pointing to the existence of some cultural bias in the standardized test. Therefore the study encourages further research into cultural validity in standardized tests. With culturally valid tests, it will be possible to increase the effectiveness of targeted teaching and learning strategies for STEM students from diverse ethnic backgrounds. To some extent, this dissertation has contributed to a better understanding of the gap between high enrollment rates and low graduation rates among African American students and also among other minority students in STEM disciplines.
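The two psychometric frameworks named in the abstract can be contrasted in a minimal sketch: Classical Test Theory summarizes an item by its difficulty (the proportion of correct answers), while a one-parameter Item Response Theory (Rasch) model expresses the probability of a correct answer as a function of a latent ability. The response matrix below is invented toy data, not DLCI results.

```python
import math

# Rows = test takers, columns = items; 1 = correct, 0 = incorrect (toy data).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]

n_people = len(responses)
n_items = len(responses[0])

# Classical Test Theory: item difficulty = proportion answering correctly.
difficulty = [sum(row[j] for row in responses) / n_people
              for j in range(n_items)]

def rasch_probability(ability, item_difficulty):
    """One-parameter IRT (Rasch) model: P(correct | latent ability)."""
    return 1.0 / (1.0 + math.exp(-(ability - item_difficulty)))
```

A full IRT analysis would estimate the item and person parameters jointly (e.g. by maximum likelihood); the point here is only that CTT statistics are sample proportions while IRT statistics are model parameters.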

  9. Astrophysical neutrinos flavored with beyond the Standard Model physics

    Science.gov (United States)

    Rasmussen, Rasmus W.; Lechner, Lukas; Ackermann, Markus; Kowalski, Marek; Winter, Walter

    2017-10-01

    We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond the Standard Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of the next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation such as significant sterile neutrino production at the source, effective operators modifying the neutrino propagation at high energies, dark matter interactions in neutrino propagation, or nonstandard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will allow us to efficiently test and discriminate between models. More detailed information can be obtained from additional observables such as the energy dependence of the effect, fraction of electron antineutrinos at the Glashow resonance, or number of tau neutrino events.
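For the standard-mixing baseline against which the paper tests deviations, the flavor composition at Earth follows from oscillation-averaged probabilities, P(α→β) = Σᵢ |U_αi|²|U_βi|². The sketch below uses tri-bimaximal mixing values purely for illustration (measured PMNS values differ from these at roughly the ten-percent level) to show the familiar (1 : 2 : 0) → ≈(1 : 1 : 1) conversion for a pion-decay source.

```python
# |U_{alpha i}|^2 in the tri-bimaximal approximation (illustrative only;
# measured PMNS mixing differs somewhat from these exact fractions).
U2 = {
    "e":   [2/3, 1/3, 0.0],
    "mu":  [1/6, 1/3, 1/2],
    "tau": [1/6, 1/3, 1/2],
}

def earth_fractions(source):
    """Flavor fractions at Earth after oscillation-averaged propagation.

    source -- dict of flavor fractions at the astrophysical source.
    Uses P(alpha -> beta) = sum_i |U_alpha_i|^2 |U_beta_i|^2.
    """
    return {
        beta: sum(
            source[alpha] * sum(U2[alpha][i] * U2[beta][i] for i in range(3))
            for alpha in U2
        )
        for beta in U2
    }

# Pion-decay source composition (nu_e : nu_mu : nu_tau) = (1 : 2 : 0).
at_earth = earth_fractions({"e": 1/3, "mu": 2/3, "tau": 0.0})
# Averaged mixing drives the composition toward ~(1 : 1 : 1) at Earth.
```

Deviations from this baseline, of the kind the paper catalogues, would show up as measurable departures from the near-equal flavor fractions.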

  10. Challenges to the standard model of Big Bang nucleosynthesis

    International Nuclear Information System (INIS)

    Steigman, G.

    1993-01-01

    Big Bang nucleosynthesis provides a unique probe of the early evolution of the Universe and a crucial test of the consistency of the standard hot Big Bang cosmological model. Although the primordial abundances of ²H, ³He, ⁴He, and ⁷Li inferred from current observational data are in agreement with those predicted by Big Bang nucleosynthesis, recent analysis has severely restricted the consistent range for the nucleon-to-photon ratio: 3.7 ≤ η₁₀ ≤ 4.0. Increased accuracy in the estimate of primordial ⁴He and observations of Be and B in Pop II stars are offering new challenges to the standard model and suggest that no new light particles may be allowed (N_ν^BBN ≤ 3.0, where N_ν is the number of equivalent light neutrinos). 23 refs
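The scaled ratio quoted in the abstract has a conventional definition, standard in the BBN literature, which makes the quoted range explicit:

```latex
% Conventional scaling of the nucleon-to-photon ratio used in BBN analyses:
\eta_{10} \equiv 10^{10}\,\eta = 10^{10}\,\frac{n_N}{n_\gamma},
\qquad
3.7 \le \eta_{10} \le 4.0
\;\Longleftrightarrow\;
3.7\times10^{-10} \le \eta \le 4.0\times10^{-10}.
```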

  11. The search for the Standard Model Higgs boson at ALEPH

    CERN Document Server

    McNamara, P A

    2002-01-01

    The standard model of elementary particles is a remarkably successful theory. The Higgs boson, the particle responsible for giving mass to the massive particles, is the only particle in the standard model which has not been experimentally observed. In data collected in 2000 at the Large Electron-Positron Collider, at center of mass energies up to 209 GeV, an excess of Higgs-like events was observed. This excess is consistent with the production of a Higgs boson with invariant mass 115.6 ± 0.8 GeV/c². The effect is dominated by an excess in the four-jet channels in ALEPH caused by three high-purity signal candidates.

  12. Search for the Standard Model Higgs Boson at LEP

    CERN Document Server

    Barate, R.; De Bonis, I.; Decamp, D.; Goy, C.; Jezequel, S.; Lees, J.P.; Martin, F.; Merle, E.; Minard, M.N.; Pietrzyk, B.; Trocme, B.; Boix, G.; Bravo, S.; Casado, M.P.; Chmeissani, M.; Crespo, J.M.; Fernandez, E.; Fernandez-Bosman, M.; Garrido, L.; Grauges, E.; Lopez, J.; Martinez, M.; Merino, G.; Miquel, R.; Mir, L.M.; Pacheco, A.; Paneque, D.; Ruiz, H.; Heister, A.; Schael, S.; Colaleo, A.; Creanza, D.; De Filippis, N.; de Palma, M.; Iaselli, G.; Maggi, G.; Maggi, M.; Nuzzo, S.; Ranieri, A.; Raso, G.; Ruggieri, F.; Selvaggi, G.; Silvestris, L.; Tempesta, P.; Tricomi, A.; Zito, G.; Huang, X.; Lin, J.; Quyang, Q.; Wang, T.; Xie, Y.; Xu, R.; Xue, S.; Zhang, J.; Zhang, L.; Zhao, W.; Abbaneo, D.; Azzurri, P.; Barklow, T.; Buchmuller, O.; Cattaneo, M.; Cerutti, F.; Clerbaux, B.; Drevermann, H.; Forty, R.W.; Frank, M.; Gianotti, F.; Greening, T.C.; Hansen, J.B.; Harvey, J.; Hutchcroft, D.E.; Janot, P.; Jost, B.; Kado, M.; Maley, P.; Mato, P.; Moutoussi, A.; Ranjard, F.; Rolandi, Gigi; Schlatter, D.; Sguazzoni, G.; Tejessy, W.; Teubert, F.; Valassi, A.; Videau, I.; Ward, J.J.; Badaud, F.; Dessagne, S.; Falvard, A.; Fayolle, D.; Gay, P.; Jousset, J.; Michel, B.; Monteil, S.; Pallin, D.; Pascolo, J.M.; Perret, P.; Hansen, J.D.; Hansen, J.R.; Hansen, P.H.; Nilsson, B.S.; Waananen, A.; Kyriakis, A.; Markou, C.; Simopoulou, E.; Vayaki, A.; Zachariadou, K.; Blondel, A.; Brient, J.C.; Machefert, F.; Rouge, A.; Swynghedauw, M.; Tanaka, R.; Videau, H.; Ciulli, V.; Focardi, E.; Parrini, G.; Antonelli, A.; Antonelli, M.; Bencivenni, G.; Bologna, G.; Bossi, F.; Campana, P.; Capon, G.; Chiarella, V.; Laurelli, P.; Mannocchi, G.; Murtas, F.; Murtas, G.P.; Passalacqua, L.; Pepe-Altarelli, M.; Spagnolo, P.; Kennedy, J.; Lynch, J.G.; Negus, P.; O'Shea, V.; Smith, D.; Thompson, A.S.; Wasserbaech, S.; Cavanaugh, R.; Dhamotharan, S.; Geweniger, C.; Hanke, P.; Hepp, V.; Kluge, E.E.; Leibenguth, G.; Putzer, A.; Stenzel, H.; Tittel, K.; Werner, S.; Wunsch, M.; Beuselinck, R.; Binnie, 
D.M.; Cameron, W.; Davies, G.; Dornan, P.J.; Girone, M.; Hill, R.D.; Marinelli, N.; Nowell, J.; Przysiezniak, H.; Rutherford, S.A.; Sedgbeer, J.K.; Thompson, J.C.; White, R.; Ghete, V.M.; Girtler, P.; Kneringer, E.; Kuhn, D.; Rudolph, G.; Bouhova-Thacker, E.; Bowdery, C.K.; Clarke, D.P.; Ellis, G.; Finch, A.J.; Foster, F.; Hughes, G.; Jones, R.W.L.; Pearson, M.R.; Robertson, N.A.; Smizanska, M.; Lemaitre, V.; Blumenschein, U.; Holldorfer, F.; Jakobs, K.; Kayser, F.; Kleinknecht, K.; Muller, A.S.; Quast, G.; Renk, B.; Sander, H.G.; Schmeling, S.; Wachsmuth, H.; Zeitnitz, C.; Ziegler, T.; Bonissent, A.; Carr, J.; Coyle, P.; Curtil, C.; Ealet, A.; Fouchez, D.; Leroy, O.; Kachelhoffer, T.; Payre, P.; Rousseau, D.; Tilquin, A.; Ragusa, F.; David, A.; Dietl, H.; Ganis, G.; Huttmann, K.; Lutjens, G.; Mannert, C.; Manner, W.; Moser, H.G.; Settles, R.; Wolf, G.; Boucrot, J.; Callot, O.; Davier, M.; Duflot, L.; Grivaz, J.F.; Heusse, P.; Jacholkowska, A.; Loomis, C.; Serin, L.; Veillet, J.J.; de Vivie de Regie, J.B.; Yuan, C.; Bagliesi, Giuseppe; Boccali, T.; Foa, L.; Giammanco, A.; Giassi, A.; Ligabue, F.; Messineo, A.; Palla, F.; Sanguinetti, G.; Sciaba, A.; Tenchini, R.; Venturi, A.; Verdini, P.G.; Awunor, O.; Blair, G.A.; Coles, J.; Cowan, G.; Garcia-Bellido, A.; Green, M.G.; Jones, L.T.; Medcalf, T.; Misiejuk, A.; Strong, J.A.; Teixeira-Dias, P.; Clifft, R.W.; Edgecock, T.R.; Norton, P.R.; Tomalin, I.R.; Bloch-Devaux, Brigitte; Boumediene, D.; Colas, P.; Fabbro, B.; Lancon, E.; Lemaire, M.C.; Locci, E.; Perez, P.; Rander, J.; Renardy, J.F.; Rosowsky, A.; Seager, P.; Trabelsi, A.; Tuchming, B.; Vallage, B.; Konstantinidis, N.; Litke, A.M.; Taylor, G.; Booth, C.N.; Cartwright, S.; Combley, F.; Hodgson, P.N.; Lehto, M.; Thompson, L.F.; Affholderbach, K.; Boehrer, Armin; Brandt, S.; Grupen, C.; Hess, J.; Ngac, A.; Prange, G.; Sieler, U.; Borean, C.; Giannini, G.; He, H.; Putz, J.; Rothberg, J.; Armstrong, S.R.; Berkelman, Karl; Cranmer, K.; Ferguson, D.P.S.; Gao, Y.; 
Gonzalez, S.; Hayes, O.J.; Hu, H.; Jin, S.; Kile, J.; McNamara, P.A., III; Nielsen, J.; Pan, Y.B.; von Wimmersperg-Toeller, J.H.; Wiedenmann, W.; Wu, J.; Wu, S.L.; Wu, X.; Zobernig, G.; Dissertori, G.; Abdallah, J.; Abreu, P.; Adam, W.; Adzic, P.; Albrecht, T.; Alderweireld, T.; Alemany-Fernandez, R.; Allmendinger, T.; Allport, P.P.; Amaldi, U.; Amapane, N.; Amato, S.; Anashkin, E.; Andreazza, A.; Andringa, S.; Anjos, N.; Antilogus, P.; Apel, W.D.; Arnoud, Y.; Ask, S.; Asman, B.; Augustin, J.E.; Augustinus, A.; Baillon, P.; Ballestrero, A.; Bambade, P.; Barbier, R.; Bardin, D.; Barker, G.J.; Baroncelli, A.; Battaglia, M.; Baubillier, M.; Becks, K.H.; Begalli, M.; Behrmann, A.; Ben-Haim, E.; Benekos, N.; Benvenuti, A.; Berat, C.; Berggren, M.; Berntzon, L.; Bertrand, D.; Besancon, M.; Besson, N.; Bloch, D.; Blom, M.; Bluj, M.; Bonesini, M.; Boonekamp, M.; Booth, P.S.L.; Borisov, G.; Botner, O.; Bouquet, B.; Bowcock, T.J.V.; Boyko, I.; Bracko, M.; Brenner, R.; Brodet, E.; Bruckman, P.; Brunet, J.M.; Bugge, L.; Buschmann, P.; Calvi, M.; Camporesi, T.; Canale, V.; Carena, F.; Castro, Nuno Filipe; Cavallo, F.; Chapkin, M.; Charpentier, P.; Checchia, P.; Chierici, R.; Chliapnikov, P.; Chudoba, J.; Chung, S.U.; Cieslik, K.; Collins, P.; Contri, R.; Cosme, G.; Cossutti, F.; Costa, M.J.; Crawley, B.; Crennell, D.; Cuevas, J.; DHondt, J.; Dalmau, J.; da Silva, T.; Da Silva, W.; Della Ricca, G.; De Angelis, A.; De Boer, W.; De Clercq, C.; De Lotto, B.; De Maria, N.; De Min, A.; de Paula, L.; Di Ciaccio, L.; Di Simone, A.; Doroba, K.; Drees, J.; Dris, M.; Eigen, G.; Ekelof, T.; Ellert, M.; Elsing, M.; Espirito Santo, M.C.; Fanourakis, G.; Fassouliotis, D.; Feindt, M.; Fernandez, J.; Ferrer, A.; Ferro, F.; Flagmeyer, U.; Foeth, H.; Fokitis, E.; Fulda-Quenzer, F.; Fuster, J.; Gandelman, M.; Garcia, C.; Gavillet, P.; Gazis, Evangelos; Gokieli, R.; Golob, B.; Gomez-Ceballos, G.; Goncalves, P.; Graziani, E.; Grosdidier, G.; Grzelak, K.; Guy, J.; Haag, C.; Hallgren, A.; Hamacher, 
K.; Hamilton, K.; Hansen, J.; Haug, S.; Hauler, F.; Hedberg, V.; Hennecke, M.; Herr, H.; Hoffman, J.; Holmgren, S.O.; Holt, P.J.; Houlden, M.A.; Hultqvist, K.; Jackson, John Neil; Jarlskog, G.; Jarry, P.; Jeans, D.; Johansson, Erik Karl; Johansson, P.D.; Jonsson, P.; Joram, C.; Jungermann, L.; Kapusta, Frederic; Katsanevas, S.; Katsoufis, E.; Kernel, G.; Kersevan, B.P.; Kiiskinen, A.; King, B.T.; Kjaer, N.J.; Kluit, P.; Kokkinias, P.; Kourkoumelis, C.; Kouznetsov, O.; Krumstein, Z.; Kucharczyk, M.; Lamsa, J.; Leder, G.; Ledroit, Fabienne; Leinonen, L.; Leitner, R.; Lemonne, J.; Lepeltier, V.; Lesiak, T.; Liebig, W.; Liko, D.; Lipniacka, A.; Lopes, J.H.; Lopez, J.M.; Loukas, D.; Lutz, P.; Lyons, L.; MacNaughton, J.; Malek, A.; Maltezos, S.; Mandl, F.; Marco, J.; Marco, R.; Marechal, B.; Margoni, M.; Marin, J.C.; Mariotti, C.; Markou, A.; Martinez-Rivero, C.; Masik, J.; Mastroyiannopoulos, N.; Matorras, F.; Matteuzzi, C.; Mazzucato, F.; Mazzucato, M.; McNulty, R.; Meroni, C.; Meyer, W.T.; Migliore, E.; Mitaroff, W.; Mjoernmark, U.; Moa, T.; Moch, M.; Monig, Klaus; Monge, R.; Montenegro, J.; Moraes, D.; Moreno, S.; Morettini, P.; Mueller, U.; Muenich, K.; Mulders, M.; Mundim, L.; Murray, W.; Muryn, B.; Myatt, G.; Myklebust, T.; Nassiakou, M.; Navarria, F.; Nawrocki, K.; Nicolaidou, R.; Nikolenko, M.; Oblakowska-Mucha, A.; Obraztsov, V.; Olshevski, A.; Onofre, A.; Orava, R.; Osterberg, K.; Ouraou, A.; Oyanguren, A.; Paganoni, M.; Paiano, S.; Palacios, J.P.; Palka, H.; Papadopoulou, T.D.; Pape, L.; Parkes, C.; Parodi, F.; Parzefall, U.; Passeri, A.; Passon, O.; Peralta, L.; Perepelitsa, V.; Perrotta, A.; Petrolini, A.; Piedra, J.; Pieri, L.; Pierre, F.; Pimenta, M.; Piotto, E.; Podobnik, T.; Poireau, V.; Pol, M.E.; Polok, G.; Poropat, P.; Pozdniakov, V.; Pukhaeva, N.; Pullia, A.; Rames, J.; Ramler, L.; Read, Alexander L.; Rebecchi, P.; Rehn, J.; Reid, D.; Reinhardt, R.; Renton, P.; Richard, F.; Ridky, J.; Rivero, M.; Rodriguez, D.; Romero, A.; Ronchese, P.; Rosenberg, 
E.; Roudeau, P.; Rovelli, T.; Ruhlmann-Kleider, V.; Ryabtchikov, D.; Sadovsky, A.; Salmi, L.; Salt, J.; Savoy-Navarro, A.; Schwickerath, U.; Segar, A.; Sekulin, R.; Siebel, M.; Sisakian, A.; Smadja, G.; Smirnova, O.; Sokolov, A.; Sopczak, A.; Sosnowski, R.; Spassov, T.; Stanitzki, M.; Stocchi, A.; Strauss, J.; Stugu, B.; Szczekowski, M.; Szeptycka, M.; Szumlak, T.; Tabarelli, T.; Taffard, A.C.; Tegenfeldt, F.; Timmermans, Jan; Tkatchev, L.; Tobin, M.; Todorovova, S.; Tome, B.; Tonazzo, A.; Tortosa, P.; Travnicek, P.; Treille, D.; Tristram, G.; Trochimczuk, M.; Troncon, C.; Turluer, M.L.; Tyapkin, I.A.; Tyapkin, P.; Tzamarias, S.; Uvarov, V.; Valenti, G.; Van Dam, Piet; Van Eldik, J.; Van Lysebetten, A.; Van Remortel, N.; Van Vulpen, I.; Vegni, G.; Veloso, F.; Venus, W.; Verbeure, F.; Verdier, P.; Verzi, V.; Vilanova, D.; Vitale, L.; Vrba, V.; Wahlen, H.; Washbrook, A.J.; Weiser, C.; Wicke, D.; Wickens, J.; Wilkinson, G.; Winter, M.; Witek, M.; Yushchenko, O.; Zalewska, A.; Zalewski, P.; Zavrtanik, D.; Zhuravlov, V.; Zimine, N.I.; Zintchenko, A.; Zupan, M.; Achard, P.; Adriani, O.; Aguilar-Benitez, M.; Alcaraz, J.; Alemanni, G.; Allaby, J.; Aloisio, A.; Alviggi, M.G.; Anderhub, H.; Andreev, Valery P.; Anselmo, F.; Arefiev, A.; Azemoon, T.; Aziz, T.; Baarmand, M.; Bagnaia, P.; Bajox, A.; Baksay, G.; Baksay, L.; Baldew, S.V.; Banerjee, S.; Barczyk, A.; Barillere, R.; Bartalini, P.; Basile, M.; Batalova, N.; Battiston, R.; Bay, A.; Becattini, F.; Becker, U.; Behner, F.; Bellucci, L.; Berbeco, R.; Berdugo, J.; Berges, P.; Bertucci, B.; Betev, B.L.; Biasini, M.; Biglietti, M.; Biland, A.; Blaising, J.J.; Blyth, S.C.; Bobbink, G.J.; Bohm, A.; Boldizsar, L.; Borgia, B.; Bourilkov, D.; Bourquin, M.; Braccini, S.; Branson, J.G.; Brochu, F.; Buijs, A.; Burger, J.D.; Burger, W.J.; Cai, X.D.; Capell, M.; Cara Romeo, G.; Carlino, G.; Cartacci, A.; Casau, J.; Cavallari, F.; Cavallo, N.; Cecchi, C.; Cerrada, M.; Chamizo, M.; Chang, Y.H.; Chemarin, M.; Chen, A.; Chen, G.; Chen, 
G.M.; Chen, H.F.; Chen, H.S.; Chiefari, G.; Cifarelli, L.; Cindolo, F.; Clare, I.; Clare, R.; Coignet, G.; Colino, N.; Costantini, S.; de la Cruz, B.; Cucciarelli, S.; Dai, T.S.; van Dalen, J.A.; de Asmundis, R.; Deglont, P.; Debreczeni, J.; Degre, A.; Deiters, K.; della Volpe, D.; Delmeire, E.; Denes, P.; De Notaristefani, F.; De Salvo, A.; Diemoz, M.; Dierckxsens, M.; van Dierendonck, D.; Dionisi, C.; Dittmar, M.; Doria, A.; Dova, M.T.; Duchesneau, D.; Duinker, P.; Echenard, B.; Eline, A.; El Mamouni, H.; Engler, A.; Eppling, F.J.; Ewers, A.; Extermann, P.; Falagan, M.A.; Falciano, S.; Favara, A.; Fay, J.; Fedin, O.; Felcini, M.; Ferguson, T.; Fesefeldt, H.; Fiandrini, E.; Field, J.H.; Filthaut, F.; Fisher, P.H.; Fisher, W.; Fisk, I.; Forconi, G.; Freudenreich, K.; Furetta, C.; Galaktionov, Iouri; Ganguli, S.N.; Garcia-Abia, Pablo; Gataullin, M.; Gentile, S.; Giagu, S.; Gong, Z.F.; Grenier, Gerald Jean; Grimm, O.; Gruenewald, M.W.; Guida, M.; van Gulik, R.; Gupta, V.K.; Gurtu, A.; Gutay, L.J.; Haas, D.; Hatzifotiadou, D.; Hebbeker, T.; Herve, Alain; Hirschfelder, J.; Hofer, H.; Holzner, G.; Hou, S.R.; Hu, Y.; Jin, B.N.; Jones, Lawrence W.; de Jong, P.; Josa-Mutuberria, I.; Kafer, D.; Kaur, M.; Kienzle-Focacci, M.N.; Kim, J.K.; Kirkby, Jasper; Kittel, W.; Klimentov, A.; Konig, A.C.; Kopal, M.; Koutsenko, V.; Kraber, M.; Kraemer, R.W.; Krenz, W.; Kruger, A.; Kunin, A.; Ladron de Guevara, P.; Laktineh, I.; Landi, G.; Lebeau, M.; Lebedev, A.; Lebrun, P.; Lecomte, P.; Lecoq, P.; Le Coultre, P.; Lee, H.J.; Le Goff, J.M.; Leiste, R.; Levtchenko, P.; Li, C.; Likhoded, S.; Lin, C.H.; Lin, W.T.; Linde, F.L.; Lista, L.; Liu, Z.A.; Lohmann, W.; Longo, E.; Lu, Y.S.; Lubelsmeyer, K.; Luci, C.; Luckey, David; Luminari, L.; Lustermann, W.; Ma, W.G.; Malgeri, L.; Malinin, A.; Mana, C.; Mangeol, D.; Mans, J.; Martin, J.P.; Marzano, F.; Mazumdar, K.; McNeil, R.R.; Mele, S.; Merola, L.; Meschini, M.; Metzger, W.J.; Mihul, A.; Milcent, H.; Mirabelli, G.; Mnich, J.; Mohanty, G.B.; 
Muanza, G.S.; Muijs, A.J.M.; Musicar, B.; Musy, M.; Nagy, S.; Napolitano, M.; Nessi-Tedaldi, F.; Newman, H.; Niessen, T.; Nisati, A.; Kluge, Hannelies; Ofierzynski, R.; Organtini, G.; Palomares, C.; Pandoulas, D.; Paolucci, P.; Paramatti, R.; Passaleva, G.; Patricelli, S.; Paul, Thomas Cantzon; Pauluzzi, M.; Paus, C.; Pauss, F.; Pedace, M.; Pensotti, S.; Perret-Gallix, D.; Petersen, B.; Piccolo, D.; Pierella, F.; Piroue, P.A.; Pistolesi, E.; Plyaskin, V.; Pohl, M.; Pojidaev, V.; Postema, H.; Pothier, J.; Prokofiev, D.O.; Prokofiev, D.; Quartieri, J.; Rahal-Callot, G.; Rahaman, Mohammad Azizur; Raics, P.; Raja, N.; Ramelli, R.; Rancoita, P.G.; Ranieri, R.; Raspereza, A.; Razis, P.; Ren, D.; Rescigno, M.; Reucroft, S.; Riemann, S.; Riles, Keith; Roe, B.P.; Romero, L.; Rosca, A.; Rosier-Lee, S.; Roth, Stefan; Rosenbleck, C.; Roux, B.; Rubio, J.A.; Ruggiero, G.; Rykaczewski, H.; Sakharov, A.; Saremi, S.; Sarkar, S.; Salicio, J.; Sanchez, E.; Sanders, M.P.; Schafer, C.; Schegelsky, V.; Schmidt-Kaerst, S.; Schmitz, D.; Schopper, H.; Schotanus, D.J.; Schwering, G.; Sciacca, C.; Servoli, L.; Shevchenko, S.; Shivarov, N.; Shoutko, V.; Shumilov, E.; Shvorob, A.; Siedenburg, T.; Son, D.; Spillantini, P.; Steuer, M.; Stickland, D.P.; Stoyanov, B.; Straessner, A.; Sudhakar, K.; Sultanov, G.; Sun, L.Z.; Sushkov, S.; Suter, H.; Swain, J.D.; Szillasi, Z.; Tang, X.W.; Tarjan, P.; Tauscher, L.; Taylor, L.; Tellili, B.; Teyssier, D.; Timmermans, Charles; Ting, Samuel C.C.; Ting, S.M.; Tonwar, S.C.; Toth, J.; Tully, C.; Tung, K.L.; Uchida, Y.; Ulbricht, J.; Valente, E.; Van de Walle, R.T.; Veszpremi, V.; Vesztergombi, G.; Vetlitsky, I.; Vicinanza, D.; Viertel, G.; Villa, S.; Vivargent, M.; Vlachos, S.; Vodopianov, I.; Vogel, H.; Vogt, H.; Vorobiev, I.; Vorobyov, A.A.; Wadhwa, M.; Wallraff, W.; Wang, M.; Wang, X.L.; Wang, Z.M.; Weber, M.; Wienemann, P.; Wilkens, H.; Wu, S.X.; Wynhoff, S.; Xia, L.; Xu, Z.Z.; Yamamoto, J.; Yang, B.Z.; Yang, C.G.; Yang, H.J.; Yang, M.; Yeh, S.C.; Zalite, 
A.; Zalite, Yu.; Zhang, Z.P.; Zhao, J.; Zhu, G.Y.; Zhu, R.Y.; Zhuang, H.L.; Zichichi, A.; Zilizi, G.; Zimmermann, B.; Zoller, M.; Abbiendi, G.; Ainsley, C.; Akesson, P.F.; Alexander, G.; Allison, John; Amaral, P.; Anagnostou, G.; Anderson, K.J.; Arcelli, S.; Asai, S.; Axen, D.; Azuelos, G.; Bailey, I.; Barberio, E.; Barlow, R.J.; Batley, R.J.; Bechtle, P.; Behnke, T.; Bell, Kenneth Watson; Bell, P.J.; Bella, G.; Bellerive, A.; Benelli, G.; Bethke, S.; Biebel, O.; Bloodworth, I.J.; Boeriu, O.; Bock, P.; Bonacorsi, D.; Boutemeur, M.; Braibant, S.; Brigliadori, L.; Brown, Robert M.; Buesser, K.; Burckhart, H.J.; Campana, S.; Carnegie, R.K.; Caron, B.; Carter, A.A.; Carter, J.R.; Chang, C.Y.; Charlton, David G.; Csilling, A.; Cuffiani, M.; Dado, S.; Dallavalle, G.Marco; Dallison, S.; De Roeck, A.; De Wolf, E.A.; Desch, K.; Dienes, B.; Donkers, M.; Dubbert, J.; Duchovni, E.; Duckeck, G.; Duerdoth, I.P.; Elfgren, E.; Etzion, E.; Fabbri, F.; Feld, L.; Ferrari, P.; Fiedler, F.; Fleck, I.; Ford, M.; Frey, A.; Furtjes, A.; Gagnon, P.; Gary, John William; Gaycken, G.; Geich-Gimbel, C.; Giacomelli, G.; Giacomelli, P.; Giunta, Marina; Goldberg, J.; Gross, E.; Grunhaus, J.; Gruwe, M.; Gunther, P.O.; Gupta, A.; Hajdu, C.; Hamann, M.; Hanson, G.G.; Harder, K.; Harel, A.; Harin-Dirac, M.; Hauschild, M.; Hauschildt, J.; Hawkes, C.M.; Hawkings, R.; Hemingway, R.J.; Hensel, C.; Herten, G.; Heuer, R.D.; Hill, J.C.; Hoffman, Kara Dion; Homer, R.J.; Horvath, D.; Howard, R.; Huntemeyer, P.; Igo-Kemenes, P.; Ishii, K.; Jeremie, H.; Jovanovic, P.; Junk, T.R.; Kanaya, N.; Kanzaki, J.; Karapetian, G.; Karlen, D.; Kartvelishvili, V.; Kawagoe, K.; Kawamoto, T.; Keeler, R.K.; Kellogg, R.G.; Kennedy, B.W.; Kim, D.H.; Klein, K.; Klier, A.; Kluth, S.; Kobayashi, T.; Kobel, M.; Komamiya, S.; Kormos, Laura L.; Kowalewski, Robert V.; Kramer, T.; Kress, T.; Krieger, P.; von Krogh, J.; Krop, D.; Kruger, K.; Kupper, M.; Lafferty, G.D.; Landsman, H.; Lanske, D.; Layter, J.G.; Leins, A.; Lellouch, D.; 
Letts, J.; Levinson, L.; Lillich, J.; Lloyd, S.L.; Loebinger, F.K.; Lu, J.; Ludwig, J.; Macpherson, A.; Mader, W.; Marcellini, S.; Marchant, T.E.; Martin, A.J.; Masetti, G.; Mashimo, T.; Mattig, Peter; McDonald, W.J.; McKenna, J.; McMahon, T.J.; McPherson, R.A.; Meijers, F.; Mendez-Lorenzo, P.; Menges, W.; Merritt, F.S.; Mes, H.; Michelini, A.; Mihara, S.; Mikenberg, G.; Miller, D.J.; Moed, S.; Mohr, W.; Mori, T.; Mutter, A.; Nagai, K.; Nakamura, I.; Neal, H.A.; Nisius, R.; ONeale, S.W.; Oh, A.; Okpara, A.; Oreglia, M.J.; Orito, S.; Pahl, C.; Pasztor, G.; Pater, J.R.; Patrick, G.N.; Pilcher, J.E.; Pinfold, J.; Plane, David E.; Poli, B.; Polok, J.; Pooth, O.; Przybycien, M.; Quadt, A.; Rabbertz, K.; Rembser, C.; Renkel, P.; Rick, H.; Roney, J.M.; Rosati, S.; Rozen, Y.; Runge, K.; Sachs, K.; Saeki, T.; Sahr, O.; Sarkisyan, E.K.G.; Schaile, A.D.; Schaile, O.; Scharff-Hansen, P.; Schieck, J.; Schoerner-Sadenius, Thomas; Schroder, Matthias; Schumacher, M.; Schwick, C.; Scott, W.G.; Seuster, R.; Shears, T.G.; Shen, B.C.; Shepherd-Themistocleous, C.H.; Sherwood, P.; Siroli, G.; Skuja, A.; Smith, A.M.; Sobie, R.; Soldner-Rembold, S.; Spagnolo, S.; Spano, F.; Stahl, A.; Stephens, K.; Strom, David M.; Strohmer, R.; Tarem, S.; Tasevsky, M.; Taylor, R.J.; Teuscher, R.; Thomson, M.A.; Torrence, E.; Toya, D.; Tran, P.; Trefzger, T.; Tricoli, A.; Trigger, I.; Trocsanyi, Z.; Tsur, E.; Turner-Watson, M.F.; Ueda, I.; Ujvari, B.; Vachon, B.; Vollmer, C.F.; Vannerem, P.; Verzocchi, M.; Voss, H.; Vossebeld, J.; Waller, D.; Ward, C.P.; Ward, D.R.; Watkins, P.M.; Watson, A.T.; Watson, N.K.; Wells, P.S.; Wengler, T.; Wermes, N.; Wetterling, D.; Wilson, G.W.; Wilson, J.A.; Wyatt, T.R.; Yamashita, S.; Zer-Zion, D.; Zivkovic, Lidija; Heinemeyer, S.; Weiglein, G.

    2003-01-01

The four LEP collaborations, ALEPH, DELPHI, L3 and OPAL, have collected a total of 2461 pb⁻¹ of e+e- collision data at centre-of-mass energies between 189 and 209 GeV. The data are used to search for the Standard Model Higgs boson. The search results of the four collaborations are combined and examined in a likelihood test for their consistency with two hypotheses: the background hypothesis and the signal plus background hypothesis. The corresponding confidences have been computed as functions of the hypothetical Higgs boson mass. A lower bound of 114.4 GeV/c² is established, at the 95% confidence level, on the mass of the Standard Model Higgs boson. The LEP data are also used to set upper bounds on the HZZ coupling for various assumptions concerning the decay of the Higgs boson.
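The likelihood test described above can be illustrated with a toy sketch. This is not the actual LEP combination machinery: it reduces the problem to a single-bin counting experiment with made-up numbers, where the test statistic is the log-likelihood ratio ln Q = ln[P(n | s+b) / P(n | b)] for Poisson counts, and the CLs ratio is estimated from toy experiments.

```python
import math
import random

def ln_q(n_obs, s, b):
    """ln Q for a Poisson counting experiment; more negative = more background-like."""
    return -s + n_obs * math.log(1.0 + s / b)

def poisson_sample(mu, rng):
    """Simple Poisson sampler (Knuth's method; adequate for small mu)."""
    limit, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def cls(n_obs, s, b, n_toys=20000, seed=7):
    """CLs = CL_sb / CL_b; the signal hypothesis is excluded at 95% CL if CLs < 0.05."""
    rng = random.Random(seed)
    q_obs = ln_q(n_obs, s, b)
    # Fraction of toys at least as background-like as the observation,
    # under the signal+background and background-only hypotheses respectively.
    cl_sb = sum(ln_q(poisson_sample(s + b, rng), s, b) <= q_obs
                for _ in range(n_toys)) / n_toys
    cl_b = sum(ln_q(poisson_sample(b, rng), s, b) <= q_obs
               for _ in range(n_toys)) / n_toys
    return cl_sb / cl_b
```

With 10 events observed on an expected background of 10, a large hypothetical signal (s = 15) yields CLs well below 0.05 and is excluded, while a small one (s = 1) is not.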

  13. Direct search for the standard model Higgs boson

    CERN Document Server

    Janot, Patrick

    2002-01-01

For twelve years, LEP revolutionized the knowledge of electroweak symmetry breaking within the standard model, and the direct discovery of the Higgs boson would have been the crowning achievement. Searches at the Z resonance and above the W⁺W⁻ threshold allowed an unambiguous lower limit of 114.1 GeV/c² to be set on the mass of the standard model Higgs boson. After years of efforts to push the LEP performance far beyond the design limits, hints of what could be the first signs of the existence of a 115 GeV/c² Higgs boson appeared in June 2000, were confirmed in September, and were then confirmed again in November. An additional six-month period of LEP operation was enough to provide a definite answer, with an opportunity to make a fundamental discovery of prime importance. (37 refs).

  14. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin

    2016-04-06

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  15. Aspects of Particle Physics Beyond the Standard Model

    Science.gov (United States)

    Lu, Xiaochuan

    This dissertation describes several aspects of particle physics beyond the Standard Model, with a focus on the questions that remain after the discovery of a Standard Model-like Higgs boson. Specifically, three topics are discussed in sequence: neutrino mass and baryon asymmetry, the naturalness problem of the Higgs mass, and placing constraints on theoretical models from precision measurements. First, the cosmological consequences of neutrino mass anarchy are studied, with particular attention to the total mass of neutrinos and to the baryon asymmetry through leptogenesis. With the assumption of independence among mass-matrix entries, in addition to basis independence, the Gaussian measure is the only choice. On top of the Gaussian measure, a simple approximate U(1) flavor symmetry makes leptogenesis highly successful. Correlations between the baryon asymmetry and the light-neutrino quantities are investigated, as are possible implications of the large total neutrino mass recently suggested by the SDSS/BOSS data. Second, the Higgs mass implies fine-tuning for minimal theories of weak-scale supersymmetry (SUSY). Non-decoupling effects can boost the Higgs mass when new states interact with the Higgs, but the new sources of SUSY breaking that accompany such extensions threaten naturalness. I will show that two singlets with a Dirac mass can increase the Higgs mass while maintaining naturalness in the presence of large SUSY breaking in the singlet sector. The modified Higgs phenomenology of this scenario, termed the "Dirac NMSSM", is also studied. Finally, the sensitivity of future precision measurements in probing physics beyond the Standard Model is studied. A practical three-step procedure is presented for using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak-scale precision observables. With this procedure, one can interpret precision measurements as constraints on the UV model concerned. A detailed explanation is

  16. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently, based on a preference for one sensory modality or representational system (visual, auditory, or kinesthetic), the one we tend to favor most (our primary representational system, or PRS). Some of us therefore access and store information primarily visually, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information-processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of these different ways of channeling information, each of us will respond differently to a task: in the way we gather and process the external information (input), in our response time (process), and in the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive, and motor systems stimulated and influenced by the three sensory modalities, visual, auditory, and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place, and of how to ask questions, is essential in assessing performance with the aim of reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged into a generalization of behavior(s). However, without a basic understanding of how a behavior was generated through a decision-making strategy, such predictive models remain inefficient in their analysis of the means by which the behavior arose: what is seen is only the end result.

  17. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently, based on a preference for one sensory modality or representational system (visual, auditory, or kinesthetic), the one we tend to favor most (our primary representational system, or PRS). Some of us therefore access and store information primarily visually, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information-processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of these different ways of channeling information, each of us will respond differently to a task: in the way we gather and process the external information (input), in our response time (process), and in the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive, and motor systems stimulated and influenced by the three sensory modalities, visual, auditory, and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place, and of how to ask questions, is essential in assessing performance with the aim of reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged into a generalization of behavior(s). However, without a basic understanding of how a behavior was generated through a decision-making strategy, such predictive models remain inefficient in their analysis of the means by which the behavior arose: what is seen is only the end result

  18. Gold-standard performance for 2D hydrodynamic modeling

    Science.gov (United States)

    Pasternack, G. B.; MacVicar, B. J.

    2013-12-01

    Two-dimensional, depth-averaged hydrodynamic (2D) models are emerging as an increasingly useful tool for environmental water resources engineering. One of the remaining technical hurdles to the wider adoption and acceptance of 2D modeling is the lack of standards for 2D model performance evaluation when the riverbed undulates, causing lateral flow divergence and convergence. The goal of this study was to establish a gold-standard that quantifies the upper limit of model performance for 2D models of undulating riverbeds when topography is perfectly known and surface roughness is well constrained. A review was conducted of published model performance metrics and the value ranges exhibited by models thus far for each one. Typically predicted velocity differs from observed by 20 to 30 % and the coefficient of determination between the two ranges from 0.5 to 0.8, though there tends to be a bias toward overpredicting low velocity and underpredicting high velocity. To establish a gold standard as to the best performance possible for a 2D model of an undulating bed, two straight, rectangular-walled flume experiments were done with no bed slope and only different bed undulations and water surface slopes. One flume tested model performance in the presence of a porous, homogenous gravel bed with a long flat section, then a linear slope down to a flat pool bottom, and then the same linear slope back up to the flat bed. The other flume had a PVC plastic solid bed with a long flat section followed by a sequence of five identical riffle-pool pairs in close proximity, so it tested model performance given frequent undulations. Detailed water surface elevation and velocity measurements were made for both flumes. Comparing predicted versus observed velocity magnitude for 3 discharges with the gravel-bed flume and 1 discharge for the PVC-bed flume, the coefficient of determination ranged from 0.952 to 0.987 and the slope for the regression line was 0.957 to 1.02. Unsigned velocity
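The performance metrics quoted above (the coefficient of determination and the slope of the regression of predicted on observed velocity) can be sketched as follows. The velocity values here are invented for illustration, not flume data.

```python
def regression_metrics(observed, predicted):
    """Return (slope, r_squared) of the OLS fit of predicted against observed."""
    n = len(observed)
    mean_x = sum(observed) / n
    mean_y = sum(predicted) / n
    sxx = sum((x - mean_x) ** 2 for x in observed)           # variance of observed
    syy = sum((y - mean_y) ** 2 for y in predicted)          # variance of predicted
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(observed, predicted))          # covariance
    slope = sxy / sxx
    r_squared = sxy ** 2 / (sxx * syy)
    return slope, r_squared

# Example: a model that mildly overpredicts low velocities and underpredicts
# high ones, the bias noted above; the regression slope falls below 1.
obs = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2]
pred = [0.24, 0.43, 0.60, 0.77, 0.94, 1.11]
slope, r2 = regression_metrics(obs, pred)
```

A regression slope near 1 together with r² near 1, as in the flume results above, indicates both low scatter and low systematic bias.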

  19. Standard model physics with the ATLAS early data

    CERN Document Server

    Bruckman de Renstrom, Pawel

    2006-01-01

    The Standard Model, despite its open questions, has proved its consistency and predictive power to very high accuracy within the currently available energy reach. LHC, with its high CM energy and luminosity, will give us insight into new processes, possibly showing evidence of “new physics”. Excellent understanding of the SM processes will also be a key to discriminate against any new phenomena. Prospects of selected SM measurements with the ATLAS detector using early LHC luminosity are presented.

  20. Signatures of baryogenesis in the minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Murayama, Hitoshi; Pierce, Aaron

    2003-01-01

    We reexamine the electroweak baryogenesis within the context of the minimal supersymmetric standard model, studying its potential collider signatures. We find that this mechanism of baryogenesis does not give a new CP violating signal at the B factories. The first circumstantial evidence may come from enhanced B s or B d mixing. If a light right-handed scalar top quark and Higgs boson are found as required, a linear collider represents the best possibility for confirming the scenario

  1. Physics beyond the standard model and cosmological connections ...

    Indian Academy of Sciences (India)

    E-mail: Sridhar@theory.tifr.res.in. Abstract. The international linear collider (ILC) is .... ILC operates at its highest planned centre-of-mass energy of 2 TeV. The alternative is to do a combined LHC/ILC .... A paper which studied the modification of the standard Einstein–Hilbert action in models of TeV-scale gravity through the ...

  2. The Standard Model and the neutron beta-decay

    CERN Document Server

    Abele, H

    2000-01-01

    This article reviews the relationship between the observables in neutron beta-decay and the accepted modern theory of particle physics known as the Standard Model. Recent neutron-decay measurements of various mixed American-British-French-German-Russian collaborations try to shed light on the following topics: the coupling strength of charged weak currents, the universality of the electroweak interaction and the origin of parity violation.

  3. Standard model parameters and the search for new physics

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1988-04-01

    In these lectures, my aim is to present an up-to-date status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows: I discuss the standard model parameters, with particular emphasis on the gauge coupling constants and vector boson masses; examples of new-physics appendages are also briefly commented on. In addition, because these lectures are intended for students and are thus somewhat pedagogical, I have included an appendix on dimensional regularization and a simple computational example that employs that technique. Next, I focus on weak charged-current phenomenology. Precision tests of the standard model are described, and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing-matrix parameters are presented. Constraints implied by those tests for a 4th generation, supersymmetry, extra Z′ bosons, and compositeness are also discussed. I then discuss weak neutral-current phenomenology and the extraction of sin²θ_W from experiment. The results presented there are based on a recently completed global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top-quark mass, and implications for grand unified theories (GUTs). The potential for further experimental progress is also commented on. I then depart from the narrowest version of the standard model and discuss effects of neutrino masses and mixings, concentrating on oscillations, the Mikheyev-Smirnov-Wolfenstein (MSW) effect, and electromagnetic properties of neutrinos. On the latter topic, I describe some recent work on resonant spin-flavor precession. Finally, I conclude with a prospectus on hopes for the future. 76 refs

  4. Exploring and testing the Standard Model and beyond

    International Nuclear Information System (INIS)

    West, G.; Cooper, F.; Ginsparg, P.; Habib, S.; Gupta, R.; Mottola, E.; Nieto, M.; Mattis, M.

    1998-01-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). The goal of this project was to extend and develop the predictions of the Standard Model of particle physics in several different directions. This includes various aspects of the strong nuclear interactions in quantum chromodynamics (QCD), electroweak interactions and the origin of baryon asymmetry in the universe, as well as gravitational physics

  5. LEP asymmetries and fits of the standard model

    International Nuclear Information System (INIS)

    Pietrzyk, B.

    1994-01-01

    The lepton and quark asymmetries measured at LEP are presented. The results of the Standard Model fits to the electroweak data presented at this conference are given. The top mass obtained from the fit to the LEP data is 172 +13/−14 +18/−20 GeV; it is 177 +11/−11 +18/−19 GeV when the collider, ν, and A_LR data are also included. (author). 10 refs., 3 figs., 2 tabs

  6. Asymptotically Safe Standard Model Extensions arXiv

    CERN Document Server

    Pelaggi, Giulio Maria; Salvio, Alberto; Sannino, Francesco; Smirnov, Juri; Strumia, Alessandro

    We consider theories with a large number N_F of charged fermions and compute the renormalisation group equations for the gauge, Yukawa, and quartic couplings, resummed at leading order in N_F. We construct extensions of the Standard Model where SU(2) and/or SU(3) are asymptotically safe. When the same procedure is applied to the Abelian U(1) factor, we find that the Higgs quartic coupling cannot be made asymptotically safe and remain perturbative at the same time.

  7. Observations in particle physics: from two neutrinos to standard model

    International Nuclear Information System (INIS)

    Lederman, L.M.

    1990-01-01

    Experiments that contributed to the creation of the standard model are discussed. Results of observations on the following topics are considered: long-lived neutral V-particles; violation of parity conservation and charge invariance in meson decays; high-energy neutrino reactions and the existence of two types of neutrinos; partons and dynamic quarks; and the dimuon resonance at 9.5 GeV in 400 GeV proton-nucleus collisions

  8. Framework for an asymptotically safe standard model via dynamical breaking

    DEFF Research Database (Denmark)

    Abel, Steven; Sannino, Francesco

    2017-01-01

    We present a consistent embedding of the matter and gauge content of the Standard Model into an underlying asymptotically safe theory that has a well-determined interacting UV fixed point in the large color/flavor limit. The scales of symmetry breaking are determined by two mass-squared parameters...... with the breaking of electroweak symmetry being driven radiatively. There are no other free parameters in the theory apart from gauge couplings....

  9. The Assessment Of The Level Of Management Control Standards Completion In Treasury Sector

    Directory of Open Access Journals (Sweden)

    Kulińska Ewa

    2015-06-01

    This paper concerns the rules governing the functioning of management control standards used in the Treasury Control Office. Its purpose is to present the results of research conducted in the years 2013–2014 in Polish Treasury Control Offices, obtained by applying the author's model for assessing management control implementation. The research was conducted separately for management personnel and for the remaining office employees, and significant discrepancies between these two groups of respondents were found. Based on the results, the areas of deviation from the expected level of management control standards were identified, along with areas where control mechanisms should be implemented: increasing the supervision of the board of directors over managers, providing permanent and efficient supervision of subordinate employees by managers, making the purposes and tasks assigned to the Treasury Control Office for a given year more precise and familiarizing employees with them, carrying out training, and a series of other corrective measures.

  10. No Evidence for Extensions to the Standard Cosmological Model

    Science.gov (United States)

    Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna

    2017-09-01

    We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (ln B = −7.8), nonzero scalar-to-tensor ratio (ln B = −4.3), running of the spectral index (ln B = −4.7), curvature (ln B = −3.6), nonstandard numbers of neutrinos (ln B = −3.1), nonstandard neutrino masses (ln B = −3.2), nonstandard lensing potential (ln B = −4.6), evolving dark energy (ln B = −3.2), sterile neutrinos (ln B = −6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (ln B = −10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (ln B ≈ +2).
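The quoted ln B values can be read as posterior odds if one assumes equal prior odds between the extended model and flat ΛCDM (that assumption is ours, for illustration; it is not part of the analysis above): the odds are simply exp(ln B).

```python
import math

def posterior_odds(ln_bayes_factor):
    """Posterior odds for the extended model relative to the baseline,
    assuming equal prior odds: odds = exp(ln B)."""
    return math.exp(ln_bayes_factor)

# e.g. the sterile-neutrino extension above, ln B = -6.9:
# odds of roughly 1:1000 against the extension.
odds_sterile = posterior_odds(-6.9)
```

This makes the scale of the numbers concrete: ln B = −3 already corresponds to about 20:1 odds against, and ln B = −10.8 to odds worse than 1:49000.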

  11. In Search Of The Standard Model Higgs Boson

    CERN Document Server

    http://inspirehep.net/record/666184/files/fermilab-thesis-2004-35.PDF, R

    2002-01-01

    A search for the Standard Model Higgs boson is conducted using data from the L3 detector at CERN's LEP collider during the year 2000. The integrated luminosity collected was 217.4 pb−1 of electron-positron collisions at center-of-mass energies from 200 to 209 GeV. Presented here is a search for e+e− → hZ, where the Higgs decays into b quarks and the Z boson decays into undetected neutrinos. Also presented are combined results from the other L3 channels. The L3 combined results are consistent with the Standard Model background. Preliminary results from the LEP-wide combination are also shown. The lower limit on the Standard Model Higgs mass is found to be m_h > 114.1 GeV at 95% C.L. In the LEP combination, an excess of data events is observed near m_h ∼ 115.6 GeV. Whether this is due to a statistical fluctuation or to Higgs production cannot be determined from the available set of data.

  12. Stress-testing the Standard Model at the LHC

    CERN Document Server

    2016-01-01

    With the high-energy run of the LHC now underway, and clear manifestations of beyond-Standard-Model physics not yet seen in data from the previous run, the search for new physics at the LHC may be a quest for small deviations with big consequences. If clear signals are present, precise predictions and measurements will again be crucial for extracting the maximum information from the data, as in the case of the Higgs boson. Precision will therefore remain a key theme for particle physics research in the coming years. The conference will provide a forum for experimentalists and theorists to identify the challenges and refine the tools for high-precision tests of the Standard Model and searches for signals of new physics at Run II of the LHC. Topics to be discussed include: pinning down Standard Model corrections to key LHC processes; combining fixed-order QCD calculations with all-order resummations and parton showers; new developments in jet physics concerning jet substructure, associated jets and boosted je...

  13. Supersymmetry and String Theory: Beyond the Standard Model

    International Nuclear Information System (INIS)

    Rocek, Martin

    2007-01-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)

  14. Review of Current Standard Model Results in ATLAS

    CERN Document Server

    Brandt, Gerhard; The ATLAS collaboration

    2018-01-01

    This talk highlights results selected from the Standard Model research programme of the ATLAS Collaboration at the Large Hadron Collider. Results using data from $pp$ collisions at $\\sqrt{s}=7,8$~TeV in LHC Run-1, as well as results using data at $\\sqrt{s}=13$~TeV in LHC Run-2, are covered. The status of cross-section measurements from soft QCD processes, jet production, and photon production is presented, extending to vector boson production with associated jets. Precision measurements of the production of $W$ and $Z$ bosons, including a first measurement of the mass of the $W$ boson, $m_W$, are discussed. The programme to measure electroweak processes with di-boson and tri-boson final states is outlined. All presented measurements are compatible with Standard Model descriptions and allow it to be further constrained. In addition, they allow probing of new physics that would manifest through extra gauge couplings, or through Standard Model gauge couplings deviating from their predicted values.

  15. Impersonating the Standard Model Higgs boson: alignment without decoupling

    International Nuclear Information System (INIS)

    Carena, Marcela; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E.M.

    2014-01-01

    In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to “alignment without decoupling”, therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. Moreover, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the m_A − tan β parameter space

  16. Efficient Lattice-Based Signcryption in Standard Model

    Directory of Open Access Journals (Sweden)

    Jianhua Yan

    2013-01-01

    Signcryption is a cryptographic primitive that performs digital signature and public-key encryption simultaneously, at a significantly reduced cost compared with signing and encrypting separately. This advantage makes it highly useful in many applications. However, most existing signcryption schemes are seriously challenged by the rise of quantum computing. As an interesting stepping stone in the post-quantum cryptographic community, two lattice-based signcryption schemes were proposed recently, but both were merely proved secure in the random oracle model. The main contribution of this paper is therefore a new lattice-based signcryption scheme that can be proved secure in the standard model.

  17. Secure Certificateless Signature with Revocation in the Standard Model

    Directory of Open Access Journals (Sweden)

    Tung-Tso Tsai

    2014-01-01

    Previously proposed certificateless signature schemes were insecure under a considerably strong security model, in the sense that they suffered from outsiders' key-replacement attacks or attacks from the key generation center (KGC). In this paper, we propose a certificateless signature scheme without random oracles. Moreover, our scheme is secure under the strong security model and provides a public revocation mechanism, called revocable certificateless signature (RCLS). Under the standard computational Diffie-Hellman assumption, we formally demonstrate that our scheme possesses existential unforgeability against adaptive chosen-message attacks.

  18. Assessment of Usability Benchmarks: Combining Standardized Scales with Specific Questions

    Directory of Open Access Journals (Sweden)

    Stephanie Bettina Linek

    2011-12-01

    The usability of Web sites and online services is of rising importance. When creating a completely new Web site, qualitative data are adequate for identifying most of the usability problems. Changes to an existing Web site, however, should be evaluated through a quantitative benchmarking process. This paper describes the creation of a questionnaire that allows quantitative usability benchmarking, i.e., a direct comparison of the different versions of a Web site and an orientation against general usability standards. The questionnaire is also open to qualitative data. The methodology is illustrated using the digital library services of the ZBW.

  19. Assessment of Offshore Wind System Design, Safety, and Operation Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sirnivas, Senu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Musial, Walt [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bailey, Bruce [AWS Truepower LLC, Albany, NY (United States); Filippelli, Matthew [AWS Truepower LLC, Albany, NY (United States)

    2014-01-01

    This report is a deliverable for a project sponsored by the U.S. Department of Energy (DOE) entitled National Offshore Wind Energy Resource and Design Data Campaign -- Analysis and Collaboration (contract number DE-EE0005372; prime contractor -- AWS Truepower). The project objective is to supplement, facilitate, and enhance ongoing multiagency efforts to develop an integrated national offshore wind energy data network. The results of this initiative are intended to 1) produce a comprehensive definition of relevant met-ocean resource assets and needs and design standards, and 2) provide a basis for recommendations for meeting offshore wind energy industry data and design certification requirements.

  20. Data interchange standards in healthcare: semantic interoperability in preoperative assessment

    NARCIS (Netherlands)

    Ahmadian, L.

    2011-01-01

    To estimate the risks of an operation, an assessment is performed beforehand, but a standard for it is lacking. Leila Ahmadian developed a standardized core dataset based on expert consensus and the literature. The use of standardized data should reduce miscommunication and duplicate

  1. A Comprehensive Evaluation of Standardized Assessment Tools in the Diagnosis of Fibromyalgia and in the Assessment of Fibromyalgia Severity

    OpenAIRE

    Boomershine, Chad S.

    2012-01-01

    Standard assessments for fibromyalgia (FM) diagnosis and core FM symptom domains are needed for biomarker development and treatment trials. Diagnostic and symptom assessments are reviewed and recommendations are made for standards. Recommendations for existing assessments include the American College of Rheumatology FM classification criteria using the Manual Tender Point Survey for diagnosis, the Brief Pain Inventory average pain visual analogue scale for pain intensity, the function subscal...

  2. The Universal Thermal Climate Index UTCI compared to ergonomics standards for assessing the thermal environment.

    Science.gov (United States)

    Bröde, Peter; Błazejczyk, Krzysztof; Fiala, Dusan; Havenith, George; Holmér, Ingvar; Jendritzky, Gerd; Kuklane, Kalev; Kampmann, Bernhard

    2013-01-01

    The growing need for valid assessment procedures of the outdoor thermal environment in the fields of public weather services, public health systems, urban planning, tourism & recreation and climate impact research raised the idea to develop the Universal Thermal Climate Index UTCI based on the most recent scientific progress both in thermo-physiology and in heat exchange theory. Following extensive validation of accessible models of human thermoregulation, the advanced multi-node 'Fiala' model was selected to form the basis of UTCI. This model was coupled with an adaptive clothing model which considers clothing habits by the general urban population and behavioral changes in clothing insulation related to actual environmental temperature. UTCI was developed conceptually as an equivalent temperature. Thus, for any combination of air temperature, wind, radiation, and humidity, UTCI is defined as the air temperature in the reference condition which would elicit the same dynamic response of the physiological model. This review analyses the sensitivity of UTCI to humidity and radiation in the heat and to wind in the cold and compares the results with observational studies and internationally standardized assessment procedures. The capabilities, restrictions and potential future extensions of UTCI are discussed.
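The defining idea above, an equivalent temperature obtained by inverting the reference-condition response of the physiological model, can be sketched with a placeholder (hypothetical) strain function and simple bisection; the real UTCI uses the multi-node Fiala model, not the toy linear response below:

```python
def equivalent_temperature(response_actual, response_ref, lo=-60.0, hi=60.0, tol=1e-6):
    """Find the reference air temperature T such that response_ref(T) equals
    response_actual, by bisection. response_ref must be monotonic in T."""
    f = lambda t: response_ref(t) - response_actual
    a, b = lo, hi
    if f(a) * f(b) > 0:
        raise ValueError("response not bracketed in [lo, hi]")
    while b - a > tol:
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

# Placeholder physiological strain model in the reference condition:
# strain rises linearly with air temperature (purely illustrative).
ref_model = lambda t_air: 0.1 * t_air + 2.0

# An actual environment (some combination of wind, radiation, humidity) that
# elicits strain 4.5 gets the equivalent temperature solving 0.1*T + 2.0 = 4.5:
print(round(equivalent_temperature(4.5, ref_model), 3))  # → 25.0
```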

  3. 25 CFR 36.50 - Standard XVII-School program evaluation and needs assessment.

    Science.gov (United States)

    2010-04-01

    … assessment. Each school shall complete a formal, formative evaluation at least once every seven (7) years… each school, Agency or Area, as appropriate, a standardized needs assessment and evaluation instrument…

  4. Framework for Assessing the ICT Competency in Teachers up to the Requirements of "Teacher" Occupational Standard

    Science.gov (United States)

    Avdeeva, Svetlana; Zaichkina, Olga; Nikulicheva, Nataliya; Khapaeva, Svetlana

    2016-01-01

    The paper deals with problems of working out a test framework for the assessment of teachers' ICT competency in line with the requirements of "Teacher" occupational standard. The authors have analyzed the known approaches to assessing teachers' ICT competency--ISTE Standards and UNESCO ICT CFT and have suggested their own approach to…

  5. Assessing the Effects of Corporate Social Responsibility Standards in Global Value Chains

    DEFF Research Database (Denmark)

    Lund-Thomsen, Peter

    This paper considers the issue of corporate social responsibility (CSR) standard impact assessment in global value chains. CSR standards have proliferated in recent years, and several studies have attempted to assess their effects on local producers, workers, and the environment in developing...... good to the intended beneficiaries - developing country firms, farmers, workers, and communities - unless these ethical and political dilemmas are given serious consideration....

  6. 7 CFR 319.40-11 - Plant pest risk assessment standards.

    Science.gov (United States)

    2010-01-01

    … analysis to determine the plant pest risks associated with each requested importation in order to determine… Unmanufactured Wood Articles § 319.40-11 Plant pest risk assessment standards. When evaluating a request to…

  7. EMBEDding the CEFR in Academic Writing Assessment : A case study in training and standardization

    NARCIS (Netherlands)

    Haines, Kevin; Lowie, Wander; Jansma, Petra; Schmidt, Nicole

    2013-01-01

    The CEFR is increasingly being used as the framework of choice for the assessment of language proficiency at universities across Europe. However, to attain consistent assessment, familiarization and standardization are essential. In this paper we report a case study of embedding a standardization

  8. Non-generic couplings in supersymmetric standard models

    Directory of Open Access Journals (Sweden)

    Evgeny I. Buchbinder

    2015-09-01

    Full Text Available We study two phases of a heterotic standard model, obtained from a Calabi–Yau compactification of the E8×E8 heterotic string, in the context of the associated four-dimensional effective theories. In the first phase we have a standard model gauge group, an MSSM spectrum, four additional U(1) symmetries and singlet fields. In the second phase, obtained from the first by continuing along the singlet directions, three of the additional U(1) symmetries are spontaneously broken and the remaining one is a B–L symmetry. In this second phase, dimension five operators inducing proton decay are consistent with all symmetries and as such, they are expected to be present. We show that, contrary to this expectation, these operators are forbidden due to the additional U(1) symmetries present in the first phase of the model. We emphasise that such “unexpected” absences of operators, due to symmetry enhancement at specific loci in the moduli space, can be phenomenologically relevant and, in the present case, protect the model from fast proton decay.

  9. BiGG Models: A platform for integrating, standardizing and sharing genome-scale models

    DEFF Research Database (Denmark)

    King, Zachary A.; Lu, Justin; Dräger, Andreas

    2016-01-01

    Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized....... Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource...... for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data....
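The payoff of standardized identifiers is that models become directly comparable as sets of reactions and metabolites; a minimal sketch, using made-up BiGG-style reaction IDs rather than identifiers from any actual model, might look like this:

```python
def compare_models(rxns_a, rxns_b):
    """Compare two models' reaction identifier sets once both use the same
    standardized namespace; returns shared and model-specific reactions."""
    a, b = set(rxns_a), set(rxns_b)
    return {"shared": sorted(a & b),
            "only_a": sorted(a - b),
            "only_b": sorted(b - a)}

# Illustrative BiGG-style glycolysis reaction IDs (not from a specific model):
ecoli_like = ["PGI", "PFK", "FBA", "TPI"]
yeast_like = ["PGI", "PFK", "FBA", "PYK"]
print(compare_models(ecoli_like, yeast_like))
```

Without a shared namespace, the same intersection would require an error-prone mapping step for every pair of models; with it, comparison reduces to set arithmetic.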

  10. Flavour alignment in physics beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Braeuninger, Carolin Barbara

    2012-11-21

    There are numerous reasons to think that the Standard Model of physics is not the ultimate theory of nature on very small scales. However, attempts to construct theories that go beyond the Standard Model generically lead to high rates of flavour changing neutral processes that are in conflict with experiment: Quarks are the fundamental constituents of protons and neutrons. Together with electrons they form the visible matter of the universe. They come in three generations or 'flavours'. In interactions, quarks of different generations can mix, i.e. a quark of one flavour can transform into a quark of another flavour. In the Standard Model, at first order in perturbation theory, such processes occur only via the exchange of a charged particle. Flavour changing neutral processes can only arise in processes involving loops of charged particles. This is due to the fact that all couplings of two quarks to a neutral particle are diagonal in the basis of the mass eigenstates of the quarks. There is thus no mixing of quarks of different flavour at first order. Since the loop processes are suppressed by a loop factor, the Standard Model predicts very low rates for neutral processes that change the flavour of quarks. So far, this is in agreement with experiment. In extensions of the Standard Model, new couplings to the quarks are usually introduced. In general there is no reason why the new coupling matrices should be diagonal in the mass basis of the quarks. These models therefore predict high rates for processes that mix quarks of different flavour. Extensions of the Standard Model must therefore have a non-trivial flavour structure. A possibility to avoid flavour violation is to assume that the new couplings are aligned with the mass matrices of the quarks, i.e. diagonal in the same basis. This alignment could be due to a flavour symmetry. In this thesis, two extensions of the Standard Model with alignment are studied. The first is a simple

  11. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  12. An overview of failure assessment methods in codes and standards

    International Nuclear Information System (INIS)

    Zerbst, U.; Ainsworth, R.A.

    2003-01-01

    This volume provides comprehensive up-to-date information on the assessment of the integrity of engineering structures containing crack-like flaws, in the absence of effects of creep at elevated temperatures (see volume 5) and of environment (see volume 6). Key methods are extensively reviewed and background information as well as validation is given. However, it should be kept in mind that for actual detailed assessments the relevant documents have to be consulted. In classical engineering design, an applied stress is compared with the appropriate material resistance expressed in terms of a limit stress, such as the yield strength or fatigue endurance limit. As long as the material resistance exceeds the applied stress, integrity of the component is assured. It is implicitly assumed that the component is defect-free but design margins provide some protection against defects. Modern design and operation philosophies, however, take explicit account of the possible presence of defects in engineering components. Such defects may arise from fabrication, e.g., during casting, welding, or forming processes, or may develop during operation. They may extend during operation and eventually lead to failure, which in the ideal case occurs beyond the design life of the component. Failure assessment methods are based upon the behavior of sharp cracks in structures, and for this reason all flaws or defects found in structures have to be treated as if they are sharp planar cracks. Hence the terms flaw or defect should be regarded as being interchangeable with the term crack throughout this volume. (orig.)
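Many of the codes reviewed express this comparison between applied crack driving force and material resistance as a failure assessment diagram (FAD). As a hedged sketch, the Option 1 assessment line used in R6/BS 7910 can be checked as follows; note that the plastic-collapse cut-off L_r,max is material dependent, so the default of 1.0 below is only a placeholder:

```python
import math

def fad_option1(l_r):
    """Option 1 failure assessment line K_r = f(L_r), as in R6/BS 7910."""
    return (1 - 0.14 * l_r**2) * (0.3 + 0.7 * math.exp(-0.65 * l_r**6))

def is_acceptable(k_r, l_r, l_r_max=1.0):
    """A flaw is acceptable if the assessment point (L_r, K_r) lies inside
    the assessment line and below the plastic-collapse cut-off l_r_max."""
    return l_r <= l_r_max and k_r <= fad_option1(l_r)

# A lightly loaded flaw lies inside the diagram ...
print(is_acceptable(k_r=0.4, l_r=0.5))   # True
# ... while a crack driving force near the fracture toughness does not.
print(is_acceptable(k_r=1.1, l_r=0.5))   # False
```

Here K_r is the ratio of applied stress intensity factor to fracture toughness and L_r the ratio of applied load to the limit load, so the diagram interpolates between brittle fracture and plastic collapse as failure mechanisms.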

  13. Standard guide for three methods of assessing buried steel tanks

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1998-01-01

    1.1 This guide covers procedures to be implemented prior to the application of cathodic protection for evaluating the suitability of a tank for upgrading by cathodic protection alone. 1.2 Three procedures are described and identified as Methods A, B, and C. 1.2.1 Method A—Noninvasive with primary emphasis on statistical and electrochemical analysis of external site environment corrosion data. 1.2.2 Method B—Invasive ultrasonic thickness testing with external corrosion evaluation. 1.2.3 Method C—Invasive permanently recorded visual inspection and evaluation including external corrosion assessment. 1.3 This guide presents the methodology and the procedures utilizing site and tank specific data for determining a tank's condition and the suitability for such tanks to be upgraded with cathodic protection. 1.4 The tank's condition shall be assessed using Method A, B, or C. Prior to assessing the tank, a preliminary site survey shall be performed pursuant to Section 8 and the tank shall be tightness test...

  14. Risk assessment of manual material handling activities (case study: PT BRS Standard Industry)

    Science.gov (United States)

    Deviani; Triyanti, V.

    2017-12-01

    The process of moving material manually has the potential to injure workers, and the risk of injury increases if working conditions are neglected. The purpose of this study is to assess and analyze the injury risk level in manual material handling activity, as well as to improve working conditions. The observed manual material handling activities are pole lifting and goods loading. These activities were analyzed using the Job Strain Index method, the Rapid Entire Body Assessment, and Chaffin’s 2D Planar Static Model. The results show that most workers performing these activities face a high risk level, with JSI and REBA scores exceeding 9 points. For some activities, the estimated compression forces in the lumbar area also exceed the standard limit of 3400 N. Given this condition, several suggestions for improvement were made: improving the composition of packing, improving body posture, and creating guideline posters.
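The screening logic described, flagging a task when JSI or REBA scores exceed 9 points or estimated lumbar compression exceeds 3400 N, can be sketched as follows; the thresholds come from the text, but the function itself is illustrative rather than part of any of the named methods:

```python
# Spinal compression limit cited in the study (NIOSH action limit, in newtons)
# and the score cut-offs mentioned in the text (JSI and REBA above 9 = high).
COMPRESSION_LIMIT_N = 3400

def risk_level(jsi, reba, compression_n):
    """Flag a manual-handling task as high risk when any indicator exceeds
    the thresholds used in the study; returns the level and which
    indicators triggered it."""
    flags = []
    if jsi > 9:
        flags.append("JSI")
    if reba > 9:
        flags.append("REBA")
    if compression_n > COMPRESSION_LIMIT_N:
        flags.append("L5/S1 compression")
    return ("high", flags) if flags else ("acceptable", flags)

print(risk_level(jsi=13.5, reba=11, compression_n=3600))
```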

  15. Selected topics in phenomenology of the standard model

    International Nuclear Information System (INIS)

    Roberts, R.G.

    1992-01-01

    We begin with the structure of the proton, which is revealed through deep inelastic scattering of electrons/muons or neutrinos off nucleons. The quark parton model is described, which leads on to the theory of interacting quarks and gluons: quantum chromodynamics (QCD). From this, parton distributions can be extracted and then fed into the quark parton description of hadron-hadron collisions. In this way we analyse large-p_T jet production, prompt photon production, and dilepton, W and Z production (the Drell-Yan mechanism), ending with a study of heavy quark production. W and Z physics is then discussed. The various tree-level definitions of sin²θ_W are listed and the radiative corrections to them are briefly considered. Data from the Large Electron-Positron collider (LEP) then allow limits to be set on the masses of the top quark and the Higgs boson via these corrections. Standard model predictions for the various Z widths are compared with the latest LEP data. Electroweak effects in e⁺e⁻ scattering are discussed, together with the extraction of the various vector and axial-vector couplings involved. We return to QCD when the production of jets in e⁺e⁻ collisions is studied. Both the LEP and lower-energy data give quantitative estimates of the strong coupling α_s, and the consistency of these estimates with those from other QCD processes is discussed. The value of α_s(M_Z) in fact plays an important role in setting the scale of possible supersymmetry (SUSY) physics beyond the standard model. Finally, the subject of quark mixing is addressed: how the values of the various CKM matrix elements are derived is discussed, together with a very brief look at charge-parity (CP) violation and how the standard model is standing up to the latest measurements of ε′/ε. (Author)

  16. Standardization of Thermo-Fluid Modeling in Modelica.Fluid

    Energy Technology Data Exchange (ETDEWEB)

    Franke, Rudiger; Casella, Francesco; Sielemann, Michael; Proelss, Katrin; Otter, Martin; Wetter, Michael

    2009-09-01

    This article discusses the Modelica.Fluid library that has been included in the Modelica Standard Library 3.1. Modelica.Fluid provides interfaces and basic components for the device-oriented modeling of one-dimensional thermo-fluid flow in networks containing vessels, pipes, fluid machines, valves and fittings. A unique feature of Modelica.Fluid is that the component equations and the media models as well as pressure loss and heat transfer correlations are decoupled from each other. All components are implemented such that they can be used for media from the Modelica.Media library. This means that an incompressible or compressible medium, a single or a multiple substance medium with one or more phases might be used with one and the same model as long as the modeling assumptions made hold. Furthermore, trace substances are supported. Modeling assumptions can be configured globally in an outer System object. This covers in particular the initialization, uni- or bi-directional flow, and dynamic or steady-state formulation of mass, energy, and momentum balance. All assumptions can be locally refined for every component. While Modelica.Fluid contains a reasonable set of component models, the goal of the library is not to provide a comprehensive set of models, but rather to provide interfaces and best practices for the treatment of issues such as connector design and implementation of energy, mass and momentum balances. Applications from various domains are presented.

  17. Toward Standardizing a Lexicon of Infectious Disease Modeling Terms.

    Science.gov (United States)

    Milwid, Rachael; Steriu, Andreea; Arino, Julien; Heffernan, Jane; Hyder, Ayaz; Schanzer, Dena; Gardner, Emma; Haworth-Brockman, Margaret; Isfeld-Kiely, Harpa; Langley, Joanne M; Moghadas, Seyed M

    2016-01-01

    Disease modeling is increasingly being used to evaluate the effect of health intervention strategies, particularly for infectious diseases. However, the utility and application of such models are hampered by the inconsistent use of infectious disease modeling terms between and within disciplines. We sought to standardize the lexicon of infectious disease modeling terms and develop a glossary of terms commonly used in describing models' assumptions, parameters, variables, and outcomes. We combined a comprehensive literature review of relevant terms with an online forum discussion in a virtual community of practice, mod4PH (Modeling for Public Health). Using a convergent discussion process and consensus amongst the members of mod4PH, a glossary of terms was developed as an online resource. We anticipate that the glossary will improve inter- and intradisciplinary communication and will result in a greater uptake and understanding of disease modeling outcomes in heath policy decision-making. We highlight the role of the mod4PH community of practice and the methodologies used in this endeavor to link theory, policy, and practice in the public health domain.

  18. Modeling RHIC Using the Standard Machine Format Accelerator Description

    Science.gov (United States)

    Pilat, F.; Trahern, C. G.; Wei, J.; Satogata, T.; Tepikian, S.

    1997-05-01

    The Standard Machine Format (SMF) (N. Malitsky, R. Talman, et al., "A Proposed Flat Yet Hierarchical Accelerator Lattice Object Model", Particle Accel. 55, 313 (1996)) is a structured description of accelerator lattices which supports both the hierarchy of beam lines and generic lattice objects, as well as the deviations (field errors, misalignments, etc.) associated with each distinct component which are necessary for accurate modeling of beam dynamics. In this paper we discuss the use of SMF to describe the Relativistic Heavy Ion Collider (RHIC) as well as the ancillary data structures (such as field quality measurements) that are necessarily incorporated into the RHIC SMF model. Future applications of SMF are outlined, including its use in the RHIC operational environment.
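The core SMF idea, generic lattice objects shared across a beam-line hierarchy while each distinct component carries its own deviations, can be sketched in Python; the class and attribute names below are hypothetical illustrations, not SMF syntax:

```python
from dataclasses import dataclass, field

@dataclass
class ElementType:
    """Generic lattice object: the ideal, shared definition of an element."""
    name: str
    length: float

@dataclass
class ElementInstance:
    """A distinct component in a beam line: references its generic type and
    carries its own deviations (field errors, misalignments, ...)."""
    kind: ElementType
    deviations: dict = field(default_factory=dict)

@dataclass
class BeamLine:
    """Beam lines nest inside beam lines, giving the hierarchical side."""
    name: str
    items: list

quad = ElementType("QF", length=1.1)
arc = BeamLine("arc", [ElementInstance(quad, {"dx": 0.0002}),
                       ElementInstance(quad, {"roll": 1e-4})])
ring = BeamLine("ring", [arc, arc])

# Both instances share one ideal definition but keep distinct deviations:
print([inst.deviations for inst in ring.items[0].items])
```

The separation mirrors the flat-yet-hierarchical design: editing the generic `QF` definition updates every use, while measured errors stay attached to individual components.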

  19. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. 
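The probabilistic workflow described, sampling uncertain inputs from distributions, propagating each realization through the model, and comparing output statistics to a performance objective, can be sketched as follows; all distributions, parameter values, and the dose limit are illustrative, not site data or GoldSim output:

```python
import random
import statistics

def simulate_doses(n_realizations, seed=1):
    """Toy probabilistic dose model: each Monte Carlo realization samples
    uncertain inputs and propagates them to an annual dose estimate."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n_realizations):
        release = rng.lognormvariate(mu=0.0, sigma=0.5)   # source term, toy scale
        dilution = rng.uniform(1e-3, 5e-3)                # transport/dilution factor
        dose_factor = rng.normalvariate(2.0, 0.2)         # mSv per unit concentration
        doses.append(release * dilution * dose_factor)
    return doses

doses = simulate_doses(10_000)
limit_msv = 0.25  # hypothetical performance objective
p95 = sorted(doses)[int(0.95 * len(doses))]
print(f"mean={statistics.mean(doses):.4f} mSv, 95th percentile={p95:.4f} mSv, "
      f"meets objective: {p95 < limit_msv}")
```

The output distribution, rather than a single number, is what gets compared to the regulatory objective, which is exactly how a probabilistic analysis exposes the uncertainty a deterministic one hides.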

  20. Validation of the Standard Method for Assessing Flicker From Wind Turbines

    DEFF Research Database (Denmark)

    Barahona Garzon, Braulio; Sørensen, Poul Ejnar; Christensen, L.

    2011-01-01

    This paper studies the validity of the standard method in IEC 61400-21 for assessing the flicker emission from multiple wind turbines. The standard method is based on testing a single wind turbine and then using the results of this test to assess the flicker emission from a number of wind turbines...... the flicker emission at the collection line; this assessment is then compared to the actual measurements in order to study the accuracy of the estimation. It was observed in both wind farms, that the assessment based on the standard method is statistically conservative compared to the measurements. The reason...... for this is the statistical characteristics of flicker emission....
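Assuming the IEC 61400-21 summation for continuous operation, the wind-farm flicker level is estimated from single-turbine test results roughly as follows; this is a sketch of the standard method being validated, and omits the separate treatment of switching operations:

```python
import math

def flicker_multiple_turbines(c_and_sn, s_k):
    """Continuous-operation flicker at a point of common coupling from N
    turbines, per the IEC 61400-21 summation: single-turbine flicker
    coefficients c_i (from the type test) and rated apparent powers S_n,i
    are combined quadratically and normalised by the grid short-circuit
    apparent power S_k."""
    return math.sqrt(sum((c * s_n) ** 2 for c, s_n in c_and_sn)) / s_k

# Five identical 2-MVA turbines with flicker coefficient 6 on a 50-MVA grid:
turbines = [(6.0, 2.0)] * 5
print(round(flicker_multiple_turbines(turbines, s_k=50.0), 4))  # → 0.5367
```

The quadratic summation assumes statistically independent emissions from the individual turbines, which is one reason the standard method tends to be conservative relative to measurements, as the paper reports.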

  1. Standard Format for Chromatographic-polarimetric System small samples assessment

    International Nuclear Information System (INIS)

    Naranjo, S.; Fajer, V.; Fonfria, C.; Patinno, R.

    2012-01-01

    The treatment of samples containing optically active substances, evaluated as part of the quality control of raw material entering an industrial process and during the modifications it undergoes to reach the desired final composition, is still an unsolved problem for many industries. That is the case in the sugarcane industry. The difficulties are sometimes compounded because the samples to be evaluated are no larger than one milliliter. Reductions of the gel beds in G-10 and G-50 chromatographic columns, with an inner diameter of 16 mm instead of 25 mm and bed heights adjustable to requirements by means of sliding stoppers to increase analytical power, were evaluated with glucose and sucrose standards at concentrations from 1 to 10 g/dL, using 1-mL aliquots without undesirable dilutions that could affect either detection or the chromatographic profile. Assays with seaweed extracts gave good results, which are shown. The advantage of determining the concentration of a separated substance from the height of its peak, and the resulting savings in time and reagents, are established. The expanded uncertainty of samples in both systems is compared. Several programs for data acquisition, storage and processing are also presented. (Author)
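Determining concentration from peak height amounts to a linear calibration against the standards; a minimal sketch with hypothetical detector readings (the 1-10 g/dL range comes from the abstract, the peak heights are invented for illustration):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical calibration: peak heights (arbitrary detector units) measured
# for sucrose standards of 1-10 g/dL.
concs   = [1, 2, 4, 6, 8, 10]
heights = [12.1, 23.8, 48.2, 71.9, 96.0, 119.8]

# Regress concentration on height so unknowns can be read off directly.
slope, intercept = fit_line(heights, concs)
unknown_height = 60.0
print(f"estimated concentration: {slope * unknown_height + intercept:.2f} g/dL")
```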

  2. Early universe cosmology. In supersymmetric extensions of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Baumann, Jochen Peter

    2012-03-19

    In this thesis we investigate possible connections between cosmological inflation and leptogenesis on the one side and particle physics on the other side. We work in supersymmetric extensions of the Standard Model. A key role is played by the right-handed sneutrino, the superpartner of the right-handed neutrino involved in the type I seesaw mechanism. We study a combined model of inflation and non-thermal leptogenesis that is a simple extension of the Minimal Supersymmetric Standard Model (MSSM) with conserved R-parity, where we add three right-handed neutrino superfields. The inflaton direction is given by the imaginary components of the corresponding scalar component fields, which are protected from the supergravity (SUGRA) η-problem by a shift symmetry in the Kähler potential. We discuss the model first in a globally supersymmetric (SUSY) and then in a supergravity context and compute the inflationary predictions of the model. We also study reheating and non-thermal leptogenesis in this model. A numerical simulation shows that shortly after the waterfall phase transition that ends inflation, the universe is dominated by right-handed sneutrinos and their out-of-equilibrium decay can produce the desired matter-antimatter asymmetry. Using a simplified time-averaged description, we derive analytical expressions for the model predictions. Combining the results from inflation and leptogenesis allows us to constrain the allowed parameter space from two different directions, with implications for low energy neutrino physics. As a second thread of investigation, we discuss a generalisation of the inflationary model discussed above to include gauge non-singlet fields as inflatons. This is motivated by the fact that in left-right symmetric, supersymmetric Grand Unified Theories (SUSY GUTs), like SUSY Pati-Salam unification or SUSY SO(10) GUTs, the right-handed (s)neutrino is an indispensable ingredient and does not have to be put in by hand as in the MSSM.

  3. Standardization of natural phenomena risk assessment methodology at the Savannah River Plant

    International Nuclear Information System (INIS)

    Huang, J.C.; Hsu, Y.S.

    1985-01-01

    Safety analyses at the Savannah River Plant (SRP) normally require consideration of the risks of incidents caused by natural events such as high-velocity straight winds, tornadic winds, and earthquakes. The probabilities for these events to occur at SRP had been studied independently by several investigators, but the results of their studies were never systematically evaluated. As part of the endeavor to standardize our environmental risk assessment methodology, these independent studies have been thoroughly reviewed and critiqued, and appropriate probability models for these natural events have been selected. The selected probability models for natural phenomena, high-velocity straight winds and tornadic winds in particular, are in agreement with those being used at other DOE sites, and have been adopted as a guide for all safety studies conducted for SRP operations and facilities. 7 references, 3 figures
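Extreme-wind probability models of the kind being standardized are commonly drawn from the Gumbel (Type I extreme value) family; the following sketch uses illustrative parameters, not SRP data or the distributions actually selected in the report:

```python
import math

def gumbel_exceedance(speed, mu, beta):
    """Annual probability that the yearly maximum wind speed exceeds `speed`,
    under a Gumbel (Type I extreme value) model with location mu and scale
    beta, a common choice for straight-wind hazard curves."""
    return 1.0 - math.exp(-math.exp(-(speed - mu) / beta))

def return_period_speed(t_years, mu, beta):
    """Wind speed with return period t_years (annual exceedance 1/t)."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / t_years))

# Illustrative parameters: location 25 m/s, scale 4 m/s.
mu, beta = 25.0, 4.0
v100 = return_period_speed(100, mu, beta)
print(f"100-yr wind: {v100:.1f} m/s, "
      f"annual exceedance: {gumbel_exceedance(v100, mu, beta):.4f}")
```

Fitting such a curve to site wind records gives the event frequencies that feed directly into the risk estimates the safety analyses require.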

  4. Detecting physics beyond the Standard Model with the REDTOP experiment

    Science.gov (United States)

    González, D.; León, D.; Fabela, B.; Pedraza, M. I.

    2017-10-01

    REDTOP is an experiment at its proposal stage. It belongs to the high-intensity class of experiments. REDTOP will use a 1.8 GeV continuous proton beam impinging on a fixed target and is expected to produce about 10¹³ η mesons per year. The main goal of REDTOP is to look for physics beyond the Standard Model by detecting rare η decays. The detector is designed with innovative technologies based on the detection of prompt Cherenkov light, such that interesting events can be observed and background events are efficiently rejected. The experimental design, the physics program and the running plan of the experiment are presented.

  5. CP asymmetry in Bd→φKS: Standard model pollution

    International Nuclear Information System (INIS)

    Grossman, Y.; Isidori, G.; Worah, M.P.

    1998-01-01

    The difference in the time-dependent CP asymmetries between the modes B→ψK_S and B→φK_S is a clean signal for physics beyond the standard model. This interpretation could fail if there is a large enhancement of the matrix element of the b→uūs operator between the B_d initial state and the φK_S final state. We argue against this possibility and propose some experimental tests that could shed light on the situation. copyright 1998 The American Physical Society

  6. What is special about the group of the standard model?

    International Nuclear Information System (INIS)

    Nielsen, H.B.; Brene, N.

    1989-03-01

    The standard model is based on the algebra of U(1)×SU(2)×SU(3). The systematics of charges of the fundamental fermions seems to suggest the importance of a particular group having this algebra, viz. S(U(2)×U(3)). This group is distinguished from all other connected compact non-semisimple groups of dimension up to 12 by a characteristic property: it is very 'skew'. By this we mean that the group has relatively few 'generalised outer automorphisms'. One may speculate about physical reasons for this fact. (orig.)

  7. Baryon number dissipation at finite temperature in the standard model

    International Nuclear Information System (INIS)

    Mottola, E.; Raby, S.; Starkman, G.

    1990-01-01

    We analyze the phenomenon of baryon number violation at finite temperature in the standard model, and derive the relaxation rate for the baryon density in the high-temperature electroweak plasma. The relaxation rate γ is given in terms of real-time correlation functions of the operator E·B, and is directly proportional to the sphaleron transition rate Γ: γ ∝ n_f Γ/T³. Hence it is not instanton suppressed, as claimed by Cohen, Dugan and Manohar (CDM). We show explicitly how this result is consistent with the methods of CDM, once it is recognized that a new anomalous commutator is required in their approach. 19 refs., 2 figs

  8. Dark Matter and Color Octets Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Krnjaic, Gordan Zdenko [Johns Hopkins Univ., Baltimore, MD (United States)

    2012-07-01

    Although the Standard Model (SM) of particles and interactions has survived forty years of experimental tests, it does not provide a complete description of nature. From cosmological and astrophysical observations, it is now clear that the majority of matter in the universe is not baryonic and interacts very weakly (if at all) via non-gravitational forces. The SM does not provide a dark matter candidate, so new particles must be introduced. Furthermore, recent Tevatron results suggest that SM predictions for benchmark collider observables are in tension with experimental observations. In this thesis, we will propose extensions to the SM that address each of these issues.

  9. B_{s,d} → l⁺l⁻ in the Standard Model

    CERN Document Server

    Bobeth, Christoph; Hermann, Thomas; Misiak, Mikolaj; Stamou, Emmanuel; Steinhauser, Matthias

    2014-01-01

    We combine our new results for the O(α_em) and O(α_s²) corrections to B_{s,d} → l⁺l⁻, and present updated branching ratio predictions for these decays in the standard model. Inclusion of the new corrections removes major theoretical uncertainties of perturbative origin that have just begun to dominate over the parametric ones. For the recently observed muonic decay of the B_s meson, our calculation gives BR(B_s → μ⁺μ⁻) = (3.65 ± 0.23) × 10⁻⁹.

  10. Future high precision experiments and new physics beyond Standard Model

    International Nuclear Information System (INIS)

    Luo, Mingxing.

    1993-01-01

    High precision (< 1%) electroweak experiments that have been done, or are likely to be done in this decade, are examined on the basis of Standard Model (SM) predictions of fourteen weak neutral current observables and fifteen W and Z properties at the one-loop level. The implications of the corresponding experimental measurements for various types of possible new physics that enter at the tree or loop level are investigated. Certain experiments appear to have special promise as probes of the new physics considered here

  11. The strong interactions beyond the standard model of particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bergner, Georg [Muenster Univ. (Germany). Inst. for Theoretical Physics

    2016-11-01

    SuperMUC is one of the most suitable high-performance machines for our project, since it offers both high performance and flexibility across different applications. This is of particular importance for investigations of new theories, where on the one hand the parameters and systematic uncertainties have to be estimated in smaller simulations, and on the other hand large computational performance is needed to set the scale at zero temperature. Our project is only a first investigation of new physics beyond the standard model of particle physics, and we hope to proceed with our studies towards more involved Technicolour candidates, supersymmetric QCD, and extended supersymmetry.

  12. High Mass Standard Model Higgs searches at the Tevatron

    Directory of Open Access Journals (Sweden)

    Petridis Konstantinos A.

    2012-06-01

    Full Text Available We present the results of searches for the Standard Model Higgs boson decaying predominantly to W⁺W⁻ pairs, at a center-of-mass energy of √s = 1.96 TeV, using up to 8.2 fb⁻¹ of data collected with the CDF and D0 detectors at the Fermilab Tevatron collider. The analysis techniques and the various channels considered are discussed. These searches result in exclusions across the Higgs mass range of 156.5 < m_H < 173.7 GeV for CDF and 161 < m_H < 170 GeV for D0.

  13. Coset Space Dimensional Reduction approach to the Standard Model

    International Nuclear Information System (INIS)

    Farakos, K.; Kapetanakis, D.; Koutsoumbas, G.; Zoupanos, G.

    1988-01-01

    We present a unified theory in ten dimensions based on the gauge group E₈, which is dimensionally reduced to the Standard Model group SU(3)_c×SU(2)_L×U(1), which breaks further spontaneously to SU(3)_c×U(1)_em. The model gives similar predictions for sin²θ_W and proton decay as the minimal SU(5) GUT, while a natural choice of the coset space radii predicts light Higgs masses à la Coleman-Weinberg

  14. Modeling RHIC using the standard machine formal accelerator description

    International Nuclear Information System (INIS)

    Pilat, F.; Trahern, C.G.; Wei, J.

    1997-01-01

    The Standard Machine Format (SMF) is a structured description of accelerator lattices which supports both the hierarchy of beam lines and generic lattice objects as well as those deviations (field errors, alignment efforts, etc.) associated with each component of the as-installed machine. In this paper we discuss the use of SMF to describe the Relativistic Heavy Ion Collider (RHIC) as well as the ancillary data structures (such as field quality measurements) that are necessarily incorporated into the RHIC SMF model. Future applications of SMF are outlined, including its use in the RHIC operational environment

  15. Standardized assessment of tumor-infiltrating lymphocytes in breast cancer

    DEFF Research Database (Denmark)

    Tramm, Trine; Di Caterino, Tina; Jylling, Anne Marie B.

    2018-01-01

    Introduction: In breast cancer, there is a growing body of evidence that tumor-infiltrating lymphocytes (TILs) may have clinical utility and may be able to direct clinical decisions for subgroups of patients. Clinical utility is, however, not sufficient for warranting the implementation of a new … by the International TILs Working Group 2014, applied to a cohort of breast cancers reflecting an average breast cancer population. Material and methods: Stromal TILs were assessed using full slide sections from 124 breast cancers with varying histology, malignancy grade and ER- and HER2 status. TILs were estimated by nine dedicated breast pathologists using scanned hematoxylin–eosin stainings. TILs results were categorized using various cutoffs, and the inter-observer agreement was evaluated using the intraclass coefficient (ICC), Kappa statistics as well as individual overall agreements with the median value …

  16. Independent donor ethical assessment: aiming to standardize donor advocacy.

    Science.gov (United States)

    Choudhury, Devasmita; Jotterand, Fabrice; Casenave, Gerald; Smith-Morris, Carolyn

    2014-06-01

    Living organ donation has become more common across the world. To ensure an informed consent process, given the complex issues involved with organ donation, independent donor advocacy is required. The choice of how donor advocacy is administered is left up to each transplant center. This article presents the experience and process of donor advocacy at University of Texas Southwestern Medical Center administered by a multidisciplinary team consisting of physicians, surgeons, psychologists, medical ethicists and anthropologists, lawyers, a chaplain, a living kidney donor, and a kidney transplant recipient. To ensure that advocacy remains fair and consistent for all donors being considered, the donor advocacy team at University of Texas Southwestern Medical Center developed the Independent Donor Ethical Assessment, a tool that may be useful to others in rendering donor advocacy. In addition, the tool may be modified as circumstances arise to improve donor advocacy and maintain uniformity in decision making.

  17. Physics beyond the standard model in the non-perturbative unification scheme

    International Nuclear Information System (INIS)

    Kapetanakis, D.; Zoupanos, G.

    1990-01-01

    The non-perturbative unification scenario predicts reasonably well the low-energy gauge couplings of the standard model. Agreement with the measured low-energy couplings is obtained by assuming a certain kind of physics beyond the standard model. A number of possibilities for physics beyond the standard model are examined. The best candidates so far are the standard model with eight fermionic families and a similar number of Higgs doublets, and the supersymmetric standard model with five families. (author)

  18. Big bang nucleosynthesis: The standard model and alternatives

    Science.gov (United States)

    Schramm, David N.

    1991-01-01

    Big bang nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the big bang cosmological model. This paper reviews the standard homogeneous-isotropic calculation and shows how it fits the light element abundances ranging from ⁴He at 24% by mass through ²H and ³He at parts in 10⁵ down to ⁷Li at parts in 10¹⁰. Furthermore, the recent Large Electron-Positron collider (LEP) (and Stanford Linear Collider (SLC)) results on the number of neutrinos are discussed as a positive laboratory test of the standard scenario. Discussion is presented on the improved observational data as well as the improved neutron lifetime data. Alternate scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the conclusion that Ω_b ≈ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible is less than Ω_b.

  19. Big bang nucleosynthesis: The standard model and alternatives

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1991-01-01

    Big bang nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the big bang cosmological model. This paper reviews the standard homogeneous-isotropic calculation and shows how it fits the light element abundances ranging from ⁴He at 24% by mass through ²H and ³He at parts in 10⁵ down to ⁷Li at parts in 10¹⁰. Furthermore, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard scenario. Discussion is presented on the improved observational data as well as the improved neutron lifetime data. Alternate scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the conclusion that Ω_b ≅ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible < Ω_b. (orig.)

  20. Standard Model CP-violation and baryon asymmetry

    CERN Document Server

    Gavela, M.B.; Orloff, J.; Pene, O.

    1994-01-01

    Simply based on CP arguments, we argue against a Standard Model explanation of the baryon asymmetry of the universe in the presence of a first order phase transition. A CP asymmetry is found in the reflection coefficients of quarks hitting the phase boundary created during the electroweak transition. The problem is analyzed both in an academic zero-temperature case and in the realistic finite-temperature one. The building blocks are similar in both cases: Kobayashi-Maskawa CP violation, CP-even phases in the reflection coefficients of quarks, and physical transitions due to fermion self-energies. In both cases an effect is present at order α_W² in rate. A standard GIM behaviour is found, as intuitively expected. In the finite-temperature case, a crucial role is played by the damping rate of quasi-particles in a hot plasma, which is a relevant scale together with M_W and the temperature. The effect is many orders of magnitude below what observation requires, and indicates that non-standard physics is …

  1. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance (PR) is one of the requirements to be met in GEN IV and INPRO for next-generation nuclear energy systems. Internationally, work on PR evaluation methodology began around 1980, but systematic development only started in the 2000s. In Korea, independent development of a PR evaluation methodology was started in 2007 as a long-term nuclear R and D project, motivated by the export of nuclear energy systems and by the need to increase the international credibility and transparency of the domestic nuclear system and fuel cycle development. In the first year, a comparative study of GEN-IV/INPRO, the development of PR indicators, the quantification of indicators and development of an evaluation model, and an analysis of the technology system and of international technology development trends were performed. In the second year, a feasibility study of the indicators, allowable limits for the indicators, and a review of their technical requirements were carried out. The results of a PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through this development of PR evaluation methodology, the methodology is also expected to feed into the regulatory requirements for authorization and permission that are to be developed

  2. Dose assessment models. Annex A

    International Nuclear Information System (INIS)

    1982-01-01

    The models presented in this chapter have been separated into 2 general categories: environmental transport models which describe the movement of radioactive materials through all sectors of the environment after their release, and dosimetric models to calculate the absorbed dose following an intake of radioactive materials or exposure to external irradiation. Various sections of this chapter also deal with atmospheric transport models, terrestrial models, and aquatic models.

  3. How to use the Standard Model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Henning, Brian; Lu, Xiaochuan [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Murayama, Hitoshi [Department of Physics, University of California, Berkeley,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States); Kavli Institute for the Physics and Mathematics of the Universe (WPI),Todai Institutes for Advanced Study, University of Tokyo,Kashiwa 277-8583 (Japan)

    2016-01-05

    We present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.

  4. How to use the Standard Model effective field theory

    Science.gov (United States)

    Henning, Brian; Lu, Xiaochuan; Murayama, Hitoshi

    2016-01-01

    We present a practical three-step procedure of using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.

  5. Electroweak baryogenesis in extensions of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Fromme, L.

    2006-07-07

    We investigate the generation of the baryon asymmetry in two extensions of the Standard Model: the φ⁶ model and the two-Higgs-doublet model. Analyzing the thermal potential in the presence of CP violation, we find a strong first order phase transition for a wide range of parameters in both models. We compute the relevant bubble wall properties which then enter the transport equations. In non-supersymmetric models electroweak baryogenesis is dominated by top transport, which we treat in the WKB approximation. We calculate the CP-violating source terms starting from the Dirac equation. We show how to resolve discrepancies between this treatment and the computation in the Schwinger-Keldysh formalism. Furthermore, we keep inelastic scatterings of quarks and W bosons at a finite rate, which considerably affects the amount of the generated baryon asymmetry depending on the bubble wall velocity. In addition, we improve the transport equations by novel source terms which are generated by CP-conserving perturbations in the plasma. It turns out that their effect is relatively small. Both models under consideration predict a baryon-to-entropy ratio close to the observed value for a large part of the parameter space without being in conflict with constraints on electric dipole moments. (orig.)

  6. Comparative life cycle assessment of standard and green roofs.

    Science.gov (United States)

    Saiz, Susana; Kennedy, Christopher; Bass, Brad; Pressnail, Kim

    2006-07-01

    Life cycle assessment (LCA) is used to evaluate the benefits, primarily from reduced energy consumption, resulting from the addition of a green roof to an eight-story residential building in Madrid. Building energy use is simulated and a bottom-up LCA is conducted assuming a 50 year building life. The key property of a green roof is its low solar absorptance, which causes lower surface temperature, thereby reducing the heat flux through the roof. Savings in annual energy use are just over 1%, but summer cooling load is reduced by over 6% and reductions in peak-hour cooling load in the upper floors reach 25%. By replacing the common flat roof with a green roof, environmental impacts are reduced by between 1.0 and 5.3%. Similar reductions might be achieved by using a white roof with additional insulation for winter, but more substantial reductions are achieved if common use of green roofs leads to reductions in the urban heat island.

  7. Decay of the standard model Higgs field after inflation

    CERN Document Server

    Figueroa, Daniel G; Torrenti, Francisco

    2015-01-01

    We study the nonperturbative dynamics of the Standard Model (SM) after inflation, in the regime where the SM is decoupled from (or weakly coupled to) the inflationary sector. We use classical lattice simulations in an expanding box in (3+1) dimensions, modeling the SM gauge interactions with both global and Abelian-Higgs analogue scenarios. We consider different post-inflationary expansion rates. During inflation, the Higgs forms a condensate, which starts oscillating soon after inflation ends. Via nonperturbative effects, the oscillations lead to a fast decay of the Higgs into the SM species, transferring most of the energy into Z and W± bosons. All species are initially excited far away from equilibrium, but their interactions lead them into a stationary stage, with exact equipartition among the different energy components. From there on the system eventually reaches equilibrium. We have characterized in detail, in the different expansion histories considered, the evolution of the Higgs and of its …

  8. CP Violating B Decays in the Standard Model and Supersymmetry

    International Nuclear Information System (INIS)

    Ciuchini, M.; Franco, E.; Martinelli, G.; Masiero, A.; Silvestrini, L.

    1997-01-01

    We study the uncertainties of the standard model (SM) predictions for CP violating B decays and investigate where and how supersymmetric (SUSY) contributions may be disentangled. The first task is accomplished by letting the relevant matrix elements of the effective Hamiltonian vary within certain ranges. The SUSY analysis makes use of a formalism which allows one to obtain model-independent results. We show that in some cases it is possible (a) to measure the CP-violating B–B̄ mixing phase and (b) to discriminate the SM and SUSY contributions to the CP decay phases. The gold-plated decays in this respect are the B→φK_S and B→K_S π⁰ channels. copyright 1997 The American Physical Society

  9. Electro symmetry breaking and beyond the standard model

    International Nuclear Information System (INIS)

    Barklow, T.; Dawson, S.; Haber, H.E.

    1995-05-01

    The development of the Standard Model of particle physics is a remarkable success story. Its many facets have been tested at present day accelerators; no significant unambiguous deviations have yet been found. In some cases, the model has been verified at an accuracy of better than one part in a thousand. This state of affairs presents our field with a challenge. Where do we go from here? What is our vision for future developments in particle physics? Are particle physicists' recent successes a signal of the field's impending demise, or do real long-term prospects exist for further progress? We assert that the long-term health and intellectual vitality of particle physics depends crucially on the development of a new generation of particle colliders that push the energy frontier by an order of magnitude beyond present capabilities. In this report, we address the scientific issues underlying this assertion

  10. Modeling the wet bulb globe temperature using standard meteorological measurements.

    Science.gov (United States)

    Liljegren, James C; Carhart, Richard A; Lawday, Philip; Tschopp, Stephen; Sharp, Robert

    2008-10-01

    The U.S. Army has a need for continuous, accurate estimates of the wet bulb globe temperature to protect soldiers and civilian workers from heat-related injuries, including workers involved in the storage and destruction of aging chemical munitions at depots across the United States. At these depots, workers must don protective clothing that increases their risk of heat-related injury. Because of the difficulty in making continuous, accurate measurements of the wet bulb globe temperature outdoors, the authors have developed a model of the wet bulb globe temperature that relies only on standard meteorological data available at each storage depot for input. The model is composed of separate submodels of the natural wet bulb and globe temperatures that are based on fundamental principles of heat and mass transfer, has no site-dependent parameters, and achieves an accuracy of better than 1°C based on comparisons with wet bulb globe temperature measurements at all depots.
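The final combination step of such a model can be sketched as follows. This is a hedged illustration, not the authors' code: the 0.7/0.2/0.1 weighting is the standard outdoor WBGT definition (ISO 7243), while the natural wet bulb and globe temperatures would come from the heat- and mass-transfer submodels described in the abstract; the numeric inputs below are hypothetical.

```python
def wbgt_outdoor(t_nw: float, t_g: float, t_a: float) -> float:
    """Combine component temperatures (all in deg C) into the outdoor WBGT.

    t_nw: natural wet bulb temperature (from a wet-bulb submodel)
    t_g:  globe temperature (from a globe-temperature submodel)
    t_a:  dry bulb (air) temperature (measured directly)
    The 0.7/0.2/0.1 weights are the standard outdoor WBGT definition.
    """
    return 0.7 * t_nw + 0.2 * t_g + 0.1 * t_a

# Hypothetical warm, sunny conditions
print(round(wbgt_outdoor(t_nw=25.0, t_g=45.0, t_a=32.0), 1))  # → 29.7
```

Note that the hard part of the published model is precisely estimating t_nw and t_g from routine meteorological data; the weighted sum itself is fixed by the WBGT definition.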

  11. Electro symmetry breaking and beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Barklow, T. [Stanford Linear Accelerator Center, Menlo Park, CA (United States); Dawson, S. [Brookhaven National Lab., Upton, NY (United States); Haber, H.E. [California Univ., Santa Cruz, CA (United States). Inst. for Particle Physics; Siegrist, J. [Lawrence Berkeley Lab., CA (United States)

    1995-05-01

    The development of the Standard Model of particle physics is a remarkable success story. Its many facets have been tested at present day accelerators; no significant unambiguous deviations have yet been found. In some cases, the model has been verified at an accuracy of better than one part in a thousand. This state of affairs presents our field with a challenge. Where do we go from here? What is our vision for future developments in particle physics? Are particle physicists' recent successes a signal of the field's impending demise, or do real long-term prospects exist for further progress? We assert that the long-term health and intellectual vitality of particle physics depends crucially on the development of a new generation of particle colliders that push the energy frontier by an order of magnitude beyond present capabilities. In this report, we address the scientific issues underlying this assertion.

  12. Neutron electric dipole moment in the minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Inui, T.; Mimura, Y.; Sakai, N.; Sasaki, T.

    1995-01-01

    The neutron electric dipole moment (EDM) due to the single-quark EDM and to the transition EDM is calculated in the minimal supersymmetric standard model. Assuming that the Cabibbo-Kobayashi-Maskawa matrix at the grand unification scale is the only source of CP violation, complex phases are induced in the parameters of soft supersymmetry breaking at low energies. The chargino one-loop diagram is found to give the dominant contribution, of the order of 10⁻²⁷–10⁻²⁹ e·cm for the quark EDM, assuming the light chargino mass and the universal scalar mass to be 50 GeV and 100 GeV, respectively. Therefore the neutron EDM in this class of models is difficult to measure experimentally. The gluino one-loop diagram also contributes due to the flavor-changing gluino coupling. The transition EDM is found to give dominant contributions for certain parameter regions. (orig.)

  13. Potential growing model for the standard carnation cv. Delphi

    Directory of Open Access Journals (Sweden)

    Miguel Ángel López M.

    2014-08-01

    Full Text Available The cut flower business requires exact synchronicity between product supply and demand in consumer countries. Tools that help to improve this synchronicity through predictions or crop growth monitoring could provide an important advantage for scheduling standard and corrective agronomic practices. At the Centro de Biotecnología Agropecuaria, SENA (SENA's Biotechnology, Agricultural and Livestock Center), located in Mosquera, Cundinamarca, a trial with the standard carnation cv. Delphi grown under greenhouse conditions was carried out. The objective of this study was to build a simple model of dry matter (DM) production and partitioning in carnation flower stems. The model was based on the photosynthetically active radiation (PAR, MJ m⁻² d⁻¹) and temperature as exogenous variables and assumed no water or nutrient limitations and no damage caused by pests, diseases or weeds. In this model, the daily DM increase depended on the PAR, the light fraction intercepted by the foliage (F_LINT) and the light use efficiency (LUE, g MJ⁻¹). The LUE in the vegetative and reproductive stages reached values of 1.31 and 0.74 g MJ⁻¹, respectively. The estimated extinction coefficient (k) was 0.53 and the maximum F_LINT was between 0.79 and 0.82. Partitioning between the vegetative and reproductive stages was modeled on the hypothesis that the partition is regulated by the source–sink relationship. The estimated partition coefficients for the vegetative stage were 0.63 for the leaves and 0.37 for the stems. During the reproductive stage, the partitioning coefficients of leaves, stems and flower buds were 0.05, 0.74, and 0.21, respectively.
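The daily dry-matter step described in the abstract can be sketched as follows. This is a hedged reconstruction: the Beer's-law form of light interception is an assumed functional shape, while k = 0.53, the ~0.80 F_LINT ceiling, the LUE values and the partition coefficients are the values reported in the abstract; the PAR and LAI inputs are hypothetical.

```python
import math

def f_lint(lai: float, k: float = 0.53, f_max: float = 0.80) -> float:
    """Fraction of PAR intercepted by the canopy, capped at the observed maximum.

    Uses a Beer's-law form (assumed shape) with the reported extinction
    coefficient k = 0.53 and a ceiling near the reported maximum F_LINT.
    """
    return min(f_max, 1.0 - math.exp(-k * lai))

def daily_dm_gain(par: float, lai: float, lue: float) -> float:
    """Daily DM increase (g m^-2 d^-1) = PAR (MJ m^-2 d^-1) x intercepted fraction x LUE (g MJ^-1)."""
    return par * f_lint(lai) * lue

# Hypothetical vegetative-stage day: PAR = 8 MJ m^-2 d^-1, LAI = 2,
# with the reported vegetative LUE of 1.31 g MJ^-1.
dm = daily_dm_gain(8.0, 2.0, 1.31)
# Reported vegetative partition coefficients: 0.63 leaves, 0.37 stems.
leaves, stems = 0.63 * dm, 0.37 * dm
```

In the reproductive stage one would switch to LUE = 0.74 g MJ⁻¹ and the 0.05/0.74/0.21 partition among leaves, stems and flower buds.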

  14. Collider physics within the standard model a primer

    CERN Document Server

    Altarelli, Guido

    2017-01-01

    With this graduate-level primer, the principles of the standard model of particle physics receive a particular skillful, personal and enduring exposition by one of the great contributors to the field. In 2013 the late Prof. Altarelli wrote: The discovery of the Higgs boson and the non-observation of new particles or exotic phenomena have made a big step towards completing the experimental confirmation of the standard model of fundamental particle interactions. It is thus a good moment for me to collect, update and improve my graduate lecture notes on quantum chromodynamics and the theory of electroweak interactions, with main focus on collider physics. I hope that these lectures can provide an introduction to the subject for the interested reader, assumed to be already familiar with quantum field theory and some basic facts in elementary particle physics as taught in undergraduate courses. “These lecture notes are a beautiful example of Guido’s unique pedagogical abilities and scientific vision”. From...

  15. From the CERN web: Standard Model, SESAME and more

    CERN Multimedia

    2015-01-01

    This section highlights articles, blog posts and press releases published in the CERN web environment over the past weeks.   [Figure: ATLAS non-leptonic M_WZ data (left); ATLAS σ × B exclusion for W′ → WZ (right).] Is the Standard Model about to crater? (28 October – CERN Courier) The Standard Model is coming under more and more pressure from experiments. New results from the analysis of LHC's Run 1 data show effects that, if confirmed, would be the signature of new interactions at the TeV scale.   [Figure: Students and teachers participate in lectures about CERN science at the first ever SESAME teacher and student school.] New CERN programme to develop network between SESAME schools (22 October – by Harriet Jarlett) In September CERN welcomed 28 visitors from the Middle East for the first ever student and teacher school …

  16. Standard Model in multiscale theories and observational constraints

    Science.gov (United States)

    Calcagni, Gianluca; Nardelli, Giuseppe; Rodríguez-Fernández, David

    2016-08-01

    We construct and analyze the Standard Model of electroweak and strong interactions in multiscale spacetimes with (i) weighted derivatives and (ii) q-derivatives. Both theories can be formulated in two different frames, called fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multiscale measures with only one characteristic time, length and energy scale t*, ℓ* and E*, we compute the Lamb shift in the hydrogen atom and constrain the multiscale correction to the ordinary result, getting the absolute upper bound t*28 TeV. Stronger bounds are obtained from the measurement of the fine-structure constant. (ii) In the theory with q-derivatives, considering the muon decay rate and the Lamb shift in light atoms, we obtain the independent absolute upper bounds t*35 MeV. For α0 = 1/2, the Lamb shift alone yields t*450 GeV.

  17. Assessing the reliability of the borderline regression method as a standard setting procedure for objective structured clinical examination

    Directory of Open Access Journals (Sweden)

    Sara Mortaz Hejri

    2013-01-01

    Full Text Available Background: One of the methods used for standard setting is the borderline regression method (BRM). This study aims to assess the reliability of BRM when the pass-fail standard in an objective structured clinical examination (OSCE) is calculated by averaging the BRM standards obtained for each station separately. Materials and Methods: In nine OSCE stations with direct observation, the examiners gave each student a checklist score and a global score. Using a linear regression model for each station, we calculated the checklist cut-off score from the regression equation at a global scale cut-off of 2. The OSCE pass-fail standard was defined as the average of all stations' standards. To determine the reliability, the root mean square error (RMSE) was calculated. The R2 coefficient and the inter-grade discrimination were calculated to assess the quality of the OSCE. Results: The mean total test score was 60.78. The OSCE pass-fail standard and its RMSE were 47.37 and 0.55, respectively. The R2 coefficients ranged from 0.44 to 0.79. The inter-grade discrimination score varied greatly among stations. Conclusion: The RMSE of the standard was very small, indicating that BRM is a reliable method of setting a standard for an OSCE, with the advantage of providing data for quality assurance.
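    The per-station regression step described above is simple enough to sketch in NumPy. The station data below are invented for illustration, not the study's, and the standard error of the averaged cut-off is computed in one common (assumed) way; the paper's RMSE may be defined differently.

    ```python
    import numpy as np

    def brm_cutoff(checklist, global_scores, pass_point=2.0):
        """Borderline regression: fit checklist scores against global ratings
        for one station and read off the checklist score at the pass point."""
        slope, intercept = np.polyfit(global_scores, checklist, 1)
        return intercept + slope * pass_point

    # Invented data: 3 stations, each with 40 students' checklist and global scores
    rng = np.random.default_rng(0)
    stations = []
    for _ in range(3):
        g = rng.integers(1, 6, size=40).astype(float)    # global ratings on a 1-5 scale
        c = 30.0 + 8.0 * g + rng.normal(0.0, 4.0, 40)    # checklist scores with noise
        stations.append((c, g))

    cutoffs = np.array([brm_cutoff(c, g) for c, g in stations])
    exam_standard = cutoffs.mean()                        # OSCE pass-fail standard
    standard_error = cutoffs.std(ddof=1) / np.sqrt(len(cutoffs))
    print(exam_standard, standard_error)
    ```

    The exam standard is the mean of the per-station cut-offs; the spread of those cut-offs then gives a natural precision estimate for the averaged standard.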

  18. Background and derivation of ANS-5.4 standard fission product release model. Technical report

    International Nuclear Information System (INIS)

    1982-01-01

    ANS Working Group 5.4 was established in 1974 to examine fission product releases from UO2 fuel. The scope of ANS-5.4 was narrowly defined to include the following: (1) Review available experimental data on release of volatile fission products from UO2 and mixed-oxide fuel; (2) Survey existing analytical models currently being applied to lightwater reactors; and (3) Develop a standard analytical model for volatile fission product release to the fuel rod void space. Place emphasis on obtaining a model for radioactive fission product releases to be used in assessing radiological consequences of postulated accidents

  19. JPL Thermal Design Modeling Philosophy and NASA-STD-7009 Standard for Models and Simulations - A Case Study

    Science.gov (United States)

    Avila, Arturo

    2011-01-01

    The Standard JPL thermal engineering practice prescribes worst-case methodologies for design. In this process, environmental and key uncertain thermal parameters (e.g., thermal blanket performance, interface conductance, optical properties) are stacked in a worst case fashion to yield the most hot- or cold-biased temperature. Thus, these simulations would represent the upper and lower bounds. This, effectively, represents JPL thermal design margin philosophy. Uncertainty in the margins and the absolute temperatures is usually estimated by sensitivity analyses and/or by comparing the worst-case results with "expected" results. Applicability of the analytical model for specific design purposes along with any temperature requirement violations are documented in peer and project design review material. In 2008, NASA released NASA-STD-7009, Standard for Models and Simulations. The scope of this standard covers the development and maintenance of models, the operation of simulations, the analysis of the results, training, recommended practices, the assessment of the Modeling and Simulation (M&S) credibility, and the reporting of the M&S results. The Mars Exploration Rover (MER) project thermal control system M&S activity was chosen as a case study determining whether JPL practice is in line with the standard and to identify areas of non-compliance. This paper summarizes the results and makes recommendations regarding the application of this standard to JPL thermal M&S practices.

  20. Subjective Video Quality Assessment of H.265 Compression Standard for Full HD Resolution

    Directory of Open Access Journals (Sweden)

    Miroslav Uhrina

    2015-01-01

    Full Text Available Recently, increasing interest in multimedia services has led to requirements for quality assessment, especially in the video domain. There are many factors that influence video quality; compression technology and transmission link imperfections can be considered the main ones. This paper deals with the assessment of the impact of the H.265/HEVC compression standard on video quality using subjective metrics. The evaluation is done for two types of sequences with Full HD resolution, depending on content. The paper is divided as follows. In the first part of the article, a short characterization of the H.265/HEVC compression standard is given. In the second part, the subjective video quality methods used in our experiments are described. The last part of this article deals with the measurements and experimental results. They showed that the quality of sequences coded at bitrates between 5 and 7 Mbps is sufficient for observers, so there is no need for providers to stream at bitrates above this threshold. These results are part of a new model that is still being created and will be used for predicting video quality in IP-based networks.

  1. Experimental validation of Swy-2 clay standard's PHREEQC model

    Science.gov (United States)

    Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György

    2017-04-01

    One of the challenges of the present century is to limit greenhouse gas emissions for the mitigation of climate change, which is possible, for example, by a transitional technology, CCS (Carbon Capture and Storage), and, among others, by an increase of the nuclear proportion in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2, and they are also the main constituents of bentonite in high level radioactive waste disposal facilities. The understanding of clay behaviour in these deep geological environments is possible through laboratory batch experiments of well-known standards and coupled geochemical models. Such experimentally validated models are scarce, even though they allow deriving more precise long-term predictions of mineral reactions and rock and bentonite degradation underground and, therefore, ensure the safety of the above technologies and increase their public acceptance. This ongoing work aims to create a kinetic geochemical model of Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock:water ratio at atmospheric conditions, and with an inert and a CO2 supercritical phase at 100 bar and 80 °C, relevant for the potential Hungarian CO2 reservoir complex. Solution samples have been taken during and after experiments and their compositions were measured by ICP-OES. The treated solid phase has been analysed by XRD and ATR-FTIR and compared to in-parallel measured references (dried Swy-2). Kinetic geochemical modelling of the experimental conditions has been performed by PHREEQC version 3 using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of experimental and numerous modelling results has been automated in R. Experiments and models show very fast
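    The Palandri and Kharaka rate formulation referred to above combines parallel pH-dependent mechanisms with Arrhenius temperature scaling and an affinity term. A minimal sketch follows; all rate parameters below are placeholders for illustration, not the report's fitted smectite values.

    ```python
    import math

    R_GAS = 8.314  # gas constant, J mol^-1 K^-1

    def arrhenius(k25, Ea, T):
        """Scale a rate constant given at 25 degC (298.15 K) to temperature T (K)."""
        return k25 * math.exp(-(Ea / R_GAS) * (1.0 / T - 1.0 / 298.15))

    def dissolution_rate(a_H, omega, T,
                         k25_acid=1e-11, Ea_acid=50e3, n_acid=0.5,
                         k25_neut=1e-13, Ea_neut=60e3):
        """Palandri & Kharaka-style rate law: parallel acid and neutral
        mechanisms times a (1 - omega) affinity term, in mol m^-2 s^-1.
        a_H is the H+ activity, omega the mineral saturation ratio.
        All parameter values here are illustrative placeholders."""
        k_acid = arrhenius(k25_acid, Ea_acid, T) * a_H ** n_acid
        k_neut = arrhenius(k25_neut, Ea_neut, T)
        return (k_acid + k_neut) * (1.0 - omega)

    # At 80 degC (353.15 K), pH 4, far from equilibrium (omega ~ 0):
    r = dissolution_rate(a_H=1e-4, omega=0.0, T=353.15)
    print(r)
    ```

    At equilibrium (omega = 1) the affinity term drives the net rate to zero, which is the behaviour PHREEQC's kinetic blocks reproduce.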

  2. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories, initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose

  3. Sensitivity Assessment of Ozone Models

    Energy Technology Data Exchange (ETDEWEB)

    Shorter, Jeffrey A.; Rabitz, Herschel A.; Armstrong, Russell A.

    2000-01-24

    The activities under this contract effort were aimed at developing sensitivity analysis techniques and fully equivalent operational models (FEOMs) for applications in the DOE Atmospheric Chemistry Program (ACP). MRC developed a new model representation algorithm that uses a hierarchical, correlated function expansion containing a finite number of terms. A full expansion of this type is an exact representation of the original model, and each of the expansion functions is explicitly calculated using the original model. After calculating the expansion functions, they are assembled into a fully equivalent operational model (FEOM) that can directly replace the original model.
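    A first-order cut-HDMR expansion is one concrete instance of the hierarchical correlated function expansion described above. The sketch below (reference point, grids, and the toy model are invented for illustration) builds the component functions by evaluating the original model along coordinate cuts, then assembles them into a cheap surrogate.

    ```python
    import numpy as np

    def cut_hdmr_first_order(f, x_ref, grids):
        """Build a first-order cut-HDMR surrogate of f around reference point
        x_ref: f(x) ~ f0 + sum_i f_i(x_i), with f_i tabulated on grids[i] and
        evaluated by 1-D linear interpolation."""
        x_ref = np.asarray(x_ref, dtype=float)
        f0 = f(x_ref)
        tables = []
        for i, grid in enumerate(grids):
            vals = []
            for g in grid:
                x = x_ref.copy()
                x[i] = g
                vals.append(f(x) - f0)          # component function f_i(x_i)
            tables.append((np.asarray(grid, float), np.asarray(vals)))

        def surrogate(x):
            x = np.asarray(x, dtype=float)
            return f0 + sum(np.interp(x[i], g, v) for i, (g, v) in enumerate(tables))

        return surrogate

    # Toy model: an additive function is reproduced exactly at first order
    f = lambda x: 2.0 * x[0] + np.sin(x[1])
    s = cut_hdmr_first_order(f, x_ref=[0.0, 0.0],
                             grids=[np.linspace(-1, 1, 21), np.linspace(-1, 1, 21)])
    print(s([0.5, 0.3]), f([0.5, 0.3]))
    ```

    For models with interactions, higher-order terms of the expansion would be added in the same way; the first-order surrogate already replaces the original model wherever coupling between inputs is weak.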

  4. Environmental assessment in support of proposed voluntary energy conservation standard for new residential buildings

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, D.L.; Parker, G.B.; Callaway, J.W.; Marsh, S.J.; Roop, J.M.; Taylor, Z.T.

    1989-06-01

    The objective of this environmental assessment (EA) is to identify the potential environmental impacts that could result from the proposed voluntary residential standard (VOLRES) on private sector construction of new residential buildings. 49 refs., 15 tabs.

  5. New extended standard model, dark matters and relativity theory

    Science.gov (United States)

    Hwang, Jae-Kwang

    2016-03-01

    Three-dimensional quantized space model is newly introduced as the extended standard model. Four three-dimensional quantized spaces with total 12 dimensions are used to explain the universes including ours. Electric (EC), lepton (LC) and color (CC) charges are defined to be the charges of the x1x2x3, x4x5x6 and x7x8x9 warped spaces, respectively. Then, the lepton is the xi(EC) - xj(LC) correlated state which makes 3×3 = 9 leptons and the quark is the xi(EC) - xj(LC) - xk(CC) correlated state which makes 3×3×3 = 27 quarks. The new three bastons with the xi(EC) state are proposed as the dark matters seen in the x1x2x3 space, too. The matter universe question, three generations of the leptons and quarks, dark matter and dark energy, hadronization, the big bang, quantum entanglement, quantum mechanics and general relativity are briefly discussed in terms of this new model. The details can be found in the article titled as ``journey into the universe; three-dimensional quantized spaces, elementary particles and quantum mechanics at https://www.researchgate.net/profile/J_Hwang2''.

  6. Future Directions in Assessment: Influences of Standards and Implications for Language Learning

    Science.gov (United States)

    Cox, Troy L.; Malone, Margaret E.; Winke, Paula

    2018-01-01

    As "Foreign Language Annals" concludes its 50th anniversary, it is fitting to review the past and peer into the future of standards-based education and assessment. Standards are a common yardstick used by educators and researchers as a powerful framework for conceptualizing teaching and measuring learner success. The impact of standards…

  7. External Peer Review of Assessment: An Effective Approach to Verifying Standards?

    Science.gov (United States)

    Bloxham, Sue; Hudson, Jane; den Outer, Birgit; Price, Margaret

    2015-01-01

    There is growing international concern to regulate and assure standards in higher education. External peer review of assessment, often called external examining, is a well-established approach to assuring standards. Australian higher education is one of several systems without a history of external examining for undergraduate programmes that is…

  8. The Impact of Early Exposure of Eighth Grade Math Standards on End of Grade Assessments

    Science.gov (United States)

    Robertson, Tonjai E.

    2016-01-01

    The purpose of this study was to examine the Cumberland County Schools district-wide issue surrounding the disproportional performance of eighth grade Math I students' proficiency scores on standardized end-of-grade and end-of-course assessments. The study focused on the impact of the school district incorporating eighth grade math standards in…

  9. INEE Minimum Standards: A Tool for Education Quality Assessment in Afghan Refugee Schools in Pakistan

    Science.gov (United States)

    Qahir, Katayon

    2007-01-01

    This article details a pilot Minimum Standards assessment in Afghan refugee schools supported by the International Rescue Committee's Female Education Program in the North West Frontier Province of Pakistan. A set of specifically selected, contextualized indicators, based on the global INEE Minimum Standards, served as a tool for teachers and…

  10. Rediscovering standard model physics with the ATLAS detector

    CERN Document Server

    Flowerdew, M J

    2009-01-01

    With its 14 TeV proton-proton center of mass energy, the LHC is a factory of standard model (SM) particles produced at previously inaccessible energy scales. The ATLAS experiment needs to perform a thorough analysis of these particles before exploring more exotic possibilities that the LHC may open doors to. W and Z bosons will initially be used as calibration samples to improve the understanding of the detector. Top quarks will also be copiously produced and will for the first time serve as calibration particles, whilst also yielding an important background to beyond-the-SM searches. Top quarks may also be produced with high transverse momenta, requiring novel methods to perform efficient top quark identification in the ATLAS detector. I will give an overview of the current status of heavy gauge boson and top quark physics at ATLAS, in terms of both detector performance and expected precision measurements.

  11. Dark Matter in the Standard Model?

    CERN Document Server

    Gross, Christian; Strumia, Alessandro; Urbano, Alfredo; Xue, Wei

    We critically reexamine two possible Dark Matter candidates within the Standard Model. First, we consider the $uuddss$ hexa-quark. Its QCD binding energy could be large enough to make it (quasi) stable. We show that the cosmological Dark Matter abundance is reproduced thermally if its mass is 1.2 GeV. However, we also find that such a mass is excluded by the stability of Oxygen nuclei. Second, we consider the possibility that the instability in the Higgs potential leads to the formation of primordial black holes while avoiding vacuum decay during inflation. We show that the non-minimal Higgs coupling to gravity must be as small as allowed by quantum corrections, $|\xi_H| < 0.01$. Even so, one must assume that the Universe survived, in $e^{120}$ independent regions, the fluctuations that lead to vacuum decay with probability 1/2 each.

  12. A Constrained Standard Model: Effects of Fayet-Iliopoulos Terms

    International Nuclear Information System (INIS)

    Barbieri, Riccardo; Hall, Lawrence J.; Nomura, Yasunori

    2001-01-01

    In [1] the one Higgs doublet standard model was obtained by an orbifold projection of a 5D supersymmetric theory in an essentially unique way, resulting in a prediction for the Higgs mass m_H = 127 ± 8 GeV and for the compactification scale 1/R = 370 ± 70 GeV. The dominant one-loop contribution to the Higgs potential was found to be finite, while the above uncertainties arose from quadratically divergent brane Z factors and from other higher-loop contributions. In [3], a quadratically divergent Fayet-Iliopoulos term was found at one loop in this theory. We show that the resulting uncertainties in the predictions for the Higgs boson mass and the compactification scale are small, about 25% of the uncertainties quoted above, and hence do not affect the original predictions. However, a tree-level brane Fayet-Iliopoulos term could, if large enough, modify these predictions, especially for 1/R.

  13. Status of standard model predictions and uncertainties for electroweak observables

    International Nuclear Information System (INIS)

    Kniehl, B.A.

    1993-11-01

    Recent progress in theoretical predictions of electroweak parameters beyond one loop in the standard model is reviewed. The topics include universal corrections of O(G_F²M_H²M_W²), O(G_F²m_t⁴), O(α_s G_F M_W²), and those due to virtual tt̄ threshold effects, as well as specific corrections to Γ(Z → bb̄) of O(G_F²m_t⁴), O(α_s G_F m_t²), and O(α_s²m_b²/M_Z²). An update of the hadronic contributions to Δα is presented. Theoretical uncertainties, other than those due to the lack of knowledge of M_H and m_t, are estimated. (orig.)

  14. On the metastability of the Standard Model vacuum

    International Nuclear Information System (INIS)

    Isidori, Gino; Ridolfi, Giovanni; Strumia, Alessandro

    2001-01-01

    If the Higgs mass m_H is as low as suggested by present experimental information, the Standard Model ground state might not be absolutely stable. We present a detailed analysis of the lower bounds on m_H imposed by the requirement that the electroweak vacuum be sufficiently long-lived. We perform a complete one-loop calculation of the tunnelling probability at zero temperature, and we improve it by means of two-loop renormalization-group equations. We find that, for m_H = 115 GeV, the Higgs potential develops an instability below the Planck scale for m_t > (166 ± 2) GeV, but the electroweak vacuum is sufficiently long-lived for m_t < (175 ± 2) GeV
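    The zero-temperature tunnelling probability mentioned in the abstract comes from a bounce calculation. Schematically, assuming the potential is dominated by its quartic term with negative running coupling λ(μ) (the approximation used in analyses of this kind, not the paper's complete two-loop-improved expression), the decay probability within our past light cone is

    ```latex
    p \;\simeq\; \max_{R}\, \frac{V_U}{R^{4}}\, e^{-S(R)},
    \qquad
    S(R) \;=\; \frac{8\pi^{2}}{3\,\lvert\lambda(\mu = 1/R)\rvert},
    ```

    where V_U is the 4-volume of the past light cone, R the size of the bounce, and S(R) the action of the corresponding bounce solution; requiring p < 1 translates the measured top mass into the quoted lower bound on m_H.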

  15. On the metastability of the Standard Model vacuum

    CERN Document Server

    Isidori, Gino; Strumia, A; Isidori, Gino; Ridolfi, Giovanni; Strumia, Alessandro

    2001-01-01

    If the Higgs mass $m_H$ is as low as suggested by present experimental information, the Standard Model ground state might not be absolutely stable. We present a detailed analysis of the lower bounds on $m_H$ imposed by the requirement that the electroweak vacuum be sufficiently long-lived. We perform a complete one-loop calculation of the tunnelling probability at zero temperature, and we improve it by means of two-loop renormalization-group equations. We find that, for $m_H=115$ GeV, the Higgs potential develops an instability below the Planck scale for $m_t>(166\pm 2)$ GeV, but the electroweak vacuum is sufficiently long-lived for $m_t < (175\pm 2)$ GeV.

  16. Through precision straits to next standard model heights

    CERN Document Server

    David, André

    2016-01-01

    After the LHC Run 1, the standard model (SM) of particle physics has been completed. Yet, despite its successes, the SM has shortcomings vis-à-vis cosmological and other observations. At the same time, while the LHC restarts for Run 2 at 13 TeV, there is presently a lack of direct evidence for new physics phenomena at the accelerator energy frontier. From this state of affairs arises the need for a consistent theoretical framework in which deviations from the SM predictions can be calculated and compared to precision measurements. Such a framework should be able to comprehensively make use of all measurements in all sectors of particle physics, including LHC Higgs measurements, past electroweak precision data, electric dipole moment, $g-2$, penguins and flavor physics, neutrino scattering, deep inelastic scattering, low-energy $e^{+}e^{-}$ scattering, mass measurements, and any search for physics beyond the SM. By simultaneously describing all existing measurements, this framework then becomes an intermed...

  17. Error modelling of quantum Hall array resistance standards

    Science.gov (United States)

    Marzano, Martina; Oe, Takehiko; Ortolano, Massimo; Callegaro, Luca; Kaneko, Nobu-Hisa

    2018-04-01

    Quantum Hall array resistance standards (QHARSs) are integrated circuits composed of interconnected quantum Hall effect elements that allow the realization of virtually arbitrary resistance values. In recent years, techniques were presented to efficiently design QHARS networks. An open problem is that of the evaluation of the accuracy of a QHARS, which is affected by contact and wire resistances. In this work, we present a general and systematic procedure for the error modelling of QHARSs, which is based on modern circuit analysis techniques and Monte Carlo evaluation of the uncertainty. As a practical example, this method of analysis is applied to the characterization of a 1 MΩ QHARS developed by the National Metrology Institute of Japan. Software tools are provided to apply the procedure to other arrays.
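    The Monte Carlo evaluation of uncertainty described above can be illustrated on a deliberately naive series array; real QHARSs use multiple-series connections that suppress contact and wire resistance effects, so the network, element value, and contact-resistance range below are assumptions for illustration only.

    ```python
    import numpy as np

    R_H = 25812.8074593045 / 2   # i = 2 quantum Hall plateau resistance, ohms

    def series_array_error(n_elements, n_trials=20000, contact_max=1.0, rng=None):
        """Monte Carlo estimate of the relative deviation of a plain series
        QHARS from its nominal value when each of the 2*n_elements contacts
        adds an unknown resistance drawn uniformly from [0, contact_max] ohms."""
        if rng is None:
            rng = np.random.default_rng(0)
        nominal = n_elements * R_H
        contacts = rng.uniform(0.0, contact_max, size=(n_trials, 2 * n_elements))
        actual = nominal + contacts.sum(axis=1)
        rel_err = (actual - nominal) / nominal
        return rel_err.mean(), rel_err.std(ddof=1)

    mean, std = series_array_error(n_elements=10)
    print(f"mean relative error {mean:.2e}, std {std:.2e}")
    ```

    The same sampling loop extends to realistic QHARS topologies: solve the full resistive network per trial (the circuit-analysis step the paper formalizes) and collect the distribution of the array's four-terminal resistance.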

  18. The Standard-Model Extension and Gravitational Tests

    Directory of Open Access Journals (Sweden)

    Jay D. Tasson

    2016-10-01

    Full Text Available The Standard-Model Extension (SME) provides a comprehensive effective field-theory framework for the study of CPT and Lorentz symmetry. This work reviews the structure and philosophy of the SME and provides some intuitive examples of symmetry violation. The results of recent gravitational tests performed within the SME are summarized, including analysis of results from the Laser Interferometer Gravitational-Wave Observatory (LIGO), sensitivities achieved in short-range gravity experiments, constraints from cosmic-ray data, and results achieved by studying planetary ephemerides. Some proposals and ongoing efforts will also be considered, including gravimeter tests, tests of the Weak Equivalence Principle, and antimatter experiments. Our review of the above topics is augmented by several original extensions of the relevant work. We present new examples of symmetry violation in the SME and use the cosmic-ray analysis to place first-ever constraints on 81 additional operators.

  19. The hierarchy problem of the electroweak standard model revisited

    Energy Technology Data Exchange (ETDEWEB)

    Jegerlehner, Fred [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2013-05-15

    A careful renormalization group analysis of the electroweak Standard Model reveals that there is no hierarchy problem in the SM. In the broken phase a light Higgs turns out to be natural as it is self-protected and self-tuned by the Higgs mechanism. It means that the scalar Higgs need not be protected by any extra symmetry, specifically supersymmetry, in order not to be much heavier than the other SM particles, which are protected by gauge or chiral symmetry. Thus the existence of quadratic cutoff effects in the SM cannot motivate the need for a supersymmetric extension of the SM, but in contrast plays an important role in triggering the electroweak phase transition and in shaping the Higgs potential in the early universe to drive inflation, as supported by observation.

  20. Experimental limits from ATLAS on Standard Model Higgs production.

    CERN Multimedia

    ATLAS, collaboration

    2012-01-01

    Experimental limits from ATLAS on Standard Model Higgs production in the mass range 110-600 GeV. The solid curve reflects the observed experimental limits for the production of a Higgs of each possible mass value (horizontal axis). The region for which the solid curve dips below the horizontal line at the value of 1 is excluded with a 95% confidence level (CL). The dashed curve shows the expected limit in the absence of the Higgs boson, based on simulations. The green and yellow bands correspond (respectively) to 68%, and 95% confidence level regions from the expected limits. Higgs masses in the narrow range 123-130 GeV are the only masses not excluded at 95% CL

  1. CP violation outside the standard model phenomenology for pedestrians

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1993-01-01

    So far the only experimental evidence for CP violation is the 1964 discovery of K_L → 2π, where the two mass eigenstates produced by neutral meson mixing both decay into the same CP eigenstate. This result is described by two parameters, ε and ε'. Today ε is ∼ its 1964 value, ε' data are still inconclusive, and there is no new evidence for CP violation. One might expect to observe similar phenomena in other systems and also direct CP violation as charge asymmetries between decays of charge conjugate hadrons, H± → f±. Why is it so hard to find CP violation? How can B Physics help? Does CP lead beyond the standard model? The author presents a pedestrian symmetry approach which exhibits the difficulties and future possibilities of these two types of CP-violation experiments, neutral meson mixing and direct charge asymmetry: what may work, what doesn't work, and why

  2. Beyond the Standard Model new physics at the electroweak scale

    CERN Document Server

    Masiero, Antonio

    1997-01-01

    A critical reappraisal of the Standard Model (SM) will force us to new physics beyond it. I will argue that we have good reasons to believe that the latter is likely to lie close to the electroweak scale. After discussing the possibility that such new physics may be linked to a dynamical breaking of SU(2)×U(1) (technicolour), I will come to the core of the course: low energy supersymmetry. I will focus on the main phenomenological features, while emphasizing the relevant differences for various options of supersymmetrization of the SM. In particular the economical (but very particular) minimal SUSY SM (MSSM) will be discussed in detail. Some touchy issues for SUSY like the flavour problem or matter stability will be addressed. I will conclude with the prospects for SUSY searches in high-energy accelerators, B-factories and non-accelerator physics.

  3. Ruling out a strongly interacting standard Higgs model

    International Nuclear Information System (INIS)

    Riesselmann, K.; Willenbrock, S.

    1997-01-01

    Previous work has suggested that perturbation theory is unreliable for Higgs- and Goldstone-boson scattering, at energies above the Higgs-boson mass, for relatively small values of the Higgs quartic coupling λ(μ). By performing a summation of nonlogarithmic terms, we show that perturbation theory is in fact reliable up to relatively large coupling. This eliminates the possibility of a strongly interacting standard Higgs model at energies above the Higgs-boson mass, complementing earlier studies which excluded strong interactions at energies near the Higgs-boson mass. The summation can be formulated in terms of an appropriate scale in the running coupling, μ = √s/e ∼ √s/2.7, so it can be incorporated easily in renormalization-group-improved tree-level amplitudes as well as higher-order calculations. copyright 1996 The American Physical Society

  4. Consistent constraints on the Standard Model Effective Field Theory

    Energy Technology Data Exchange (ETDEWEB)

    Berthier, Laure; Trott, Michael [Niels Bohr International Academy, University of Copenhagen,Blegdamsvej 17, DK-2100 Copenhagen (Denmark)

    2016-02-10

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut off scale is assumed to be large, Λ ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S,T analysis is modified by the theory errors we include as an illustrative example.

  5. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)

  6. Image contrast enhancement based on a local standard deviation model

    International Nuclear Information System (INIS)

    Chang, Dah-Chung; Wu, Wen-Rong

    1996-01-01

    The adaptive contrast enhancement (ACE) algorithm is a widely used image enhancement method, which needs a contrast gain to adjust the high frequency components of an image. In the literature, the gain is usually inversely proportional to the local standard deviation (LSD) or is a constant. These choices cause two problems in practical applications: noise over-enhancement and ringing artifacts. In this paper a new gain is developed based on Hunt's Gaussian image model to prevent these two defects. The new gain is a nonlinear function of the LSD and has the desired characteristic of emphasizing the LSD regions in which details are concentrated. We have applied the new ACE algorithm to chest X-ray images, and the simulations show the effectiveness of the proposed algorithm
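    The ACE structure described above (local mean plus gain-weighted high-frequency detail) can be sketched as follows. The clipped inverse-LSD gain used here is a generic stand-in, not the Gaussian-model gain derived in the paper; window size and gain limits are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def ace(image, window=15, target_sd=30.0, max_gain=3.0):
        """Adaptive contrast enhancement: amplify the deviation of each pixel
        from its local mean by a gain depending on the local standard
        deviation (LSD). Clipping the gain limits noise over-enhancement."""
        img = image.astype(float)
        mean = uniform_filter(img, window)
        sq_mean = uniform_filter(img * img, window)
        lsd = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))
        gain = np.clip(target_sd / (lsd + 1e-6), 1.0, max_gain)
        return np.clip(mean + gain * (img - mean), 0.0, 255.0)

    # Low-contrast synthetic image: enhancement widens the dynamic range
    x = 100.0 + 10.0 * np.random.default_rng(1).random((64, 64))
    y = ace(x)
    ```

    The paper's contribution is precisely the choice of the gain-versus-LSD curve; swapping the `np.clip` line for a nonlinear gain derived from an image model reproduces the proposed algorithm's structure.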

  7. Standardization of milk mid-infrared spectrometers for the transfer and use of multiple models.

    Science.gov (United States)

    Grelet, C; Pierna, J A Fernández; Dardenne, P; Soyeurt, H; Vanlierde, A; Colinet, F; Bastin, C; Gengler, N; Baeten, V; Dehareng, F

    2017-10-01

    An increasing number of models are being developed to provide information from milk Fourier transform mid-infrared (FT-MIR) spectra on fine milk composition, technological properties of milk, or even cows' physiological status. In this context, and to take advantage of these existing models, the purpose of this work was to evaluate whether a spectral standardization method can enable the use of multiple equations within a network of different FT-MIR spectrometers. The piecewise direct standardization method was used, matching "slave" instruments to a common reference, the "master." The effect of standardization on network reproducibility was assessed on 66 instruments from 3 different brands by comparing the spectral variability of the slaves and the master with and without standardization. With standardization, the global Mahalanobis distance from the slave spectra to the master spectra was reduced on average from 2,655.9 to 14.3, representing a significant reduction of noninformative spectral variability. The transfer of models from instrument to instrument was tested using 3 FT-MIR models predicting (1) the quantity of daily methane emitted by dairy cows, (2) the concentration of polyunsaturated fatty acids in milk, and (3) the fresh cheese yield. The differences, in terms of root mean squared error, between master predictions and slave predictions were reduced after standardization on average from 103 to 17 g/d, from 0.0315 to 0.0045 g/100 mL of milk, and from 2.55 to 0.49 g of curd/100 g of milk, respectively. For all the models, standard deviations of predictions among all the instruments were also reduced by 5.11 times for methane, 5.01 times for polyunsaturated fatty acids, and 7.05 times for fresh cheese yield, showing an improvement of prediction reproducibility within the network. 
Given the results obtained, spectral standardization allows the transfer and use of multiple models on all instruments, as well as the improvement of spectral and prediction
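Piecewise direct standardization, as used above, regresses each master channel on a small window of slave channels over a set of transfer samples measured on both instruments. A minimal sketch under that assumption (illustrative Python; window size and the added intercept are choices made here, not taken from the study):

```python
import numpy as np

def pds_fit(master, slave, half_window=2):
    """Fit piecewise direct standardization: for each master channel k,
    regress master[:, k] on slave channels in [k-w, k+w] plus an
    intercept, using transfer samples measured on both instruments."""
    n_samples, n_chan = master.shape
    coefs = []
    for k in range(n_chan):
        lo, hi = max(0, k - half_window), min(n_chan, k + half_window + 1)
        X = np.hstack([slave[:, lo:hi], np.ones((n_samples, 1))])
        b, *_ = np.linalg.lstsq(X, master[:, k], rcond=None)
        coefs.append((lo, hi, b))
    return coefs

def pds_apply(slave_spectrum, coefs):
    """Map a new slave spectrum onto the master instrument's scale."""
    x = np.asarray(slave_spectrum, dtype=float)
    out = np.empty(len(coefs))
    for k, (lo, hi, b) in enumerate(coefs):
        out[k] = x[lo:hi] @ b[:-1] + b[-1]
    return out
```

When the slave differs from the master by a channel-wise affine distortion, the fitted transform recovers the master spectrum from new slave measurements.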

  8. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  9. Predictive Model Assessment for Count Data

    National Research Council Canada - National Science Library

    Czado, Claudia; Gneiting, Tilmann; Held, Leonhard

    2007-01-01

    .... In case studies, we critique count regression models for patent data, and assess the predictive performance of Bayesian age-period-cohort models for larynx cancer counts in Germany. Key words: Calibration...

  10. A Quantitative Software Risk Assessment Model

    Science.gov (United States)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  11. The hadronic standard model for strong and electroweak interactions

    Energy Technology Data Exchange (ETDEWEB)

    Raczka, R. [Soltan Inst. for Nuclear Studies, Otwock-Swierk (Poland)

    1993-12-31

    We propose a new model for strong and electro-weak interactions. First, we review various QCD predictions for hadron-hadron and lepton-hadron processes. We indicate that the present formulation of strong interactions in the framework of Quantum Chromodynamics encounters serious conceptual and numerical difficulties in a reliable description of hadron-hadron and lepton-hadron interactions. Next we propose to replace the strong sector of the Standard Model, based on unobserved quarks and gluons, by a strong sector based on the set of observed baryons and mesons determined by the spontaneously broken SU(6) gauge field theory model. We analyse various properties of this model such as asymptotic freedom, Reggeization of gauge bosons and fundamental fermions, baryon-baryon and meson-baryon high energy scattering, generation of Λ-polarization in inclusive processes, and others. Finally we extend this model by an electro-weak sector. We demonstrate a remarkable lepton and hadron anomaly cancellation and we analyse a series of important lepton-hadron and hadron-hadron processes such as e⁺ + e⁻ → hadrons, e⁺ + e⁻ → W⁺ + W⁻, e⁺ + e⁻ → p + anti-p, e + p → e + p and p + anti-p → p + anti-p. We obtain a series of interesting new predictions in this model, especially for processes with polarized particles. We estimate the value of the strong coupling constant α(M_Z) and we predict the top baryon mass M_Λt ≈ 240 GeV. Since in our model the proton, neutron, Λ-particles, vector mesons like ρ, ω, φ, J/ψ etc. and leptons are elementary, most of the experimentally analysed lepton-hadron and hadron-hadron processes in the LEP1, LEP2, LEAR, HERA, HERMES, LHC and SSC experiments may be relatively easily analysed in our model. (author). 252 refs, 65 figs, 1 tab.

  12. Adherence of Pain Assessment to the German National Standard for Pain Management in 12 Nursing Homes

    Directory of Open Access Journals (Sweden)

    Jürgen Osterbrink

    2014-01-01

    BACKGROUND: Pain is very common among nursing home residents. The assessment of pain is a prerequisite for effective multiprofessional pain management. Within the framework of the German health services research project, ‘Action Alliance Pain-Free City Muenster’, the authors investigated pain assessment adherence according to the German national Expert Standard for Pain Management in Nursing, which is a general standard applicable to all chronic/acute pain-affected persons and highly recommended for practice.

  13. Selected topics in phenomenology of the standard model

    International Nuclear Information System (INIS)

    Roberts, R.G.

    1991-01-01

    These lectures cover some aspects of the phenomenology of topics in high energy physics which advertise the success of the standard model in dealing with a wide variety of experimental data. First we begin with a look at deep inelastic scattering. This tells us about the structure of the nucleon, which is understood in terms of the SU(3) gauge theory of QCD, which then allows the information on quark and gluon distributions to be carried over to other 'hard' processes such as hadronic production of jets. Recent data on electroweak processes can estimate the value of sin²θ_W to a precision where the inclusion of radiative corrections allows bounds to be placed on the mass of the top quark. Electroweak effects arise in e⁺e⁻ collisions, but we first present a review of the recent history of this topic within the context of QCD. We bring the subject up to date with a look at the physics at (or near) the Z pole, where the measurement of asymmetries can give more information. We look at the conventional description of quark mixing by the CKM matrix and see how the mixing parameters are systematically being extracted from a variety of reactions and decays. In turn, these values can be used to set bounds on the top quark mass. The matter of CP violation in weak interactions is addressed within the context of the standard model, recent data on ε'/ε being the source of current excitement. Finally, we look at the theoretical description of, and experimental efforts in, the search for the top quark. (author)

  14. Semileptonic B decays in the Standard Model and beyond

    Energy Technology Data Exchange (ETDEWEB)

    Wick, Michael

    2010-09-15

    In this thesis we study several aspects of decays based on the quark level transitions b → sνν̄ and b → sμ⁺μ⁻, as well as transition form factors for radiative and rare semileptonic B meson decays. The quark level transition b → sνν̄ offers a transparent study of Z penguin and other electroweak penguin effects in New Physics (NP) scenarios in the absence of dipole operator contributions and Higgs penguin contributions. We present an analysis of B → K*νν̄ with improved form factors and of the decays B → Kνν̄ and B → X_sνν̄ in the Standard Model (SM) and in a number of NP scenarios like the general Minimal Supersymmetric Standard Model (MSSM), general scenarios with modified Z/Z′ penguins, and in a singlet scalar extension of the SM. The results for the SM and NP scenarios can be transparently visualized in an (ε; η) plane. The rare decay B → K*(→ Kπ)μ⁺μ⁻ is regarded as one of the crucial channels for B physics as it gives rise to a multitude of observables. We investigate systematically the often correlated effects in these observables in the context of the SM and various NP models, in particular the Littlest Higgs model with T-parity and various MSSM scenarios, and identify those observables with small to moderate dependence on hadronic quantities and large impact of NP. Furthermore, we study transition form factors for radiative and rare semileptonic B-meson decays into light pseudoscalar or vector mesons, combining theoretical and phenomenological constraints from Lattice QCD, light-cone sum rules, and dispersive bounds. We pay particular attention to form factor parameterizations which are based on the so-called series expansion, and study the related systematic uncertainties on a quantitative level. In this analysis, as well as in the analysis of the b → s transitions, we use consistently a convenient form

  15. Standard setting in student assessment: is a defensible method yet to come?

    Science.gov (United States)

    Barman, A

    2008-11-01

    Setting, maintaining and periodically re-evaluating assessment standards are important issues in medical education. Cut-off scores are often "pulled from the air" or set to an arbitrary percentage. A large number of methods/procedures used to set standards or cut scores are described in the literature, and there is a high degree of uncertainty in performance standards set using these methods. Standards set using the existing methods reflect the subjective judgment of the standard setters. This review does not describe the existing standard setting methods/procedures but narrates the validity, reliability, feasibility and legal issues relating to standard setting, drawing on published articles by educational assessment researchers. A standard or cut-off score should determine whether the examinee has attained the requirements to be certified competent. There is no perfect method to determine a cut score on a test, and none is agreed upon as the best method. Standard setting is not an exact science. Legitimacy of the standard is supported when the performance standard is linked to the requirements of practice. Test-curriculum alignment and content validity are important for most educational test validity arguments. A representative percentage of must-know learning objectives in the curriculum may be the basis of test items and pass/fail marks. Practice analysis may help in identifying the must-know areas of the curriculum. A cut score set by this procedure may give credibility, validity, defensibility and comparability to the standard. Constructing the test items by subject experts, vetted by multi-disciplinary faculty members, may ensure the reliability of the test as well as the standard.

  16. The role of Health Impact Assessment in the setting of air quality standards: An Australian perspective

    Energy Technology Data Exchange (ETDEWEB)

    Spickett, Jeffery, E-mail: J.Spickett@curtin.edu.au [WHO Collaborating Centre for Environmental Health Impact Assessment (Australia); Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia (Australia); Katscherian, Dianne [WHO Collaborating Centre for Environmental Health Impact Assessment (Australia); Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia (Australia); Harris, Patrick [CHETRE — UNSW Research Centre for Primary Health Care and Equity, University of New South Wales (Australia)

    2013-11-15

    The approaches used for setting or reviewing air quality standards vary from country to country. The purpose of this research was to consider the potential to improve decision-making through integration of HIA into the processes to review and set air quality standards used in Australia. To assess the value of HIA in this policy process, its strengths and weaknesses were evaluated aligned with review of international processes for setting air quality standards. Air quality standard setting programmes elsewhere have either used HIA or have amalgamated and incorporated factors normally found within HIA frameworks. They clearly demonstrate the value of a formalised HIA process for setting air quality standards in Australia. The following elements should be taken into consideration when using HIA in standard setting. (a) The adequacy of a mainly technical approach in current standard setting procedures to consider social determinants of health. (b) The importance of risk assessment criteria and information within the HIA process. The assessment of risk should consider equity, the distribution of variations in air quality in different locations and the potential impacts on health. (c) The uncertainties in extrapolating evidence from one population to another or to subpopulations, especially the more vulnerable, due to differing environmental factors and population variables. (d) The significance of communication with all potential stakeholders on issues associated with the management of air quality. In Australia there is also an opportunity for HIA to be used in conjunction with the NEPM to develop local air quality standard measures. The outcomes of this research indicated that the use of HIA for air quality standard setting at the national and local levels would prove advantageous. -- Highlights: • Health Impact Assessment framework has been applied to a policy development process. • HIA process was evaluated for application in air quality standard setting.

  17. Testing keywords internationally to define and apply undergraduate assessment standards in art and design

    Directory of Open Access Journals (Sweden)

    Robert Harland

    2015-07-01

    What language should be featured in assessment standards for international students? Have universities adjusted their assessment methods sufficiently to match the increased demand for studying abroad? How might art and design benefit from a more stable definition of standards? These are some questions this paper seeks to address by reporting the results of recent pedagogic research at the School of the Arts, Loughborough University, in the United Kingdom. Language use is at the heart of this issue, yet it is generally overlooked as an essential tool that links assessment, feedback and action planning for international students. The paper reveals existing and new data that builds on research since 2009, aimed at improving students’ assessment literacy. Recommendations are offered to stimulate local and global discussion about keyword use for defining undergraduate assessment standards in art and design.

  18. 2002 Defense Modeling and Simulation Office (DMSO) Laboratory for Human Behavior Model Interchange Standards

    Science.gov (United States)

    2003-07-01

    among Human Behavior Modeling (HBM)-related models in the Department of Defense (DoD), Industry, Academia, and other Government simulations by...establishing a Laboratory for the Study of Human Behavior Representation Interchange Standard. With experience, expertise, and technologies of the

  19. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  20. Standardized methodological assessment of research presentations (SHARP): development of a new instrument.

    Science.gov (United States)

    Farrokhyar, Forough; Dath, Deepak; Amin, Nalin; Bhandari, Mohit; Kelly, Stephen; Kolkin, Ann; Gill Pottruff, Catherine; Reid, Susan

    2014-06-01

    There are currently no validated guidelines to assess the quality of the content and the delivery style of scientific podium surgical presentations. We have developed a simple, short, and reliable instrument to objectively assess the overall quality of scientific podium presentations. A simple and efficient rating instrument was developed to assess the scientific content and presentation style/skills of the surgical residents' presentations from 1996 to 2013. Absolute and consistency agreement for the different sections of the instrument was determined and assessed over time, by stage of the project and study design. Intraclass correlation coefficients with 95% confidence intervals were calculated and reported using a mixed-effects model. Inter-rater reliability for both absolute and consistency agreement was substantial for the total score and for each of the 3 sections of the instrument. The absolute agreement for the overall rating of the presentations was .87 (.63 to .98) and .78 (.50 to .95), and the consistency agreement was .90 (.70 to .99) and .87 (.67 to .97) for the 2012 and 2013 institutional research presentations, respectively. Rater agreement for evaluating project stage and different study designs varied from .70 to .81 and was consistent over the years. The consistency agreement in rating of the presentations was .77 for both faculty and resident raters. The standardized methodological assessment of research presentations (SHARP) instrument rates the scientific quality of the research and the style of the delivered presentation. It is highly reliable in scoring the quality of all study designs regardless of their stage. We recommend that researchers focus on presenting the key concepts and significant elements of their evidence using visually simple slides in a professionally engaging manner for effective delivery of their research and better communication with the audience. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. A spatial- and age-structured assessment model to estimate the ...

    African Journals Online (AJOL)

    , thereby indirectly negatively impacting juvenile abalone which rely on the urchins for shelter. A model is developed for abalone that is an extension of more standard age-structured assessment models because it explicitly takes spatial effects ...

  2. 13th Workshop on What Comes Beyond the Standard Models

    CERN Document Server

    Nielsen, Holger Bech; Lukman, Dragan; What Comes Beyond the Standard Models

    2010-01-01

    1. Noncommutativity and Topology within Lattice Field Theories 2. The Construction of Quantum Field Operators 3. The Bargmann-Wigner Formalism for Spin 2 Fields 4. New Light on Dark Matter from the LHC 5. Extra Dimensional Metric Reversal Symmetry and its Prospect... 6. Masses and Mixing Matrices of Families within SU(3) Flavor Symmetry ... 7. Dark Atoms of the Universe: OHe Nuclear Physics, 8. Can the Matter-Antimatter Asymmetry be Easier to Understand Within the "Spin-charge-family-theory", .. 9. Mass Matrices of Twice Four Families of Quarks and Leptons, ...in the "Spin-charge-family-theory" 10. Bohmian Quantum Mechanics or What Comes Before the Standard Model 11. Backward Causation in Complex Action Model ... 12. Is the Prediction of the "Spin-charge-family-theory" in Disagreement with the XENON100..? 13. Masses and Mixing Matrices of Families of Quarks and Leptons Within the "Spin-charge-family-theory" 14. Can the Stable Fifth Family of the "Spin-charge-family-theory" ...Form the Fifth Antibaryon Cluster...

  3. Standard model baryogenesis through four-fermion operators in braneworlds

    International Nuclear Information System (INIS)

    Chung, Daniel J.H.; Dent, Thomas

    2002-01-01

    We study a new baryogenesis scenario in a class of braneworld models with low fundamental scale, which typically have difficulty with baryogenesis. The scenario is characterized by its minimal nature: the field content is that of the standard model and all interactions consistent with the gauge symmetry are admitted. Baryon number is violated via a dimension-6 proton decay operator, suppressed today by the mechanism of quark-lepton separation in extra dimensions; we assume that this operator was unsuppressed in the early Universe due to a time-dependent quark-lepton separation. The source of CP violation is the CKM matrix, in combination with the dimension-6 operators. We find that almost independently of cosmology, sufficient baryogenesis is nearly impossible in such a scenario if the fundamental scale is above 100 TeV, as required by an unsuppressed neutron-antineutron oscillation operator. The only exception producing sufficient baryon asymmetry is a scenario involving out-of-equilibrium c quarks interacting with equilibrium b quarks

  4. Assessment of the Rescorla-Wagner model.

    Science.gov (United States)

    Miller, R R; Barnet, R C; Grahame, N J

    1995-05-01

    The Rescorla-Wagner model has been the most influential theory of associative learning to emerge from the study of animal behavior over the last 25 years. Recently, equivalence to this model has become a benchmark in assessing connectionist models, with such equivalence often achieved by incorporating the Widrow-Hoff delta rule. This article presents the Rescorla-Wagner model's basic assumptions, reviews some of the model's predictive successes and failures, relates the failures to the model's assumptions, and discusses the model's heuristic value. It is concluded that the model has had a positive influence on the study of simple associative learning by stimulating research and contributing to new model development. However, this benefit should neither lead to the model being regarded as inherently "correct" nor imply that its predictions can be profitably used to assess other models.
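The model's core assumption is a single error-driven update: on each trial, every present cue's associative strength changes by αβ(λ − V_total), where V_total sums over all cues present. A minimal Python sketch (parameter values here are illustrative assumptions, not from the article):

```python
def rescorla_wagner(trials, alpha=0.3, beta=1.0, lam=1.0):
    """Rescorla-Wagner learning: each trial is (cues, reinforced).
    Every present cue's associative strength V changes by
    alpha * beta * (lambda - V_total), where V_total is the summed
    strength of all cues present on that trial."""
    V = {}
    for cues, reinforced in trials:
        v_total = sum(V.get(c, 0.0) for c in cues)
        error = (lam if reinforced else 0.0) - v_total
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * beta * error
    return V
```

The shared error term is what produces the model's signature predictions, such as blocking: after cue A is trained to asymptote, a compound AB leaves B with almost no strength because the prediction error is already near zero.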

  5. Early Grade Writing Assessment: An Instrument Model

    Science.gov (United States)

    Jiménez, Juan E.

    2017-01-01

    The United Nations Educational, Scientific, and Cultural Organization promoted the creation of a model instrument for individual assessment of students' foundational writing skills in the Spanish language that was based on a literature review and existing writing tools and assessments. The purpose of the "Early Grade Writing Assessment"…

  6. HCPB TBM thermo mechanical design: Assessment with respect codes and standards and DEMO relevancy

    International Nuclear Information System (INIS)

    Cismondi, F.; Kecskes, S.; Aiello, G.

    2011-01-01

    In the frame of the activities of the European TBM Consortium of Associates, the Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) is developed at the Karlsruhe Institute of Technology (KIT). After performing detailed thermal and fluid dynamic analyses of the preliminary HCPB TBM design, the thermo-mechanical behaviour of the TBM under typical ITER loads has to be assessed. A synthesis of the different design options proposed has been realized by building two different assemblies of the HCPB-TBM: these two assemblies and the analyses performed on them are presented in this paper. Finite element thermo-mechanical analyses of two detailed 1/4-scaled models of the proposed HCPB-TBM assemblies have been performed, with the aim of verifying the accordance of the mechanical behaviour with the criteria of the design codes and standards. The structural design limits specified in the codes and standards are discussed in relation to the available EUROFER data and possible damage modes. Solutions to improve the weak structural points of the present design are identified, and the DEMO relevancy of the present thermal and structural design parameters is discussed.

  7. Assessing risk of non-compliance of phosphorus standards for lakes in England and Wales

    Science.gov (United States)

    Duethmann, D.; Anthony, S.; Carvalho, L.; Spears, B.

    2009-04-01

    High population densities, use of inorganic fertilizer and intensive livestock agriculture have increased phosphorus loads to lakes, and accelerated eutrophication is a major pressure for many lakes. The EC Water Framework Directive (WFD) requires that good chemical and ecological quality is restored in all surface water bodies by 2015. Total phosphorus (TP) standards for lakes in England and Wales have been agreed recently, and our aim was to estimate what percentage of lakes in England and Wales is at risk of failing these standards. With measured lake phosphorus concentrations only being available for a small number of lakes, such an assessment had to be model-based. The study also makes a source apportionment of phosphorus inputs into lakes. Phosphorus loads were estimated from a range of sources including agricultural loads, sewage effluents, septic tanks, diffuse urban sources, atmospheric deposition, groundwater and bank erosion. Lake phosphorus concentrations were predicted using the Vollenweider model, and the model framework was satisfactorily tested against available observed lake concentration data. Even though predictions for individual lakes remain uncertain, results for a population of lakes are considered sufficiently robust. A scenario analysis was carried out to investigate to what extent reductions in phosphorus loads would increase the number of lakes achieving good ecological status in terms of TP standards. Applying the model to all lakes in England and Wales greater than 1 ha, it was calculated that under current conditions roughly two thirds of the lakes would fail to achieve good ecological status with respect to phosphorus. According to our estimates, agricultural phosphorus loads represent the most frequent dominant source for the majority of catchments, but diffuse urban runoff is also important in many lakes. Sewage effluents are the most frequent dominant source for large lake catchments greater than 100 km². The evaluation in terms of
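The Vollenweider model used above is, in one common OECD form, TP = (L/q_s)/(1 + √τ), where L is the areal phosphorus load, τ the hydraulic residence time, and q_s = z/τ the areal hydraulic loading for mean depth z. The exact variant used in the study is not specified in this record, so the following Python sketch assumes that form:

```python
import math

def vollenweider_tp(areal_load_mg_m2_yr, mean_depth_m, residence_time_yr):
    """Predict in-lake total phosphorus (mg/m^3, i.e. ug/L) with a
    common Vollenweider/OECD relation: TP = (L / q_s) / (1 + sqrt(tau)),
    where q_s = z / tau is the areal hydraulic loading (m/yr).
    This specific form is an assumption, not the study's calibration."""
    qs = mean_depth_m / residence_time_yr
    return (areal_load_mg_m2_yr / qs) / (1.0 + math.sqrt(residence_time_yr))
```

For example, a lake with an areal load of 500 mg/m²/yr, mean depth 5 m and residence time 1 yr gives q_s = 5 m/yr and a predicted TP of 50 mg/m³.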

  8. Standardized Patients Provide a Reliable Assessment of Athletic Training Students' Clinical Skills

    Science.gov (United States)

    Armstrong, Kirk J.; Jarriel, Amanda J.

    2016-01-01

    Context: Providing students reliable objective feedback regarding their clinical performance is of great value for ongoing clinical skill assessment. Since a standardized patient (SP) is trained to consistently portray the case, students can be assessed and receive immediate feedback within the same clinical encounter; however, no research, to our…

  9. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S.

    2016-01-01

    Need to Assess the Skill of Ecosystem Models: Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed, but skill assessment of full marine ecosystem models has not yet been attempted. Northeast US Atlantis Marine Ecosystem Model: We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. Skill Assessment Is Both Possible and Advisable: We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development.
We show that it is possible to not only assess the skill of a complicated marine ecosystem model, but that

  10. Probing physics beyond the standard model in diatomic molecules

    International Nuclear Information System (INIS)

    Denis, M.

    2017-01-01

    Nowadays, the incompleteness of the Standard Model of particles (SM) is largely acknowledged. One of its most obvious shortcomings is the lack of explanation for the huge surplus of matter over antimatter in the universe, the so-called baryon asymmetry of the universe. New CP (charge conjugation and spatial parity) violations absent in the SM are assumed to be responsible for this asymmetry. Such a violation could be observed in ordinary matter through a set of interactions violating both parity and time-reversal symmetries (P,T-odd), among which the preponderant ones are the electron Electric Dipole Moment (eEDM), the electron-nucleon scalar-pseudoscalar (enSPS) and the nuclear magnetic quadrupole moment (nMQM) interactions. Hence, experimental evidence of a non-zero P,T-odd interaction constant would be a probe of this New Physics beyond the Standard Model. The calculation of the corresponding molecular parameters is performed by making use of an elaborate four-component relativistic configuration interaction approach in polar diatomic molecules containing an actinide, which are particularly adequate systems for eEDM experiments, such as ThO, which allowed for assigning the most constraining upper bound on the eEDM, and ThF⁺, which will be used in a forthcoming experiment. Those results will be of crucial importance in the interpretation of the measurements, since the fundamental constants can only be evaluated by combining experimental energy shift measurements with theoretical molecular parameters. This manuscript proceeds as follows: after an introduction to the general background of the search for CP-violations and its consequences for the understanding of the Universe (Chapter 1), a presentation of the underlying theory of the evidence of such violation in ordinary matter, namely the P,T-odd sources of the Electric Dipole Moment of a many-electron system, as well as the relevant molecular parameters, is given in Chapter 2.
A similar introduction to

  11. Standardizing acute toxicity data for use in ecotoxicology models: influence of test type, life stage, and concentration reporting.

    Science.gov (United States)

    Raimondo, Sandy; Vivian, Deborah N; Barron, Mace G

    2009-10-01

    Ecotoxicological models generally have large data requirements and are frequently based on existing information from diverse sources. Standardizing data for toxicological models may be necessary to reduce extraneous variation and to ensure models reflect intrinsic relationships. However, the extent to which data standardization is necessary remains unclear, particularly when data transformations are used in model development. An extensive acute toxicity database was compiled for aquatic species to comprehensively assess the variation associated with acute toxicity test type (e.g., flow-through, static), reporting concentrations as nominal or measured, and organism life stage. Three approaches were used to assess the influence of these factors on log-transformed acute toxicity: toxicity ratios, log-linear models of factor groups, and comparison of interspecies correlation estimation (ICE) models developed using either standardized test types or reported concentration type. Median ratios were generally less than 2.0, the slopes of log-linear models were approximately one for well-represented comparisons, and ICE models developed using data from standardized test types or reported concentrations did not differ substantially. These results indicate that standardizing test data by acute test type, reported concentration type, or life stage may not be critical for developing ecotoxicological models from large datasets of log-transformed values.
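
    The ratio and log-linear checks described in the abstract can be sketched as follows. All LC50 values, the species pairing, and the test-type labels are invented for illustration; they are not from the study's database.

```python
# Hypothetical illustration of the standardization check described above:
# compare paired acute toxicity values (e.g., 96-h LC50, mg/L) reported from
# flow-through vs. static tests for the same species, using within-pair
# ratios and a fit through log10-transformed values.
import math
from statistics import median

# paired (flow_through, static) LC50 values for the same species (mg/L)
pairs = [(0.8, 1.1), (12.0, 15.5), (3.2, 2.9), (150.0, 210.0), (0.05, 0.09)]

# ratio of the larger to the smaller value within each pair
ratios = [max(a, b) / min(a, b) for a, b in pairs]
print("median ratio:", round(median(ratios), 2))

# slope of a least-squares fit through the log10-transformed pairs;
# a slope near 1 suggests the two test types scale together
xs = [math.log10(a) for a, _ in pairs]
ys = [math.log10(b) for _, b in pairs]
n = len(pairs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print("log-log slope:", round(slope, 3))
```

    With these invented numbers the median ratio falls below 2.0 and the log-log slope lies near one, mirroring the pattern the study reports for well-represented comparisons.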

  12. Informal Assessment of Competences in the Context of Science Standards in Austria

    Science.gov (United States)

    Schiffl, Iris

    2016-01-01

    Science standards have been a topic in educational research in Austria for about ten years now. Starting in 2005, competency structure models have been developed for junior and senior classes of different school types. After evaluating these models, prototypic tasks were created to point out the meaning of the models to teachers. At the moment,…

  13. Assessing the Effects of Corporate Social Responsibility Standards in Global Value Chains

    DEFF Research Database (Denmark)

    Lund-Thomsen, Peter

    This paper considers the issue of corporate social responsibility (CSR) standard impact assessment in global value chains. CSR standards have proliferated in recent years, and several studies have attempted to assess their effects on local producers, workers, and the environment in developing countries. However, much less attention has been paid to the “dark side” of impact assessment – the ethical and political dilemmas that arise in the process of carrying out impact studies. This paper addresses this gap in the literature, arguing that impact assessments of CSR standards may do more harm than good to the intended beneficiaries – developing country firms, farmers, workers, and communities – unless these ethical and political dilemmas are given serious consideration.

  14. Les Houches Summer School on Theoretical Physics: Session 84: Particle Physics Beyond the Standard Model

    CERN Document Server

    Lavignac, Stephan; Dalibard, Jean

    2006-01-01

    The Standard Model of elementary particles and interactions is one of the most thoroughly tested theories in physics. This book presents a collection of lectures given in August 2005 at the Les Houches Summer School on Particle Physics beyond the Standard Model. It provides a pedagogical introduction to aspects of particle physics beyond the Standard Model.

  15. Caries risk assessment models in caries prediction

    Directory of Open Access Journals (Sweden)

    Amila Zukanović

    2013-11-01

    Objective. The aim of this research was to assess the efficiency of different multifactor models in caries prediction. Material and methods. Data from the questionnaire and objective examination of 109 examinees were entered into the Cariogram, PreViser and Caries-Risk Assessment Tool (CAT) multifactor risk assessment models. Caries risk was assessed with all three models for each patient, classifying them as low-, medium- or high-risk patients. The development of new caries lesions over a period of three years [Decayed Missing Filled Tooth (DMFT) increment = difference between the Decayed Missing Filled Tooth Surface (DMFTS) index at baseline and at follow-up] allowed examination of the predictive capacity of the different multifactor models. Results. The data gathered showed that the different multifactor risk assessment models give significantly different results (Friedman test: Chi square = 100.073, p=0.000). The Cariogram was the model that identified the majority of examinees as medium-risk patients (70%). The other two models were more radical in risk assessment, giving more unfavorable risk profiles for patients. In only 12% of the patients did the three multifactor models assess the risk in the same way. PreViser and CAT gave the same results in 63% of cases – the Wilcoxon test showed no statistically significant difference in caries risk assessment between these two models (Z = -1.805, p=0.071). Conclusions. Evaluation of the three different multifactor caries risk assessment models (Cariogram, PreViser and CAT) showed that only the Cariogram can successfully predict new caries development in 12-year-old Bosnian children.
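
    The agreement comparison reported in the Results can be sketched numerically. The patient ratings below are invented for illustration; the study's actual data are not reproduced here.

```python
# Hypothetical sketch of the agreement analysis described above: three risk
# models each classify the same patients as low (0), medium (1), or high (2)
# risk; we count how often their assessments coincide.
cariogram = [1, 1, 0, 1, 2, 1, 1, 0, 1, 1]
previser  = [1, 2, 0, 2, 2, 1, 2, 1, 1, 2]
cat_model = [1, 2, 1, 2, 2, 1, 2, 1, 0, 2]

n = len(cariogram)
all_three_agree = sum(a == b == c
                      for a, b, c in zip(cariogram, previser, cat_model))
previser_cat_agree = sum(b == c for b, c in zip(previser, cat_model))
print(f"all three agree: {all_three_agree / n:.0%}")
print(f"PreViser vs CAT agree: {previser_cat_agree / n:.0%}")
```

    The same pairwise counts feed the formal tests the study uses (Friedman across all three models, Wilcoxon for the PreViser-CAT pair), which operate on the per-patient ratings rather than on the agreement percentages themselves.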

  16. Comparison of Standard Wind Turbine Models with Vendor Models for Power System Stability Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Honrubia-Escribano, A.; Gomez Lazaro, E.; Jimenez-Buendia, F.; Muljadi, Eduard

    2016-11-01

    The International Electrotechnical Commission Standard 61400-27-1 was published in February 2015. This standard deals with the development of generic terms and parameters to specify the electrical characteristics of wind turbines. Generic models of very complex technological systems, such as wind turbines, are thus defined based on the four common configurations available in the market. Because the standard is so recent, comparing the response of its generic models with specific vendor models plays a key role in ensuring its widespread use. This paper compares the response of a specific Gamesa dynamic wind turbine model to the corresponding generic IEC Type III wind turbine model when the wind turbine is subjected to a three-phase voltage dip. The Type III model represents the doubly-fed induction generator wind turbine, which is not only one of the most commonly sold and installed technologies in the current market but also a complex variable-speed implementation. Active and reactive power transients are observed due to the voltage reduction. Special attention is given to the reactive power injection provided by the wind turbine models, because it is a requirement of current grid codes. The paper also identifies the boundaries of the generic models, i.e., transient events that cannot be represented exactly.

  17. Flavor democracy in standard models at high energies

    Energy Technology Data Exchange (ETDEWEB)

    Cvetic, G. (Dortmund Univ. (Germany). Inst. fuer Physik); Kim, C.S. (Yonsei Univ., Seoul (Korea, Republic of). Dept. of Physics)

    1993-10-18

    It is possible that the standard model (SM) is replaced around some transition energy Λ by a new, possibly Higgsless, 'flavor gauge theory' such that the Yukawa (running) parameters of the SM at E ≈ Λ exhibit an (approximate) flavor democracy (FD). We investigate the latter possibility by studying the renormalization group equations for the Yukawa couplings of the SM with one and two Higgs doublets, evolving them from given physical values at low energies (E ≈ 1 GeV) up to Λ (≈ Λ_pole) and comparing the resulting fermion masses and CKM matrix elements at E ≈ Λ for various m_t^phys and ratios v_u/v_d of vacuum expectation values. We find that the minimal SM and the closely related SM with two Higgs doublets (type I) show increasing deviation from FD as the energy increases, but that the SM with two Higgs doublets (type II) clearly tends toward FD with increasing energy - in both the quark and the leptonic sector (q-q and l-l FD). Furthermore, we find within the type-II model that, for Λ_pole << Λ_Planck, m_t^phys can be less than 200 GeV in most cases of chosen v_u/v_d. Under the assumption that the corresponding Yukawa couplings in the quark and the leptonic sector at E ≈ Λ are also equal (l-q FD), we derive estimates of bounds on the masses of the top quark and tau-neutrino, which are compatible with experimental bounds. (orig.)
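
    The kind of renormalization-group evolution the abstract describes can be illustrated with a deliberately simplified toy: one-loop running of the top Yukawa coupling, keeping only the dominant y_t and QCD terms and freezing the strong coupling. The starting values, the frozen g3, and the target scale are rough illustrative numbers, not the paper's inputs.

```python
# Toy illustration (not the paper's full analysis): one-loop running of the
# top Yukawa coupling y_t in the minimal SM, truncated to the dominant y_t
# self-coupling and QCD (g3) terms, with g3 held fixed for simplicity.
import math

def beta_yt(y, g3):
    # one-loop SM beta function for y_t, truncated to the dominant terms:
    # dy/dt = y * (9/2 y^2 - 8 g3^2) / (16 pi^2), with t = ln(mu)
    return y * (4.5 * y**2 - 8.0 * g3**2) / (16 * math.pi**2)

y, g3 = 1.0, 1.2                     # rough values near the top-mass scale
t, t_end, dt = 0.0, math.log(1e16 / 173.0), 0.01   # t = ln(mu / 173 GeV)
while t < t_end:
    y += beta_yt(y, g3) * dt         # simple Euler step
    t += dt
print(f"y_t at mu ~ 1e16 GeV: {y:.3f}")
```

    Even in this crude truncation the QCD term drives y_t downward over many decades of energy, which is the sort of scale-dependent drift of the Yukawa matrix entries that the FD analysis tracks in full.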

  18. LHCb is trying to crack the Standard Model

    CERN Multimedia

    2011-01-01

    LHCb will reveal new results tomorrow that will shed more light on the possible CP-violation measurement reported recently by the Tevatron experiments, which differs from Standard Model predictions. Quantum Diaries blogger for CERN, Pauline Gagnon, explains how.   LHCb, one of the Large Hadron Collider (LHC) experiments, was designed specifically to study charge-parity (CP) violation. In simple words, its goal is to explain why more matter than antimatter was produced when the Universe slowly cooled down after the Big Bang, leading to a world predominantly composed of matter. This is quite puzzling, since in laboratory experiments we do not measure a preference for the creation of matter over antimatter. Hence the CP-conservation law in physics, which states that Nature should not have a preference for matter over antimatter. So why did the Universe evolve this way? One of the best ways to study this phenomenon is with b quarks. Since they are heavy, they can decay (i.e., break down into smaller parts) ...

  19. Electroweak Precision Observables in the Minimal Supersymmetric Standard Model

    CERN Document Server

    Heinemeyer, S; Weiglein, Georg

    2006-01-01

    The current status of electroweak precision observables in the Minimal Supersymmetric Standard Model (MSSM) is reviewed. We focus in particular on the $W$ boson mass, M_W, the effective leptonic weak mixing angle, sin^2 theta_eff, the anomalous magnetic moment of the muon, (g-2)_\\mu, and the lightest CP-even MSSM Higgs boson mass, m_h. We summarize the current experimental situation and the status of the theoretical evaluations. An estimate of the current theoretical uncertainties from unknown higher-order corrections and from the experimental errors of the input parameters is given. We discuss future prospects for both the experimental accuracies and the precision of the theoretical predictions. Confronting the precision data with the theory predictions within the unconstrained MSSM and within specific SUSY-breaking scenarios, we analyse how well the data are described by the theory. The mSUGRA scenario with cosmological constraints yields a very good fit to the data, showing a clear preference for a relativ...

  20. On the fate of the Standard Model at finite temperature

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Luigi Delle; Marzo, Carlo [Università del Salento, Dipartimento di Matematica e Fisica “Ennio De Giorgi”, Via Arnesano, 73100 Lecce (Italy); INFN - Sezione di Lecce, via Arnesano, 73100 Lecce (Italy)]; Urbano, Alfredo [SISSA - International School for Advanced Studies, via Bonomea 256, 34136 Trieste (Italy)]

    2016-05-10

    In this paper we revisit and update the computation of thermal corrections to the stability of the electroweak vacuum in the Standard Model. At zero temperature, we make use of the full two-loop effective potential, improved by three-loop beta functions with two-loop matching conditions. At finite temperature, we include one-loop thermal corrections together with resummation of daisy diagrams. We solve numerically - both at zero and finite temperature - the bounce equation, thus providing an accurate description of the thermal tunneling. Assuming a maximum temperature in the early Universe of the order of 10^18 GeV, we find that the instability bound excludes values of the top mass M_t ≳ 173.6 GeV, with M_h ≃ 125 GeV and including uncertainties on the strong coupling. We discuss the validity and temperature-dependence of this bound in the early Universe, with a special focus on the reheating phase after inflation.