WorldWideScience

Sample records for models standard assessment

  1. STAMINA - Model description. Standard Model Instrumentation for Noise Assessments

    NARCIS (Netherlands)

    Schreurs EM; Jabben J; Verheijen ENG; CMM; mev

    2010-01-01

    This report describes the STAMINA model, which stands for Standard Model Instrumentation for Noise Assessments and was developed by RIVM. The institute uses this standard model to map environmental noise in the Netherlands. The model is based on the Standaard Karteringsmethode (standard noise-mapping method)

  2. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…
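    The analytical hierarchy process named in this abstract derives criterion weights from a pairwise comparison matrix. A minimal sketch using the geometric-mean approximation (the criteria and judgments below are hypothetical, not taken from the paper):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the geometric-mean method.

    pairwise[i][j] holds how strongly criterion i is preferred over j
    (Saaty's 1-9 scale); pairwise[j][i] should be its reciprocal.
    """
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical 3-criterion comparison (e.g. teaching, facilities, support):
matrix = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])  # → [0.637, 0.258, 0.105]
```

    The geometric-mean method is a common stand-in for the principal-eigenvector computation when the comparison matrix is nearly consistent.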

  3. Regional drought assessment using a distributed hydrological model coupled with Standardized Runoff Index

    Directory of Open Access Journals (Sweden)

    H. Shen

    2015-05-01

    Drought assessment is essential for coping with the frequent droughts that occur nowadays. Owing to the large spatio-temporal variability of hydrometeorological conditions in most regions of China, a physically based hydrological model is needed to produce rational spatial and temporal distributions of hydro-meteorological variables for drought assessment. In this study, the large-scale distributed hydrological model Variable Infiltration Capacity (VIC) was coupled with a modified standardized runoff index (SRI) for drought assessment in the Weihe River basin, northwest China. The results indicate that the coupled model reasonably reproduces the spatial distribution of drought occurrence. It reflects the spatial heterogeneity of regional drought and improves the physical basis of SRI. The model also has potential for drought forecasting, early warning and mitigation, provided that accurate meteorological forcing data are available.
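    A standardized runoff index transforms runoff totals into standard-normal scores so that drought severity can be compared across sites and periods. The sketch below uses a distribution-free rank transform rather than the modified, distribution-fitted SRI of the paper, and the runoff values are invented:

```python
from statistics import NormalDist

def standardized_runoff_index(runoff):
    """Empirical SRI: rank-transform runoff to standard-normal scores.

    A full implementation fits a probability distribution (e.g. gamma)
    per calendar month; the Weibull plotting position used here is a
    distribution-free simplification.
    """
    n = len(runoff)
    order = sorted(range(n), key=lambda i: runoff[i])
    sri = [0.0] * n
    for rank, i in enumerate(order, start=1):
        p = rank / (n + 1)          # Weibull plotting position
        sri[i] = NormalDist().inv_cdf(p)
    return sri

monthly_runoff = [12.0, 3.5, 8.1, 0.9, 15.2, 6.7, 4.4, 9.9]  # invented
index = standardized_runoff_index(monthly_runoff)
print(min(index) == index[3])  # the driest month maps to the most negative score
```

    Negative scores mark drier-than-usual months; thresholds such as SRI < -1 are then used to flag drought onset.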

  4. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    Science.gov (United States)

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  5. A Standardized Generalized Dimensionality Discrepancy Measure and a Standardized Model-Based Covariance for Dimensionality Assessment for Multidimensional Models

    Science.gov (United States)

    Levy, Roy; Xu, Yuning; Yel, Nedim; Svetina, Dubravka

    2015-01-01

    The standardized generalized dimensionality discrepancy measure and the standardized model-based covariance are introduced as tools to critique dimensionality assumptions in multidimensional item response models. These tools are grounded in a covariance theory perspective and associated connections between dimensionality and local independence.…

  6. Scan-to-BIM Output Validation: Towards a Standardized Geometric Quality Assessment of Building Information Models Based on Point Clouds

    Science.gov (United States)

    Bonduel, M.; Bassier, M.; Vergauwen, M.; Pauwels, P.; Klein, R.

    2017-11-01

    The use of Building Information Modeling (BIM) for existing buildings based on point clouds is increasing. Standardized geometric quality assessment of the BIMs is needed to make them more reliable and thus reusable for future users. First, available literature on the subject is studied. Next, an initial proposal for a standardized geometric quality assessment is presented. Finally, this method is tested and evaluated with a case study. The number of specifications on BIM relating to existing buildings is limited. The Levels of Accuracy (LOA) specification of the USIBD provides definitions and suggestions regarding geometric model accuracy, but lacks a standardized assessment method. A deviation analysis is found to be dependent on (1) the used mathematical model, (2) the density of the point clouds and (3) the order of comparison. Results of the analysis can be graphical and numerical. An analysis on macro (building) and micro (BIM object) scale is necessary. On macro scale, the complete model is compared to the original point cloud and vice versa to get an overview of the general model quality. The graphical results show occluded zones and non-modeled objects respectively. Colored point clouds are derived from this analysis and integrated in the BIM. On micro scale, the relevant surface parts are extracted per BIM object and compared to the complete point cloud. Occluded zones are extracted based on a maximum deviation. What remains is classified according to the LOA specification. The numerical results are integrated in the BIM with the use of object parameters.
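    The deviation analysis described above reduces, at its core, to measuring how far each scan point lies from the model and binning the result against a tolerance. A toy cloud-to-model sketch (brute-force point-to-point distances; real tools use point-to-surface distances with spatial indexing, and the tolerances here are arbitrary):

```python
import math

def point_deviations(scan_points, model_points):
    """Brute-force nearest-neighbour distance from each scan point to the
    model, the core of a cloud-to-model deviation analysis. Production
    pipelines use a KD-tree and point-to-surface distances instead."""
    return [min(math.dist(p, q) for q in model_points) for p in scan_points]

def classify(deviations, tolerance, occlusion_cutoff):
    """Label each point: within tolerance, deviating, or occluded/non-modelled
    (beyond the cutoff, so excluded from the accuracy statistic)."""
    labels = []
    for d in deviations:
        if d > occlusion_cutoff:
            labels.append("occluded")
        elif d <= tolerance:
            labels.append("ok")
        else:
            labels.append("deviating")
    return labels

# Hypothetical model vertices and scan points (metres):
model = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
scan = [(0.0, 0.0, 0.005), (1.0, 0.0, 0.03), (5.0, 5.0, 0.0)]
devs = point_deviations(scan, model)
print(classify(devs, tolerance=0.01, occlusion_cutoff=1.0))
# → ['ok', 'deviating', 'occluded']
```

    Running the comparison in both directions, as the paper does on macro scale, distinguishes occluded zones (scan-to-model) from non-modelled objects (model-to-scan).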

  7. State Standards and State Assessment Systems: A Guide to Alignment. Series on Standards and Assessments.

    Science.gov (United States)

    La Marca, Paul M.; Redfield, Doris; Winter, Phoebe C.

    Alignment of content standards, performance standards, and assessments is crucial. This guide contains information to assist states and districts in aligning their assessment systems to their content and performance standards. It includes a review of current literature, both published and fugitive. The research is woven together with a few basic…

  8. Framework for Designing The Assessment Models of Readiness SMEs to Adopt Indonesian National Standard (SNI), Case Study: SMEs Batik in Surakarta

    Science.gov (United States)

    Fahma, Fakhrina; Zakaria, Roni; Fajar Gumilang, Royan

    2018-03-01

    Since the ASEAN Economic Community (AEC) was launched, the opportunity to expand market share has become very open, but the level of competition is also very high. Standardization is believed to be an important factor in seizing opportunities in the AEC era and under other free trade agreements in the future. Standardization activities in industry can be demonstrated by obtaining SNI (Indonesian National Standard) certification. This is a challenge for SMEs, considering that currently only 20% of SMEs hold SNI certification for either product or process. In this research, a model was designed to assess SMEs' readiness to obtain SNI certification. The stages of model development follow the innovation adoption approach of Rogers (2003). Variables that affect the readiness of SMEs were obtained from the product certification requirements established by BSN (National Standardization Agency) and LSPro (a certification body). The model is used to map SMEs' readiness for SNI certification of their products. The level of readiness of an SME is determined by its percentage of compliance with those requirements. Based on the results of this study, five variables were determined as the main aspects for assessing SME readiness. For model validation, trials were conducted on batik SMEs in Laweyan, Surakarta.

  9. Perceptual video quality assessment in H.264 video coding standard using objective modeling.

    Science.gov (United States)

    Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu

    2014-01-01

    Since usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute a perceptual video quality metric based on a no-reference method. Encoding processes such as intra and inter prediction introduce subtle differences between the original and the encoded video, degrading the quality of the encoded picture. The proposed model accounts for the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective mapping of these artifacts into a subjective quality estimate is proposed. The model calculates the objective quality metric using subjective impairments (blockiness, blur and jerkiness), in contrast to the bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metric is compared against popular full-reference objective methods as defined by VQEG.
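    Of the impairments such a model uses, blockiness is the most mechanical to estimate: luma jumps at 8-pixel block boundaries are compared with jumps elsewhere. A no-reference sketch of that idea (a simplification for illustration, not the paper's actual metric):

```python
def blockiness(frame, block=8):
    """Crude no-reference blockiness score: average absolute luma jump
    across block boundaries divided by the average jump elsewhere.
    Values well above 1 suggest visible blocking artifacts."""
    h, w = len(frame), len(frame[0])
    boundary, interior = [], []
    for y in range(h):
        for x in range(1, w):
            jump = abs(frame[y][x] - frame[y][x - 1])
            (boundary if x % block == 0 else interior).append(jump)
    mean = lambda v: sum(v) / len(v) if v else 0.0
    return mean(boundary) / (mean(interior) + 1e-9)

# Synthetic 16x16 luma frame made of two flat 8-pixel-wide blocks per row:
frame = [[50] * 8 + [80] * 8 for _ in range(16)]
score = blockiness(frame)
print(score > 1.0)  # strong jumps occur only at the block boundary
```

    Blur and jerkiness estimators follow the same pattern: compute a spatial or temporal statistic per frame, then map the three impairments into one quality score.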

  10. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  11. The Standard Model

    International Nuclear Information System (INIS)

    Sutton, Christine

    1994-01-01

    The initial evidence from Fermilab for the long awaited sixth ('top') quark puts another rivet in the already firm structure of today's Standard Model of physics. Analysis of the Fermilab CDF data gives a top mass of 174 GeV with an error of ten per cent either way. This falls within the mass band predicted by the sum total of world Standard Model data and underlines our understanding of physics in terms of six quarks and six leptons. In this specially commissioned overview, physics writer Christine Sutton explains the Standard Model

  12. The Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    Suggested Readings: Aspects of Quantum Chromodynamics/A Pich, arXiv:hep-ph/0001118. - The Standard Model of Electroweak Interactions/A Pich, arXiv:hep-ph/0502010. - The Standard Model of Particle Physics/A Pich. The Standard Model of Elementary Particle Physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.

  13. Standard deviation analysis of the mastoid fossa temperature differential reading: a potential model for objective chiropractic assessment.

    Science.gov (United States)

    Hart, John

    2011-03-01

    This study describes a model for statistically analyzing follow-up numeric-based chiropractic spinal assessments for an individual patient based on his or her own baseline. Ten mastoid fossa temperature differential readings (MFTD) obtained from a chiropractic patient were used in the study. The first eight readings served as baseline and were compared to post-adjustment readings. One of the two post-adjustment MFTD readings fell outside two standard deviations of the baseline mean and therefore theoretically represents improvement according to pattern analysis theory. This study showed how standard deviation analysis may be used to identify future outliers for an individual patient based on his or her own baseline data. Copyright © 2011 National University of Health Sciences. Published by Elsevier Inc. All rights reserved.
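    The pattern-analysis rule in this study is simple to state in code: a follow-up reading is flagged when it falls more than two standard deviations from the patient's own baseline mean. A sketch with invented MFTD values:

```python
from statistics import mean, stdev

def outside_two_sd(baseline, reading):
    """Flag a follow-up reading that falls outside two standard deviations
    of the patient's own baseline mean, per pattern analysis theory."""
    m, s = mean(baseline), stdev(baseline)
    return abs(reading - m) > 2 * s

# Eight hypothetical baseline MFTD readings (not the study's data):
baseline_mftd = [0.3, 0.5, 0.4, 0.6, 0.5, 0.4, 0.5, 0.6]
print(outside_two_sd(baseline_mftd, 0.5))   # within the baseline band
print(outside_two_sd(baseline_mftd, 1.2))   # flagged as an outlier
```

    Each patient serves as their own control, so the cut points adapt to individual baseline variability rather than to population norms.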

  14. Premise for Standardized Sepsis Models.

    Science.gov (United States)

    Remick, Daniel G; Ayala, Alfred; Chaudry, Irshad; Coopersmith, Craig M; Deutschman, Clifford; Hellman, Judith; Moldawer, Lyle; Osuchowski, Marcin

    2018-06-05

    Sepsis morbidity and mortality exact a toll on patients and contribute significantly to healthcare costs. Preclinical models of sepsis have been used to study disease pathogenesis and test new therapies, but divergent outcomes have been observed with the same treatment even when using the same sepsis model. Other disorders such as diabetes, cancer, malaria, obesity and cardiovascular diseases have used standardized, preclinical models that allow laboratories to compare results. Standardized models accelerate the pace of research and such models have been used to test new therapies or changes in treatment guidelines. The National Institutes of Health (NIH) mandated that investigators increase data reproducibility and the rigor of scientific experiments and has also issued research funding announcements about the development and refinement of standardized models. Our premise is that refinement and standardization of preclinical sepsis models may accelerate the development and testing of potential therapeutics for human sepsis, as has been the case with preclinical models for other disorders. As a first step towards creating standardized models, we suggest 1) standardizing the technical standards of the widely used cecal ligation and puncture model and 2) creating a list of appropriate organ injury and immune dysfunction parameters. Standardized sepsis models could enhance reproducibility and allow comparison of results between laboratories and may accelerate our understanding of the pathogenesis of sepsis.

  15. Shifting Gears: Standards, Assessments, Curriculum, & Instruction.

    Science.gov (United States)

    Dougherty, Eleanor

    This book is designed to help educators move from a system that measures students against students to one that values mastery of central concepts and skills, striving for proficiency in publicly acknowledged standards of academic performance. It aims to connect the operative parts of standards-based education (standards, assessment, curriculum,…

  16. Beyond the standard model

    International Nuclear Information System (INIS)

    Wilczek, F.

    1993-01-01

    The standard model of particle physics is highly successful, although it is obviously not a complete or final theory. In this presentation the author argues that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Essentially, this presentation is a record of the author's own judgement of what the central clues for physics beyond the standard model are, and also it is an attempt at some pedagogy. 14 refs., 6 figs

  17. Basic Laparoscopic Skills Assessment Study: Validation and Standard Setting among Canadian Urology Trainees.

    Science.gov (United States)

    Lee, Jason Y; Andonian, Sero; Pace, Kenneth T; Grober, Ethan

    2017-06-01

    As urology training programs move to a competency-based medical education model, iterative assessments with objective standards will be required. To develop a valid set of technical skills standards we initiated a national skills assessment study focusing initially on laparoscopic skills. Between February 2014 and March 2016 the basic laparoscopic skill of Canadian urology trainees and attending urologists was assessed using 4 standardized tasks from the AUA (American Urological Association) BLUS (Basic Laparoscopic Urological Surgery) curriculum, including peg transfer, pattern cutting, suturing and knot tying, and vascular clip applying. All performances were video recorded and assessed using 3 methods, including time and error based scoring, expert global rating scores and C-SATS (Crowd-Sourced Assessments of Technical Skill Global Rating Scale), a novel, crowd sourced assessment platform. Different methods of standard setting were used to develop pass-fail cut points. Six attending urologists and 99 trainees completed testing. Reported laparoscopic experience and training level correlated with performance, and multiple standard setting methods were used to define pass-fail cut points for all 4 AUA BLUS tasks. The 4 AUA BLUS tasks demonstrated good construct validity evidence for use in assessing basic laparoscopic skill. Performance scores using the novel C-SATS platform correlated well with traditional time-consuming methods of assessment. Various standard setting methods were used to develop pass-fail cut points for educators to use when making formative and summative assessments of basic laparoscopic skill. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
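    Standard setting methods such as those used here convert group performance data into a pass-fail cut point. One common approach, the contrasting-groups midpoint, can be sketched as follows (the scores below are hypothetical global ratings, not the study's data):

```python
from statistics import mean

def contrasting_groups_cut(expert_scores, novice_scores):
    """One common standard-setting approach: place the pass-fail cut point
    midway between the mean scores of a known-competent group and a
    novice group."""
    return (mean(expert_scores) + mean(novice_scores)) / 2

experts = [4.6, 4.8, 4.5, 4.9, 4.7]   # hypothetical attending ratings
novices = [2.9, 3.2, 3.0, 3.4, 3.1]   # hypothetical junior trainee ratings
cut = contrasting_groups_cut(experts, novices)
print(round(cut, 2))
```

    Other families of methods (e.g. Angoff-style judgments about a borderline candidate) set the cut from expert opinion rather than observed score distributions; studies like this one often report several and compare the resulting pass rates.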

  18. Beyond the standard model; Au-dela du modele standard

    Energy Technology Data Exchange (ETDEWEB)

    Cuypers, F. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs.

  19. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS) B. Kuiper asserted that the system architecture issue was resolved and presented a ''standard model''. The ''standard model'' consists of a local area network (Ethernet or FDDI) providing communication between front end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions, including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the ''standard model'' to determine whether the requirements of ''non-standard'' architectures can be met. Several possible extensions to the ''standard model'' are suggested, including software as well as hardware architectural features.

  20. Beyond the standard model

    International Nuclear Information System (INIS)

    Altarelli, G.

    1987-01-01

    The standard model of particle interactions is a complete and relatively simple theoretical framework which describes all the observed fundamental forces. It consists of quantum chromodynamics (QCD) and of the electro-weak theory of Glashow, Salam and Weinberg. The former is the theory of colored quarks and gluons, which underlies the observed phenomena of strong interactions; the latter leads to a unified description of electromagnetism and of weak interactions. The inclusion of the classical Einstein theory of gravity completes the set of established basic knowledge. The standard model is in agreement with essentially all of the experimental information, which is very rich by now. The recent discovery of the charged and neutral intermediate vector bosons of weak interactions at the expected masses has closed a really important chapter of particle physics. Never before was the prediction of new particles so neat and quantitatively precise. Yet the experimental proof of the standard model is not complete. For example, the hints of experimental evidence for the top quark at a mass ∼ 40 GeV have not yet been firmly established. The Higgs sector of the theory has not been tested at all. Beyond the realm of pure QED, even remaining within the electro-weak sector, the level of quantitative precision in testing the standard model does not exceed 5% or so. Furthermore, the standard model does not look like the ultimate theory. On closer inspection a large class of fundamental questions emerges, and one finds that a host of crucial problems are left open by the standard model

  1. Psychological distress and streamlined BreastScreen follow-up assessment versus standard assessment.

    Science.gov (United States)

    Sherman, Kerry A; Winch, Caleb J; Borecky, Natacha; Boyages, John

    2013-11-04

    To establish whether altered protocol characteristics of streamlined StepDown breast assessment clinics heightened or reduced the psychological distress of women in attendance compared with standard assessment. Willingness to attend future screening was also compared between the assessment groups. Observational, prospective study of women attending either a mammogram-only StepDown or a standard breast assessment clinic. Women completed questionnaires on the day of assessment and 1 month later. Women attending StepDown (136 women) or standard assessment clinics (148 women) at a BreastScreen centre between 10 November 2009 and 7 August 2010. Breast cancer worries; positive and negative psychological consequences of assessment (Psychological Consequences Questionnaire); breast cancer-related intrusion and avoidance (Impact of Event Scale); and willingness to attend, and uneasiness about, future screening. At 1-month follow-up, no group differences were evident between those attending standard and StepDown clinics on breast cancer worries (P = 0.44), positive (P = 0.88) and negative (P = 0.65) consequences, intrusion (P = 0.64), and avoidance (P = 0.87). Willingness to return for future mammograms was high, and did not differ between groups (P = 0.16), although higher levels of unease were associated with lessened willingness to rescreen (P = 0.04). There was no evidence that attending streamlined StepDown assessments had different outcomes in terms of distress than attending standard assessment clinics for women with a BreastScreen-detected abnormality. However, unease about attending future screening was generally associated with less willingness to do so in both groups; thus, there is a role for psycho-educational intervention to address these concerns.

  2. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy as a driver of change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, the danger of over-prescriptive standards, the balance needed between prescriptive and general standards, enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides cover NASA Procedural Requirements 8705.2B, which identifies human rating standards and requirements; draft health and medical standards for human rating; what has been done; government oversight models; examples of consistency from anthropometry; examples of inconsistency from air quality; and appendices of governmental and non-governmental human factors standards.

  3. General formulation of the standard model: the standard model is in need of new concepts

    International Nuclear Information System (INIS)

    Khodjaev, L.Sh.

    2001-01-01

    The phenomenological basis for the formulation of the Standard Model is reviewed, and the Standard Model is formulated from its fundamental postulates. The concept of fundamental symmetries is introduced: one should look not for fundamental particles but for fundamental symmetries. In searching for a more general theory, it is natural to look first for global symmetries and then to study the consequences of localizing these global symmetries, as in the Standard Model.

  4. Beyond the standard model

    International Nuclear Information System (INIS)

    Pleitez, V.

    1994-01-01

    The search for physics laws beyond the standard model is discussed in a general way, along with some topics in supersymmetric theories. Recent possibilities arising in the leptonic sector are also considered. Finally, models with SU(3)C x SU(2)L x U(1)Y symmetry are considered as alternative extensions of the standard model of elementary particles. 36 refs., 1 fig., 4 tabs

  5. Tests of Alignment among Assessment, Standards, and Instruction Using Generalized Linear Model Regression

    Science.gov (United States)

    Fulmer, Gavin W.; Polikoff, Morgan S.

    2014-01-01

    An essential component in school accountability efforts is for assessments to be well-aligned with the standards or curriculum they are intended to measure. However, relatively little prior research has explored methods to determine statistical significance of alignment or misalignment. This study explores analyses of alignment as a special case…

  6. The Cosmological Standard Model and Its Implications for Beyond the Standard Model of Particle Physics

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    While the cosmological standard model has many notable successes, it assumes 95% of the mass-energy density of the universe is dark and of unknown nature, and there was an early stage of inflationary expansion driven by physics far beyond the range of the particle physics standard model. In the colloquium I will discuss potential particle-physics implications of the standard cosmological model.

  7. Technical Standards on the Safety Assessment of a HLW Repository in Other Countries

    International Nuclear Information System (INIS)

    Lee, Sung Ho; Hwang, Yong Soo

    2009-01-01

    The basic function of a HLW disposal system is to prevent excessive radionuclides from leaking out of the repository in a short time. To achieve this, many technical standards should be developed and established for the components of the disposal system. Safety assessment of a repository is considered one of these technical standards, because it produces quantitative results on the future evolution of a repository based on a reasonably simplified model. In this paper, we investigated other countries' regulations related to safety assessment, focusing on the assessment period, radiation dose limits and uncertainties of the assessment. In particular, in investigating the USA regulations, we found that the US regulatory bodies' approach to the assessment period and peak dose is worth taking into account in case of a conflict between the peak dose from a safety assessment and the limit value in regulation.

  8. An alternative to the standard model

    International Nuclear Information System (INIS)

    Baek, Seungwon; Ko, Pyungwon; Park, Wan-Il

    2014-01-01

    We present an extension of the standard model to a dark sector with an unbroken local dark U(1)X symmetry. Including various singlet portal interactions provided by the standard model Higgs, right-handed neutrinos and kinetic mixing, we show that the model can address most phenomenological issues (inflation, neutrino mass and mixing, baryon number asymmetry, dark matter, direct/indirect dark matter searches, some small-scale puzzles of standard collisionless cold dark matter, vacuum stability of the standard model Higgs potential, dark radiation) and can be regarded as an alternative to the standard model. The Higgs signal strength is equal to one, as in the standard model, for the unbroken U(1)X case with a scalar dark matter, but it could be less than one independent of decay channels if the dark matter is a dark sector fermion or if U(1)X is spontaneously broken, because of mixing with a new neutral scalar boson in the models

  9. Standards, Assessments & Opting Out, Spring 2015

    Science.gov (United States)

    Advance Illinois, 2015

    2015-01-01

    In the spring, Illinois students will take new state assessments that reflect the rigor and relevance of the new Illinois Learning Standards. But some classmates will sit out and join the pushback against standardized testing. Opt-out advocates raise concerns about over-testing, and the resulting toll on students as well as the impact on classroom…

  10. Calculating Impacts of Energy Standards on Energy Demand in U.S. Buildings under Uncertainty with an Integrated Assessment Model: Technical Background Data

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daly, Don S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Ying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McJeon, Haewon C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moss, Richard H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Patel, Pralit L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Marty J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rice, Jennie S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhou, Yuyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-06

    This report presents data and assumptions employed in an application of PNNL's Global Change Assessment Model with a newly developed Monte Carlo analysis capability. The model is used to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The report provides a summary of how residential and commercial buildings are modeled, together with assumptions made for the distributions of state-level population, Gross Domestic Product (GDP) per worker, efficiency and cost of residential and commercial energy equipment by end use, and efficiency and cost of residential and commercial building shells. The cost and performance of equipment and of building shells are reported separately for current building and equipment efficiency standards and for more aggressive standards. The report also details assumptions concerning future improvements brought about by projected trends in technology.
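    The Monte Carlo capability described above propagates input uncertainty through the buildings model by repeated sampling. A toy illustration of the pattern (the distributions and numbers are illustrative only, not GCAM inputs):

```python
import random
import statistics

def simulate_demand(n_draws=10_000, seed=42):
    """Toy Monte Carlo over uncertain inputs, in the spirit of the report's
    analysis: building energy demand = floorspace growth x energy intensity,
    where an efficiency standard would shift the intensity distribution.
    All distributions and figures here are invented for illustration."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_draws):
        floorspace = rng.normalvariate(1.20, 0.10)    # growth multiplier
        intensity = rng.triangular(0.70, 1.00, 0.85)  # relative intensity under a code
        results.append(floorspace * intensity)
    return results

draws = simulate_demand()
print(round(statistics.mean(draws), 2))
print(round(statistics.quantiles(draws, n=20)[18], 2))  # 95th percentile
```

    Reporting a high percentile alongside the mean is what lets such studies quantify the risk that a standard under-delivers, rather than giving a single point estimate.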

  11. Performance Standards: Utility for Different Uses of Assessments

    Directory of Open Access Journals (Sweden)

    Robert L. Linn

    2003-09-01

    Full Text Available Performance standards are arguably one of the most controversial topics in educational measurement. There are uses of assessments such as licensure and certification where performance standards are essential. There are many other uses, however, where performance standards have been mandated or become the preferred method of reporting assessment results where the standards are not essential to the use. Distinctions between essential and nonessential uses of performance standards are discussed. It is argued that the insistence on reporting in terms of performance standards in situations where they are not essential has been more harmful than helpful. Variability in the definitions of proficient academic achievement by states for purposes of the No Child Left Behind Act of 2001 is discussed and it is argued that the variability is so great that characterizing achievement is meaningless. Illustrations of the great uncertainty in standards are provided.

  12. Beyond the Standard Model

    International Nuclear Information System (INIS)

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e⁺e⁻ colliders.

  14. Impact assessment of commodity standards

    NARCIS (Netherlands)

    Ruben, Ruerd

    2017-01-01

    Voluntary commodity standards are widely used to enhance the performance of tropical agro-food chains and to support the welfare and sustainability of smallholder farmers. Different methods and approaches are used to assess the effectiveness and impact of these certification schemes at

  15. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  16. The Standard Model

    Science.gov (United States)

    Burgess, Cliff; Moore, Guy

    2012-04-01

    List of illustrations; List of tables; Preface; Acknowledgments; Part I. Theoretical Framework: 1. Field theory review; 2. The standard model: general features; 3. Cross sections and lifetimes; Part II. Applications: Leptons: 4. Elementary boson decays; 5. Leptonic weak interactions: decays; 6. Leptonic weak interactions: collisions; 7. Effective Lagrangians; Part III. Applications: Hadrons: 8. Hadrons and QCD; 9. Hadronic interactions; Part IV. Beyond the Standard Model: 10. Neutrino masses; 11. Open questions, proposed solutions; Appendix A. Experimental values for the parameters; Appendix B. Symmetries and group theory review; Appendix C. Lorentz group and the Dirac algebra; Appendix D. ξ-gauge Feynman rules; Appendix E. Metric convention conversion table; Select bibliography; Index.

  17. Testing the standard model

    International Nuclear Information System (INIS)

    Gordon, H.; Marciano, W.; Williams, H.H.

    1982-01-01

    We summarize here the results of the standard model group which has studied the ways in which different facilities may be used to test in detail what we now call the standard model, that is SU_c(3) × SU(2) × U(1). The topics considered are: W±, Z⁰ mass and width; sin²θ_W and neutral current couplings; W⁺W⁻, Wγ; Higgs; QCD; toponium and naked quarks; glueballs; mixing angles; and heavy ions

  18. A Model for Semantic IS Standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert; Oude Luttighuis, Paul; van Hillegersberg, Jos

    2011-01-01

    We argue that, in order to suggest improvements of any kind to semantic information system (IS) standards, a better understanding of the conceptual structure of semantic IS standards is required. This study develops a model for semantic IS standards, based on literature and expert knowledge. The model

  19. Savannah River Site peer evaluator standards: Operator assessment for restart

    International Nuclear Information System (INIS)

    1990-01-01

    Savannah River Site has implemented a Peer Evaluator program for the assessment of certified Central Control Room Operators, Central Control Room Supervisors and Shift Technical Engineers prior to restart. This program is modeled after the Nuclear Regulatory Commission's (NRC's) Examiner Standard, ES-601, for the requalification of licensed operators in the commercial utility industry. It has been tailored to reflect the unique differences between Savannah River production reactors and commercial power reactors.

  20. Field theory and the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Dudas, E. [Orsay, LPT (France)]

    2014-07-01

    This brief introduction to Quantum Field Theory and the Standard Model contains the basic building blocks of perturbation theory in quantum field theory, an elementary introduction to gauge theories and the basic classical and quantum features of the electroweak sector of the Standard Model. Some details are given for the theoretical bias concerning the Higgs mass limits, as well as on obscure features of the Standard Model which motivate new physics constructions.

  1. Beyond the Standard Model of Cosmology

    International Nuclear Information System (INIS)

    Ellis, John; Nanopoulos, D. V.

    2004-01-01

    Recent cosmological observations of unprecedented accuracy, by WMAP in particular, have established a 'Standard Model' of cosmology, just as LEP established the Standard Model of particle physics. Both Standard Models raise open questions whose answers are likely to be linked. The most fundamental problems in both particle physics and cosmology will be resolved only within a framework for Quantum Gravity, for which the only game in town is string theory. We discuss novel ways to model cosmological inflation and late acceleration in a non-critical string approach, and discuss possible astrophysical tests.

  2. Standardizing Quality Assessment of Fused Remotely Sensed Images

    Directory of Open Access Journals (Sweden)

    C. Pohl

    2017-09-01

    Full Text Available The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment is done by different criteria. Depending on the criteria and indices, the result varies. It is therefore necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  4. Conference: STANDARD MODEL @ LHC

    CERN Multimedia

    2012-01-01

    HCØ Institute, Universitetsparken 5, DK-2100 Copenhagen Ø, Denmark. Room: Auditorium 2. STANDARD MODEL @ LHC, Niels Bohr International Academy and Discovery Center, 10-13 April 2012. This four-day meeting will bring together both experimental and theoretical aspects of Standard Model phenomenology at the LHC. The very latest results from the LHC experiments will be under discussion. Topics covered will be split into the following categories: QCD (hard, soft and PDFs); vector boson production; Higgs searches; top quark physics; flavour physics.

  5. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which was comprised of 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. 
The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients.
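The internal-consistency figure quoted above (above 0.80) is typically Cronbach's alpha, α = k/(k−1) · (1 − Σ s_i² / s_total²) over the k items. The sketch below computes it for a small synthetic set of 0/1/2 item scores; the data and sizes are invented for illustration, not taken from the study.

```python
# Cronbach's alpha for a (students x items) score matrix,
# where each item is scored 0, 1 or 2 as in the SAT-SPS rubric.
# Synthetic illustration only -- the scores below are made up.

def cronbach_alpha(scores):
    """scores: list of rows, one row per student, one column per item."""
    n_items = len(scores[0])

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([row[i] for row in scores]) for i in range(n_items)]
    total_var = variance([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical scores for 6 students on 4 items (0/1/2 scale).
scores = [
    [2, 2, 1, 2],
    [1, 1, 0, 1],
    [2, 2, 2, 2],
    [0, 1, 0, 0],
    [1, 2, 1, 1],
    [2, 1, 2, 2],
]
print(round(cronbach_alpha(scores), 3))  # → 0.893
```

A value above 0.80, as reported for the 27-item instrument, is conventionally taken to indicate good internal consistency.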

  6. Standard model without Higgs particles

    International Nuclear Information System (INIS)

    Kovalenko, S.G.

    1992-10-01

    A modification of the standard model of electroweak interactions with a nonlocal Higgs sector is proposed. A proper form of nonlocality makes Higgs particles unobservable after the electroweak symmetry breaking. They appear only as virtual states because their propagator is an entire function. We discuss some specific consequences of this approach, comparing it with the conventional standard model. (author). 12 refs

  7. Bounds on the Higgs mass in the standard model and the minimal supersymmetric standard model

    CERN Document Server

    Quiros, M.

    1995-01-01

    Depending on the Higgs-boson and top-quark masses, M_H and M_t, the effective potential of the Standard Model can develop a non-standard minimum for values of the field much larger than the weak scale. In those cases the standard minimum becomes metastable and the possibility of decay to the non-standard one arises. Comparison of the decay rate to the non-standard minimum at finite (and zero) temperature with the corresponding expansion rate of the Universe allows one to identify the region, in the (M_H, M_t) plane, where the Higgs field is sitting at the standard electroweak minimum. In the Minimal Supersymmetric Standard Model, approximate analytical expressions for the Higgs mass spectrum and couplings are worked out, providing an excellent approximation to the numerical results which include all next-to-leading-log corrections. An appropriate treatment of squark decoupling allows one to consider large values of the stop and/or sbottom mixing parameters and thus fix a reliable upper bound on the mass o...

  8. Dynamics of the standard model

    CERN Document Server

    Donoghue, John F; Holstein, Barry R

    2014-01-01

    Describing the fundamental theory of particle physics and its applications, this book provides a detailed account of the Standard Model, focusing on techniques that can produce information about real observed phenomena. The book begins with a pedagogic account of the Standard Model, introducing essential techniques such as effective field theory and path integral methods. It then focuses on the use of the Standard Model in the calculation of physical properties of particles. Rigorous methods are emphasized, but other useful models are also described. This second edition has been updated to include recent theoretical and experimental advances, such as the discovery of the Higgs boson. A new chapter is devoted to the theoretical and experimental understanding of neutrinos, and major advances in CP violation and electroweak physics have been given a modern treatment. This book is valuable to graduate students and researchers in particle physics, nuclear physics and related fields.

  9. Electroweak baryogenesis and the standard model

    International Nuclear Information System (INIS)

    Huet, P.

    1994-01-01

    Electroweak baryogenesis is addressed within the context of the standard model of particle physics. Although the minimal standard model has the means of fulfilling the three Sakharov conditions, it falls short of explaining the making of the baryon asymmetry of the universe. In particular, it is demonstrated that the phase of the CKM mixing matrix is an insufficient source of CP violation. The shortcomings of the standard model could be bypassed by enlarging the symmetry breaking sector and adding a new source of CP violation.

  10. Towards a standard model for research in agent-based modeling and simulation

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2015-11-01

    Full Text Available Agent-based modeling (ABM) is a bottom-up modeling approach, where each entity of the system being modeled is uniquely represented as an independent decision-making agent. ABMs are very sensitive to implementation details. Thus, it is very easy to inadvertently introduce changes which modify model dynamics. Such problems usually arise due to the lack of transparency in model descriptions, which constrains how models are assessed, implemented and replicated. In this paper, we present PPHPC, a model which aims to serve as a standard in agent-based modeling research, namely, but not limited to, conceptual model specification, statistical analysis of simulation output, model comparison and parallelization studies. This paper focuses on the first two aspects (conceptual model specification and statistical analysis of simulation output), also providing a canonical implementation of PPHPC. The paper serves as a complete reference to the presented model, and can be used as a tutorial for simulation practitioners who wish to improve the way they communicate their ABMs.

  11. The minimal non-minimal standard model

    International Nuclear Information System (INIS)

    Bij, J.J. van der

    2006-01-01

    In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed

  12. Alignment between South African mathematics assessment standards and the TIMSS assessment frameworks

    Directory of Open Access Journals (Sweden)

    Mdutshekelwa Ndlovu

    2012-12-01

    Full Text Available South Africa’s performance in international benchmark tests is a major cause for concern amongst educators and policymakers, raising questions about the effectiveness of the curriculum reform efforts of the democratic era. The purpose of the study reported in this article was to investigate the degree of alignment between the TIMSS 2003 Grade 8 Mathematics assessment frameworks and the Revised National Curriculum Statements (RNCS) assessment standards for Grade 8 Mathematics, later revised to become the Curriculum and Assessment Policy Statements (CAPS). Such an investigation could help to partly shed light on why South African learners do not perform well and point out discrepancies that need to be attended to. The methodology of document analysis was adopted for the study, with the RNCS and the TIMSS 2003 Grade 8 Mathematics frameworks forming the principal documents. Porter’s moderately complex index of alignment was adopted for its simplicity. The computed index of 0.751 for the alignment between the RNCS assessment standards and the TIMSS assessment objectives was found to be statistically significantly low, at the alpha level of 0.05, according to Fulmer’s critical values for 20 cells and 90 or 120 standard points. The study suggests that inadequate attention has been paid to the alignment of the South African mathematics curriculum to the successive TIMSS assessment frameworks in terms of the cognitive level descriptions. The study recommends that participation in TIMSS should rigorously and critically inform ongoing curriculum reform efforts.
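Porter's alignment index used above has a simple closed form: with x_i and y_i the proportions of standard points falling in content cell i for the two documents, P = 1 − (Σ_i |x_i − y_i|) / 2, so P = 1 means identical distributions and P = 0 means no overlap. A minimal sketch, with invented cell proportions rather than the study's 20-cell data:

```python
# Porter's alignment index between two documents, each summarized as
# the proportion of its standard points falling in each content cell.
# The cell proportions below are hypothetical, for illustration only.

def porter_alignment(x, y):
    """x, y: per-cell proportions; each sequence should sum to 1."""
    return 1 - sum(abs(a - b) for a, b in zip(x, y)) / 2

curriculum = [0.30, 0.25, 0.20, 0.15, 0.10]   # hypothetical RNCS-style profile
timss      = [0.20, 0.30, 0.25, 0.10, 0.15]   # hypothetical TIMSS-style profile

print(round(porter_alignment(curriculum, timss), 3))  # → 0.85
```

Whether a given value such as 0.751 counts as significantly low is then judged against critical values (Fulmer's, in the study) that depend on the number of cells and standard points.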

  13. Beyond the Standard Model

    International Nuclear Information System (INIS)

    Lykken, Joseph D.

    2010-01-01

    'BSM physics' is a phrase used in several ways. It can refer to physical phenomena established experimentally but not accommodated by the Standard Model, in particular dark matter and neutrino oscillations (technically also anything that has to do with gravity, since gravity is not part of the Standard Model). 'Beyond the Standard Model' can also refer to possible deeper explanations of phenomena that are accommodated by the Standard Model but only with ad hoc parameterizations, such as Yukawa couplings and the strong CP angle. More generally, BSM can be taken to refer to any possible extension of the Standard Model, whether or not the extension solves any particular set of puzzles left unresolved in the SM. In this general sense one sees reference to the BSM 'theory space' of all possible SM extensions, this being a parameter space of coupling constants for new interactions, new charges or other quantum numbers, and parameters describing possible new degrees of freedom or new symmetries. Despite decades of model-building it seems unlikely that we have mapped out most of, or even the most interesting parts of, this theory space. Indeed we do not even know what is the dimensionality of this parameter space, or what fraction of it is already ruled out by experiment. Since Nature is only implementing at most one point in this BSM theory space (at least in our neighborhood of space and time), it might seem an impossible task to map back from a finite number of experimental discoveries and measurements to a unique BSM explanation. Fortunately for theorists the inevitable limitations of experiments themselves, in terms of resolutions, rates, and energy scales, means that in practice there are only a finite number of BSM model 'equivalence classes' competing at any given time to explain any given set of results. 
BSM phenomenology is a two-way street: not only do experimental results test or constrain BSM models, they also suggest - to those who get close enough to listen

  15. Making Use of the New Student Assessment Standards To Enhance Technological Literacy.

    Science.gov (United States)

    Russell, Jill

    2003-01-01

    Describes the student assessment standards outlined in "Advancing Excellence in Technological Literacy: Student Assessment, Professional Development, and Program Standards," a companion to the "Standards for Technological Literacy." Discusses how the standards apply to everyday teaching practices. (JOW)

  16. The Standard Model and Higgs physics

    Science.gov (United States)

    Torassa, Ezio

    2018-05-01

    The Standard Model is a consistent and computable theory that successfully describes the elementary particle interactions. The strong, electromagnetic and weak interactions have been included in the theory exploiting the relation between group symmetries and group generators, in order to smartly introduce the force carriers. The group properties lead to constraints between boson masses and couplings. All the measurements performed at the LEP, Tevatron, LHC and other accelerators proved the consistency of the Standard Model. A key element of the theory is the Higgs field, which, together with the spontaneous symmetry breaking, gives mass to the vector bosons and to the fermions. Unlike the case of the vector bosons, the theory does not provide a prediction for the Higgs boson mass. The LEP experiments, while providing very precise measurements of the Standard Model theory, searched for evidence of the Higgs boson until the year 2000. The discovery of the top quark in 1995 by the Tevatron experiments and of the Higgs boson in 2012 by the LHC experiments were considered the completion of the fundamental particle list of the Standard Model theory. Nevertheless, neutrino oscillations, dark matter and the baryon asymmetry of the Universe are evidence that we need a new, extended model. In the Standard Model there are also some unattractive theoretical aspects, like the divergent loop corrections to the Higgs boson mass and the very small Yukawa couplings needed to describe the neutrino masses. For all these reasons, the hunt for discrepancies between the Standard Model and data is still going on, with the aim of finally describing the new extended theory.

  17. Naturalness of CP Violation in the Standard Model

    International Nuclear Information System (INIS)

    Gibbons, Gary W.; Gielen, Steffen; Pope, C. N.; Turok, Neil

    2009-01-01

    We construct a natural measure on the space of Cabibbo-Kobayashi-Maskawa matrices in the standard model, assuming the fermion mass matrices are randomly selected from a distribution which incorporates the observed quark mass hierarchy. This measure allows us to assess the likelihood of Jarlskog's CP violation parameter J taking its observed value J ≅ 3×10⁻⁵. We find that the observed value, while well below the mathematically allowed maximum, is in fact typical once the observed quark masses are assumed.
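The quoted value J ≅ 3×10⁻⁵ can be reproduced from the standard CKM parametrization, in which J = c12·c23·c13²·s12·s23·s13·sin δ. The sketch below checks this closed form against the quartet definition J = Im(V_us·V_cb·V_ub*·V_cs*); the mixing angles and phase are rough PDG-style values chosen for illustration, not the paper's inputs.

```python
import cmath
import math

# Rough, illustrative CKM parameters (standard parametrization):
# sines of the three mixing angles and the CP-violating phase delta.
s12, s23, s13, delta = 0.2250, 0.0420, 0.0037, 1.20
c12 = math.sqrt(1 - s12**2)
c23 = math.sqrt(1 - s23**2)
c13 = math.sqrt(1 - s13**2)

# Closed form for the Jarlskog invariant.
J_closed = c12 * c23 * c13**2 * s12 * s23 * s13 * math.sin(delta)

# The same invariant from a quartet of CKM matrix elements:
# J = Im(V_us * V_cb * conj(V_ub) * conj(V_cs)).
V_us = s12 * c13
V_cb = s23 * c13
V_ub = s13 * cmath.exp(-1j * delta)
V_cs = c12 * c23 - s12 * s23 * s13 * cmath.exp(1j * delta)
J_quartet = (V_us * V_cb * V_ub.conjugate() * V_cs.conjugate()).imag

print(f"{J_closed:.2e}")  # about 3e-05, consistent with the quoted value
```

The quartet form makes the rephasing invariance of J manifest, which is why it, rather than any single CKM phase convention, is the natural measure of CP violation here.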

  18. Leading the Transition from the Alternate Assessment Based on Modified Achievement Standards to the General Assessment

    Science.gov (United States)

    Lazarus, Sheryl S.; Rieke, Rebekah

    2013-01-01

    Schools are facing many changes in the ways that teaching, learning, and assessment take place. Most states are moving from individual state standards to the new Common Core State Standards, which will be fewer, higher, and more rigorous than most current state standards. As the next generation of assessments used for accountability are rolled…

  19. Standards of Ombudsman Assessment: A New Normative Concept?

    Directory of Open Access Journals (Sweden)

    Milan Remac

    2013-07-01

    Full Text Available Today, an ombudsman is a traditional component of democratic legal systems. Generally, reports of the ombudsman are not legally binding. Due to this fact, the ombudsman can rely only on his own persuasiveness, on his acceptance by individuals and state institutions, on the understanding of the administration and on the accessibility and transparency of rules that underpin his reports. During investigations, ombudsmen assess whether the administration has acted in accordance with certain legal or extra-legal standards. Depending on the legal system, ombudsmen can investigate whether there is an instance of maladministration in the activities of administrative bodies, whether the administration has acted ‘properly’, whether it has acted in accordance with the law, whether administrative actions have breached the human rights of complainants or whether the actions of the administration were in accordance with anti-corruption rules etc. Regardless of the legislative standard of an ombudsman’s control, the ombudsman should consider and assess the situation described in complaints against certain criteria or against certain normative standards. A distinct set of standards which ombudsmen use during their investigation, or at least a clear statement of their assessment criteria, can increase the transparency of their procedures and the persuasiveness of their reports. Are the normative standards used by different ombudsmen the same? Do they possibly create a new normative concept? And can it possibly lead to a higher acceptance of their reports by the administration?

  20. Standardized assessment of infrared thermographic fever screening system performance

    Science.gov (United States)

    Ghassemi, Pejhman; Pfefer, Joshua; Casamento, Jon; Wang, Quanzeng

    2017-03-01

Thermal modalities represent the only currently viable mass fever screening approach for outbreaks of infectious disease pandemics such as Ebola and SARS. Non-contact infrared thermometers (NCITs) and infrared thermographs (IRTs) have been previously used for mass fever screening in transportation hubs such as airports to reduce the spread of disease. While NCITs remain a more popular choice for fever screening in the field and at fixed locations, there has been increasing evidence in the literature that IRTs can provide greater accuracy in estimating core body temperature if appropriate measurement practices are applied, including the use of technically suitable thermographs. Therefore, the purpose of this study was to develop a battery of evaluation test methods for standardized, objective and quantitative assessment of thermograph performance characteristics critical to assessing suitability for clinical use. These factors include stability, drift, uniformity, minimum resolvable temperature difference, and accuracy. Two commercial IRT models were characterized. An external temperature reference source with high temperature accuracy was utilized as part of the screening thermograph. Results showed that both IRTs are relatively accurate and stable (<1% error of reading with stability of ±0.05 °C). Overall, results of this study may facilitate development of standardized consensus test methods to enable consistent and accurate use of IRTs for fever screening.
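The performance factors named in this abstract (stability, drift, accuracy) reduce to simple statistics over repeated readings of a temperature reference source. A minimal sketch, with hypothetical readings and illustrative metric definitions (the study's actual test protocol is not specified here):

```python
import statistics

def thermograph_metrics(readings, reference_temp):
    """Summarize IRT performance from repeated readings of a blackbody
    reference source held at reference_temp (degrees C). Metric
    definitions are illustrative, not a standards-body specification."""
    mean = statistics.mean(readings)
    accuracy_error = mean - reference_temp     # systematic offset from truth
    stability = max(readings) - min(readings)  # spread over the run
    drift = readings[-1] - readings[0]         # start-to-end change
    pct_error = 100 * abs(accuracy_error) / reference_temp
    return {"accuracy_error": accuracy_error,
            "stability": stability,
            "drift": drift,
            "pct_error": pct_error}

# Example: ten readings of a 37.0 C reference over one session
readings = [37.02, 37.04, 37.03, 37.05, 37.04,
            37.03, 37.05, 37.04, 37.06, 37.05]
m = thermograph_metrics(readings, 37.0)
```

With this data the drift is 0.03 C and the stability band 0.04 C, both well inside the ±0.05 °C figure reported in the abstract.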

  1. Beyond the standard model

    International Nuclear Information System (INIS)

    Cuypers, F.

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs

  2. The standard model in a nutshell

    CERN Document Server

    Goldberg, Dave

    2017-01-01

    For a theory as genuinely elegant as the Standard Model--the current framework describing elementary particles and their forces--it can sometimes appear to students to be little more than a complicated collection of particles and ranked list of interactions. The Standard Model in a Nutshell provides a comprehensive and uncommonly accessible introduction to one of the most important subjects in modern physics, revealing why, despite initial appearances, the entire framework really is as elegant as physicists say. Dave Goldberg uses a "just-in-time" approach to instruction that enables students to gradually develop a deep understanding of the Standard Model even if this is their first exposure to it. He covers everything from relativity, group theory, and relativistic quantum mechanics to the Higgs boson, unification schemes, and physics beyond the Standard Model. The book also looks at new avenues of research that could answer still-unresolved questions and features numerous worked examples, helpful illustrat...

  3. Analysis of third-party certification approaches using an occupational health and safety conformity-assessment model.

    Science.gov (United States)

    Redinger, C F; Levine, S P

    1998-11-01

    The occupational health and safety conformity-assessment model presented in this article was developed (1) to analyze 22 public and private programs to determine the extent to which these programs use third parties in conformity-assessment determinations, and (2) to establish a framework to guide future policy developments related to the use of third parties in occupational health and safety conformity-assessment activities. The units of analysis for this study included select Occupational Safety and Health Administration programs and standards, International Organization for Standardization-based standards and guidelines, and standards and guidelines developed by nongovernmental bodies. The model is based on a 15-cell matrix that categorizes first-, second-, and third-party activities in terms of assessment, accreditation, and accreditation-recognition activities. The third-party component of the model has three categories: industrial hygiene/safety testing and sampling; product, equipment, and laboratory certification; and, occupational health and safety management system registration/certification. Using the model, 16 of the 22 programs were found to have a third-party component in their conformity-assessment structure. The analysis revealed that (1) the model provides a useful means to describe and analyze various third-party approaches, (2) the model needs modification to capture aspects of traditional governmental conformity-assessment/enforcement activities, and (3) several existing third-party conformity-assessment systems offer robust models that can guide future third-party policy formulation and implementation activities.

  4. Custom v. Standardized Risk Models

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-05-01

Full Text Available We discuss when and why custom multi-factor risk models are warranted and give source code for computing some risk factors. Pension/mutual funds do not require customization but standardization. However, using standardized risk models in quant trading with much shorter holding horizons is suboptimal: (1) longer-horizon risk factors (value, growth, etc.) increase noise trades and trading costs; (2) arbitrary risk factors can neutralize alpha; (3) “standardized” industries are artificial and insufficiently granular; (4) normalization of style risk factors is lost for the trading universe; (5) diversifying risk models lowers P&L correlations, reduces turnover and market impact, and increases capacity. We discuss various aspects of custom risk model building.
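The abstract mentions source code for computing risk factors. As an illustration of the point about normalization over the trading universe, here is a hedged sketch; the function name and factor data are invented for this example, not taken from the paper:

```python
import numpy as np

def normalize_factor(raw, universe_mask):
    """Cross-sectionally normalize a raw style factor over the trading
    universe only. Normalizing over a broader universe than the one
    actually traded loses the zero-mean/unit-variance property on the
    names that matter. Illustrative sketch, not the paper's code."""
    x = raw[universe_mask].astype(float)
    z = (x - x.mean()) / x.std(ddof=0)       # zero mean, unit variance
    out = np.zeros_like(raw, dtype=float)    # names outside the universe get 0
    out[universe_mask] = z
    return out

raw = np.array([1.0, 2.0, 3.0, 4.0, 100.0])      # last name is not traded
mask = np.array([True, True, True, True, False])
z = normalize_factor(raw, mask)
```

Including the untraded outlier in the normalization would shift and stretch the scores of the four traded names; restricting to the mask keeps them properly standardized.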

  5. The standard model and beyond

    CERN Document Server

    Langacker, Paul

    2017-01-01

    This new edition of The Standard Model and Beyond presents an advanced introduction to the physics and formalism of the standard model and other non-abelian gauge theories. It provides a solid background for understanding supersymmetry, string theory, extra dimensions, dynamical symmetry breaking, and cosmology. In addition to updating all of the experimental and phenomenological results from the first edition, it contains a new chapter on collider physics; expanded discussions of Higgs, neutrino, and dark matter physics; and many new problems. The book first reviews calculational techniques in field theory and the status of quantum electrodynamics. It then focuses on global and local symmetries and the construction of non-abelian gauge theories. The structure and tests of quantum chromodynamics, collider physics, the electroweak interactions and theory, and the physics of neutrino mass and mixing are thoroughly explored. The final chapter discusses the motivations for extending the standard model and examin...

  6. Perspectives in the standard model

    International Nuclear Information System (INIS)

    Ellis, R.K.; Hill, C.T.; Lykken, J.D.

    1992-01-01

Particle physics is an experimentally based science, with a need for the best theorists to make contact with data and to enlarge and enhance their theoretical descriptions as the subject evolves. The authors felt it imperative that the TASI (Theoretical Advanced Study Institute) program reflect this need. The goal of this conference was to provide the students with a comprehensive look at the current understanding of the standard model, as well as the techniques which promise to advance that understanding in the future. Topics covered include: symmetry breaking in the standard model; physics beyond the standard model; chiral effective Lagrangians; semi-classical string theory; renormalization of electroweak gauge interactions; electroweak experiments at LEP; the CKM matrix and CP violation; axion searches; lattice QCD; perturbative QCD; heavy quark effective field theory; heavy flavor physics on the lattice; and neutrinos. Separate abstracts were prepared for 13 papers in this conference

  7. Testing the Standard Model

    CERN Document Server

    Riles, K

    1998-01-01

The Large Electron-Positron (LEP) collider near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.

  8. 24 CFR 115.206 - Performance assessments; Performance standards.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Performance assessments; Performance standards. 115.206 Section 115.206 Housing and Urban Development Regulations Relating to Housing... AGENCIES Certification of Substantially Equivalent Agencies § 115.206 Performance assessments; Performance...

  9. The impact of statistical adjustment on conditional standard errors of measurement in the assessment of physician communication skills.

    Science.gov (United States)

    Raymond, Mark R; Clauser, Brian E; Furman, Gail E

    2010-10-01

The use of standardized patients to assess communication skills is now an essential part of assessing a physician's readiness for practice. To improve the reliability of communication scores, it has become increasingly common in recent years to use statistical models to adjust ratings provided by standardized patients. This study employed ordinary least squares regression to adjust ratings, and then used generalizability theory to evaluate the impact of these adjustments on score reliability and the overall standard error of measurement. In addition, conditional standard errors of measurement were computed for both observed and adjusted scores to determine whether the improvements in measurement precision were uniform across the score distribution. Results indicated that measurement was generally less precise for communication ratings toward the lower end of the score distribution, and the improvement in measurement precision afforded by statistical modeling varied slightly across the score distribution such that the most improvement occurred in the upper-middle range of the score scale. Possible reasons for these patterns in measurement precision are discussed, as are the limitations of the statistical models used for adjusting performance ratings.
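The ordinary-least-squares adjustment described above can be illustrated with a toy version: in a balanced design, removing rater severity via an OLS dummy regression reduces to centering each standardized patient's ratings on the grand mean. A sketch with invented scores, not the study's model or data:

```python
import numpy as np

def adjust_ratings(ratings, rater_ids):
    """Remove rater (standardized-patient) severity from ratings.
    In a balanced one-way design, OLS with rater dummies is equivalent
    to subtracting each rater's mean offset from the grand mean.
    Illustrative sketch only."""
    ratings = np.asarray(ratings, dtype=float)
    grand_mean = ratings.mean()
    adjusted = np.empty_like(ratings)
    for r in set(rater_ids):
        idx = [i for i, rid in enumerate(rater_ids) if rid == r]
        severity = ratings[idx].mean() - grand_mean   # rater's offset
        adjusted[idx] = ratings[idx] - severity
    return adjusted

# Hypothetical: rater A is lenient-to-average, rater B is strict-to-lenient
scores = [6, 7, 5, 9, 10, 8]
raters = ["A", "A", "A", "B", "B", "B"]
adj = adjust_ratings(scores, raters)
```

After adjustment the two raters' examinees are on a common scale: the within-rater ordering is preserved while the between-rater severity difference is removed.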

  10. 7 CFR 319.40-11 - Plant pest risk assessment standards.

    Science.gov (United States)

    2010-01-01

    ... analysis to determine the plant pest risks associated with each requested importation in order to determine... 7 Agriculture 5 2010-01-01 2010-01-01 false Plant pest risk assessment standards. 319.40-11... Unmanufactured Wood Articles § 319.40-11 Plant pest risk assessment standards. When evaluating a request to...

  11. Toward a consistent modeling framework to assess multi-sectoral climate impacts.

    Science.gov (United States)

    Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin

    2018-02-13

Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets, and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.

  12. The standard model and colliders

    International Nuclear Information System (INIS)

    Hinchliffe, I.

    1987-03-01

    Some topics in the standard model of strong and electroweak interactions are discussed, as well as how these topics are relevant for the high energy colliders which will become operational in the next few years. The radiative corrections in the Glashow-Weinberg-Salam model are discussed, stressing how these corrections may be measured at LEP and the SLC. CP violation is discussed briefly, followed by a discussion of the Higgs boson and the searches which are relevant to hadron colliders are then discussed. Some of the problems which the standard model does not solve are discussed, and the energy ranges accessible to the new colliders are indicated

  13. Quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert

    2011-01-01

Semantic IS (Information Systems) standards are essential for achieving interoperability between organizations. However, a recent survey suggests that the full benefits of standards are not being achieved, due to quality issues. This paper presents a quality model for semantic IS standards that should

  14. Assessment of Safety Standards for Automotive Electronic Control Systems

    Science.gov (United States)

    2016-06-01

    This report summarizes the results of a study that assessed and compared six industry and government safety standards relevant to the safety and reliability of automotive electronic control systems. These standards include ISO 26262 (Road Vehicles - ...

  15. Standard model of knowledge representation

    Science.gov (United States)

    Yin, Wensheng

    2016-09-01

Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic network, computer programming language, database, mathematical model, graphics language, natural language, etc. To establish the intrinsic link between various knowledge representation methods, a unified knowledge representation model is necessary. According to ontology, system theory, and control theory, a standard model of knowledge representation that reflects the change of the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods. It can express knowledge in multivariate and multidimensional terms. It can also express process knowledge, and at the same time, it has a strong ability to solve problems. In addition, the standard model of knowledge representation provides a way to solve problems of imprecise and inconsistent knowledge.
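The input-processing-output structure described above can be sketched as a tiny data type; the class, names, and example rule are illustrative only, not the paper's formalism:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class KnowledgeUnit:
    """Minimal input-processing-output knowledge unit, after the
    three-part structure described in the abstract: knowledge is
    represented as a named transformation from inputs to outputs."""
    name: str
    process: Callable[[Any], Any]

    def apply(self, inputs: Any) -> Any:
        # "Processing": applying the stored transformation to the input
        return self.process(inputs)

# Procedural knowledge ("how to convert Celsius to Fahrenheit") as a unit:
c_to_f = KnowledgeUnit("celsius_to_fahrenheit", lambda c: c * 9 / 5 + 32)
result = c_to_f.apply(100)   # -> 212.0
```

Representing a rule as an executable transformation is one way to capture the "process knowledge" the abstract says the model can express, alongside declarative facts.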

  16. Extensions of the Standard Model

    CERN Document Server

    Zwirner, Fabio

    1996-01-01

    Rapporteur talk at the International Europhysics Conference on High Energy Physics, Brussels (Belgium), July 27-August 2, 1995. This talk begins with a brief general introduction to the extensions of the Standard Model, reviewing the ideology of effective field theories and its practical implications. The central part deals with candidate extensions near the Fermi scale, focusing on some phenomenological aspects of the Minimal Supersymmetric Standard Model. The final part discusses some possible low-energy implications of further extensions near the Planck scale, namely superstring theories.

  17. Non-commutative standard model: model building

    CERN Document Server

    Chaichian, Masud; Presnajder, P

    2003-01-01

A non-commutative version of the usual electro-weak theory is constructed. We discuss how to overcome the two major problems: (1) although we can have non-commutative U(n) (which we denote by U_*(n)) gauge theory, we cannot have non-commutative SU(n), and (2) the charges in non-commutative QED are quantized to just 0 and ±1. We show how the latter problem with charge quantization, as well as with the gauge group, can be resolved by taking the U_*(3) x U_*(2) x U_*(1) gauge group and reducing the extra U(1) factors in an appropriate way. Then we proceed with building the non-commutative version of the standard model by specifying the proper representations for the entire particle content of the theory: the gauge bosons, the fermions and the Higgs. We also present the full action for the non-commutative standard model (NCSM). In addition, among several peculiar features of our model, we address the inherent CP violation and new neutrino interactions. (orig.)

  18. ASK Standards: Assessment, Skills, and Knowledge Content Standards for Student Affairs Practitioners and Scholars

    Science.gov (United States)

    ACPA College Student Educators International, 2011

    2011-01-01

    The Assessment Skills and Knowledge (ASK) standards seek to articulate the areas of content knowledge, skill and dispositions that student affairs professionals need in order to perform as practitioner-scholars to assess the degree to which students are mastering the learning and development outcomes the professionals intend. Consistent with…

  19. JPL Thermal Design Modeling Philosophy and NASA-STD-7009 Standard for Models and Simulations - A Case Study

    Science.gov (United States)

    Avila, Arturo

    2011-01-01

    The Standard JPL thermal engineering practice prescribes worst-case methodologies for design. In this process, environmental and key uncertain thermal parameters (e.g., thermal blanket performance, interface conductance, optical properties) are stacked in a worst case fashion to yield the most hot- or cold-biased temperature. Thus, these simulations would represent the upper and lower bounds. This, effectively, represents JPL thermal design margin philosophy. Uncertainty in the margins and the absolute temperatures is usually estimated by sensitivity analyses and/or by comparing the worst-case results with "expected" results. Applicability of the analytical model for specific design purposes along with any temperature requirement violations are documented in peer and project design review material. In 2008, NASA released NASA-STD-7009, Standard for Models and Simulations. The scope of this standard covers the development and maintenance of models, the operation of simulations, the analysis of the results, training, recommended practices, the assessment of the Modeling and Simulation (M&S) credibility, and the reporting of the M&S results. The Mars Exploration Rover (MER) project thermal control system M&S activity was chosen as a case study determining whether JPL practice is in line with the standard and to identify areas of non-compliance. This paper summarizes the results and makes recommendations regarding the application of this standard to JPL thermal M&S practices.

  20. Assessment of the Impacts of Standards and Labeling Programs inMexico (four products).

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Itha; Pulido, Henry; McNeil, Michael A.; Turiel, Isaac; della Cava, Mirka

    2007-06-12

    This study analyzes impacts from energy efficiency standards and labeling in Mexico from 1994 through 2005 for four major products: household refrigerators, room air conditioners, three-phase (squirrel cage) induction motors, and clothes washers. It is a retrospective analysis, seeking to assess verified impacts on product efficiency in the Mexican market in the first ten years after standards were implemented. Such an analysis allows the Mexican government to compare actual to originally forecast program benefits. In addition, it provides an extremely valuable benchmark for other countries considering standards, and to the energy policy community as a whole. The methodology for evaluation begins with historical test data taken for a large number of models of each product type between 1994 and 2005. The pre-standard efficiency of models in 1994 is taken as a baseline throughout the analysis. Model efficiency data were provided by an independent certification laboratory (ANCE), which tested products as part of the certification and enforcement mechanism defined by the standards program. Using this data, together with economic and market data provided by both government and private sector sources, the analysis considers several types of national level program impacts. These include: Energy savings; Environmental (emissions) impacts, and Net financial impacts to consumers, manufacturers and utilities. Energy savings impacts are calculated using the same methodology as the original projections, allowing a comparison. Other impacts are calculated using a robust and sophisticated methodology developed by the Instituto de Investigaciones Electricas (IIE) and Lawrence Berkeley National Laboratory (LBNL), in a collaboration supported by the Collaborative Labeling and Standards Program (CLASP).
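The first step of the savings methodology, comparing per-unit energy consumption before and after a standard takes effect, can be sketched as follows; the numbers are hypothetical, not the study's Mexican market data:

```python
def annual_savings_gwh(baseline_uec_kwh, standard_uec_kwh, units_sold):
    """First-order national savings estimate for one year of shipments:
    per-unit consumption difference times units sold, in GWh/yr.
    A simplified sketch of the idea; the study's full methodology also
    tracks stock turnover, emissions, and financial impacts."""
    return (baseline_uec_kwh - standard_uec_kwh) * units_sold / 1e6

# Hypothetical refrigerators: 650 kWh/yr pre-standard vs 450 kWh/yr after,
# with 1.5 million units sold in a year
savings = annual_savings_gwh(650, 450, 1_500_000)   # -> 300.0 GWh/yr
```

Summing this quantity over each year's shipments (and over the appliance lifetime of each cohort) yields the cumulative national savings that retrospective analyses like this one compare against the original forecasts.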

  1. Beyond the standard model

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1990-04-01

    The unresolved issues of the standard model are reviewed, with emphasis on the gauge hierarchy problem. A possible mechanism for generating a hierarchy in the context of superstring theory is described. 24 refs

  2. Mechanistic effect modeling for ecological risk assessment: where to go from here?

    Science.gov (United States)

    Grimm, Volker; Martin, Benjamin T

    2013-07-01

    Mechanistic effect models (MEMs) consider the mechanisms of how chemicals affect individuals and ecological systems such as populations and communities. There is an increasing awareness that MEMs have high potential to make risk assessment of chemicals more ecologically relevant than current standard practice. Here we discuss what kinds of MEMs are needed to improve scientific and regulatory aspects of risk assessment. To make valid predictions for a wide range of environmental conditions, MEMs need to include a sufficient amount of emergence, for example, population dynamics emerging from what individual organisms do. We present 1 example where the life cycle of individuals is described using Dynamic Energy Budget theory. The resulting individual-based population model is thus parameterized at the individual level but correctly predicts multiple patterns at the population level. This is the case for both control and treated populations. We conclude that the state-of-the-art in mechanistic effect modeling has reached a level where MEMs are robust and predictive enough to be used in regulatory risk assessment. Mechanistic effect models will thus be used to advance the scientific basis of current standard practice and will, if their development follows Good Modeling Practice, be included in a standardized way in future regulatory risk assessments. Copyright © 2013 SETAC.
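The abstract's point that population dynamics should emerge from what individual organisms do can be illustrated with a toy individual-based model; all parameter values below are invented for illustration and are not Dynamic Energy Budget estimates:

```python
import random

class Individual:
    """Toy organism: grows by feeding, reproduces past a size threshold,
    dies with a small background probability. A chemical stressor is
    modeled as reduced assimilation. Parameters are illustrative."""
    def __init__(self, size=1.0):
        self.size = size

    def step(self, food, toxicant=0.0):
        self.size += food * (1.0 - toxicant)   # toxicant cuts growth
        offspring = []
        if self.size >= 4.0:                   # mature: reproduce, pay a cost
            self.size -= 2.0
            offspring.append(Individual())
        alive = random.random() > 0.05         # background mortality
        return alive, offspring

def run(pop_size=20, steps=50, toxicant=0.0, seed=1):
    """Population size after `steps` time steps; dynamics emerge from
    the individual-level rules above, nothing is imposed at the
    population level."""
    random.seed(seed)
    pop = [Individual() for _ in range(pop_size)]
    for _ in range(steps):
        next_pop = []
        for ind in pop:
            alive, kids = ind.step(food=0.5, toxicant=toxicant)
            if alive:
                next_pop.append(ind)
            next_pop.extend(kids)
        pop = next_pop
    return len(pop)
```

Running `run(toxicant=0.0)` against `run(toxicant=0.5)` shows a population-level effect (slower growth under exposure) that was never coded directly: it emerges from the individual-level assimilation rule, which is the structural property the authors argue MEMs need.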

  3. NUSS safety standards: A critical assessment

    International Nuclear Information System (INIS)

    Minogue, R.B.

    1985-01-01

    The NUSS safety standards are based on systematic review of safety criteria of many countries in a process carefully defined to assure completeness of coverage. They represent an international consensus of accepted safety principles and practices for regulation and for the design, construction, and operation of nuclear power plants. They are a codification of principles and practices already in use by some Member States. Thus, they are not standards which describe methodologies at their present state of evolution as a result of more recent experience and improvements in technological understanding. The NUSS standards assume an underlying body of national standards and a defined technological base. Detailed design and industrial practices vary between countries and the implementation of basic safety standards within countries has taken approaches that conform with national industrial practices. Thus, application of the NUSS standards requires reconciliation with the standards of the country where the reactor will be built as well as with the country from which procurement takes place. Experience in making that reconciliation will undoubtedly suggest areas of needed improvement. After the TMI accident a reassessment of the NUSS programme was made and it was concluded that, given the information at that time and the then level of technology, the basic approach was sound; the NUSS programme should be continued to completion, and the standards should be brought into use. It was also recognized, however, that in areas such as probabilistic risk assessment, human factors methodology, and consideration of detailed accident sequences, more advanced technology was emerging. As these technologies develop, and become more amenable to practical application, it is anticipated that the NUSS standards will need revision. Ideally those future revisions will also flow from experience in their use

  4. Standard Model festival

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-10-15

    The 'Standard Model' of modern particle physics, with the quantum chromodynamics (QCD) theory of inter-quark forces superimposed on the unified electroweak picture, is still unchallenged, but it is not the end of physics. This was the message at the big International Symposium on Lepton and Photon Interactions at High Energies, held in Hamburg from 27-31 July.

  5. Development of a standard equipment management model for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hee Seung; Ju, Tae Young; Kim, Jung Wun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2012-10-15

    Most utilities that have achieved high performance have introduced a management model to improve performance and operate plants safely. The Nuclear Energy Institute has developed and updated its Standard Nuclear Performance Model (SNPM) in order to provide a summary of nuclear processes, cost definitions, and key business performance measures for business performance comparison and benchmarking. Over the past decade, Korea Hydro and Nuclear Power Co. (KHNP) has introduced and implemented many engineering processes such as Equipment Reliability (ER), Maintenance Rule (MR), Single Point Vulnerability (SPV), Corrective Action Program (CAP), and Self Assessment (SA) to improve plant performance and to sustain high performance. Some processes, however, are not well interfaced with other processes, because they were developed separately and were focused on the process itself. KHNP is developing a Standard Equipment Management Model (SEMM) to integrate these engineering processes and to improve the interrelation among the processes. In this paper, a draft model and attributes of the SEMM are discussed.

  6. Development of a standard equipment management model for nuclear power plants

    International Nuclear Information System (INIS)

    Chang, Hee Seung; Ju, Tae Young; Kim, Jung Wun

    2012-01-01

    Most utilities that have achieved high performance have introduced a management model to improve performance and operate plants safely. The Nuclear Energy Institute has developed and updated its Standard Nuclear Performance Model (SNPM) in order to provide a summary of nuclear processes, cost definitions, and key business performance measures for business performance comparison and benchmarking. Over the past decade, Korea Hydro and Nuclear Power Co. (KHNP) has introduced and implemented many engineering processes such as Equipment Reliability (ER), Maintenance Rule (MR), Single Point Vulnerability (SPV), Corrective Action Program (CAP), and Self Assessment (SA) to improve plant performance and to sustain high performance. Some processes, however, are not well interfaced with other processes, because they were developed separately and were focused on the process itself. KHNP is developing a Standard Equipment Management Model (SEMM) to integrate these engineering processes and to improve the interrelation among the processes. In this paper, a draft model and attributes of the SEMM are discussed

  7. Neutrinos: in and out of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen; /Fermilab

    2006-07-01

    The particle physics Standard Model has been tremendously successful in predicting the outcome of a large number of experiments. In this model Neutrinos are massless. Yet recent evidence points to the fact that neutrinos are massive particles with tiny masses compared to the other particles in the Standard Model. These tiny masses allow the neutrinos to change flavor and oscillate. In this series of Lectures, I will review the properties of Neutrinos In the Standard Model and then discuss the physics of Neutrinos Beyond the Standard Model. Topics to be covered include Neutrino Flavor Transformations and Oscillations, Majorana versus Dirac Neutrino Masses, the Seesaw Mechanism and Leptogenesis.
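The flavor oscillations mentioned in these lectures follow, in the two-flavor vacuum case, the standard textbook probability formula P = sin²(2θ) sin²(1.27 Δm²[eV²] L[km] / E[GeV]); the parameter values below are illustrative:

```python
import math

def p_oscillation(theta, delta_m2_ev2, L_km, E_GeV):
    """Two-flavor vacuum oscillation probability:
    P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E),
    with dm^2 in eV^2, L in km, E in GeV (standard textbook form)."""
    return math.sin(2 * theta) ** 2 * \
           math.sin(1.27 * delta_m2_ev2 * L_km / E_GeV) ** 2

# Maximal mixing (theta = pi/4) at the first oscillation maximum:
theta = math.pi / 4
dm2 = 2.5e-3                         # eV^2, atmospheric-scale splitting
E = 1.0                              # GeV
L = math.pi / 2 / (1.27 * dm2) * E   # baseline of the first maximum, ~495 km
p = p_oscillation(theta, dm2, L, E)
```

At this baseline the transition probability reaches its maximum of sin²(2θ) = 1; with θ = 0 (no mixing, as in the massless-neutrino Standard Model) the probability is identically zero, which is why observed oscillations imply physics beyond it.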

  8. LHC Higgs physics beyond the Standard Model

    International Nuclear Information System (INIS)

    Spannowsky, M.

    2007-01-01

The Large Hadron Collider (LHC) at CERN will be able to perform proton collisions at a much higher center-of-mass energy and luminosity than any other collider. Its main purpose is to detect the Higgs boson, the last unobserved particle of the Standard Model, explaining the riddle of the origin of mass. Studies have shown that, for the whole allowed region of the Higgs mass, processes exist to detect the Higgs at the LHC. However, the Standard Model cannot be a theory of everything and is not able to provide a complete understanding of physics. It is at most an effective theory up to a presently unknown energy scale. Hence, extensions of the Standard Model are necessary which can affect the Higgs-boson signals. We discuss these effects in two popular extensions of the Standard Model: the Minimal Supersymmetric Standard Model (MSSM) and the Standard Model with four generations (SM4G). Constraints on these models come predominantly from flavor physics and electroweak precision measurements. We show that the SM4G is still viable and that a fourth generation has a strong impact on decay and production processes of the Higgs boson. Furthermore, we study the charged Higgs boson in the MSSM, yielding a clear signal for physics beyond the Standard Model. For small tan β in minimal flavor violation (MFV), no processes exist for the detection of a charged Higgs boson at the LHC. However, MFV is motivated only by the experimental agreement of results from flavor physics with Standard Model predictions, not by any basic theoretical consideration. In this thesis, we calculate charged Higgs boson production cross sections beyond the assumption of MFV, where a large number of free parameters is present in the MSSM. We find that the soft-breaking parameters which most enhance charged-Higgs boson production are bound to large values, e.g. by rare B-meson decays. Although the charged-Higgs boson cross sections beyond MFV turn out to be sizeable, only a detailed

  9. LHC Higgs physics beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Spannowsky, M.

    2007-09-22

    The Large Hadron Collider (LHC) at CERN will be able to perform proton collisions at a much higher center-of-mass energy and luminosity than any other collider. Its main purpose is to detect the Higgs boson, the last unobserved particle of the Standard Model, explaining the riddle of the origin of mass. Studies have shown that, for the whole allowed region of the Higgs mass, processes exist to detect the Higgs at the LHC. However, the Standard Model cannot be a theory of everything and is not able to provide a complete understanding of physics. It is at most an effective theory up to a presently unknown energy scale. Hence, extensions of the Standard Model are necessary which can affect the Higgs-boson signals. We discuss these effects in two popular extensions of the Standard Model: the Minimal Supersymmetric Standard Model (MSSM) and the Standard Model with four generations (SM4G). Constraints on these models come predominantly from flavor physics and electroweak precision measurements. We show that the SM4G is still viable and that a fourth generation has a strong impact on decay and production processes of the Higgs boson. Furthermore, we study the charged Higgs boson in the MSSM, yielding a clear signal for physics beyond the Standard Model. For small tan β in minimal flavor violation (MFV), no processes for the detection of a charged Higgs boson exist at the LHC. However, MFV is motivated only by the experimental agreement of results from flavor physics with Standard Model predictions, not by any basic theoretical consideration. In this thesis, we calculate charged Higgs boson production cross sections beyond the assumption of MFV, where a large number of free parameters is present in the MSSM. We find that the soft-breaking parameters which enhance the charged-Higgs boson production most are just bound to large values, e.g. by rare B-meson decays. Although the charged-Higgs boson cross sections beyond MFV turn out to be sizeable, only a detailed

  10. TDA Assessment of Recommendations for Space Data System Standards

    Science.gov (United States)

    Posner, E. C.; Stevens, R.

    1984-01-01

    NASA is participating in the development of international standards for space data systems. Recommendations for standards thus far developed are assessed. The proposed standards for telemetry coding and packet telemetry provide worthwhile benefit to the DSN; their cost impact to the DSN should be small. Because of their advantage to the NASA space exploration program, their adoption should be supported by TDA, JPL, and OSTDS.

  11. The standard model and beyond

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1989-05-01

    In these lectures, my aim is to present a status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows. I survey the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also commented on. In addition, I have included an appendix on dimensional regularization and a simple example which employs that technique. I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, extra Z' bosons, and compositeness are discussed. An overview of the physics of tau decays is also included. I discuss weak neutral current phenomenology and the extraction of sin²θ_W from experiment. The results presented there are based on a global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, implications for grand unified theories (GUTs), extra Z' gauge bosons, and atomic parity violation. The potential for further experimental progress is also commented on. Finally, I depart from the narrowest version of the standard model and discuss effects of neutrino masses, mixings, and electromagnetic moments. 32 refs., 3 figs., 5 tabs

  12. Phenomenology beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Lykken, Joseph D.; /Fermilab

    2005-03-01

    An elementary review of models and phenomenology for physics beyond the Standard Model (excluding supersymmetry). The emphasis is on LHC physics. Based upon a talk given at the ''Physics at LHC'' conference, Vienna, 13-17 July 2004.

  13. Standard Model festival

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    The 'Standard Model' of modern particle physics, with the quantum chromodynamics (QCD) theory of inter-quark forces superimposed on the unified electroweak picture, is still unchallenged, but it is not the end of physics. This was the message at the big International Symposium on Lepton and Photon Interactions at High Energies, held in Hamburg from 27-31 July

  14. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    Science.gov (United States)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can adequately treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and the best uncertainty estimates for low, medium and entire flows compared to the standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
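The calibration loop described in this abstract (random-walk Metropolis-Hastings under an AR(1)-plus-Normal error model, scored by the Nash-Sutcliffe efficiency) can be sketched in a toy form. This is an illustration, not the paper's setup: the synthetic forcing series, the one-parameter `simulate` stand-in for WASMOD, and the fixed `rho`/`sigma` values are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "hydrological" model: a scaled, smoothed forcing series
# (an assumed stand-in for a real rainfall-runoff model like WASMOD).
forcing = rng.gamma(2.0, 2.0, size=200)

def simulate(theta):
    return theta * np.convolve(forcing, [0.5, 0.3, 0.2], mode="same")

true_theta = 1.5
obs = simulate(true_theta) + rng.normal(0.0, 0.5, size=200)

def log_likelihood(theta, rho=0.3, sigma=0.5):
    """AR(1)-plus-Normal error model: e_t = rho*e_{t-1} + N(0, sigma^2).
    Conditional likelihood; the first-observation term is dropped."""
    resid = obs - simulate(theta)
    innov = resid[1:] - rho * resid[:-1]
    return -0.5 * np.sum(innov**2) / sigma**2 - len(innov) * np.log(sigma)

# Random-walk Metropolis-Hastings over theta (flat prior implied).
samples, theta = [], 1.0
ll = log_likelihood(theta)
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.05)
    ll_prop = log_likelihood(prop)
    if np.log(rng.uniform()) < ll_prop - ll:  # accept/reject step
        theta, ll = prop, ll_prop
    samples.append(theta)

posterior = np.array(samples[1000:])  # discard burn-in
sim = simulate(posterior.mean())
# Nash-Sutcliffe efficiency of the posterior-mean simulation.
nse = 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)
print(round(float(posterior.mean()), 2), round(float(nse), 2))
```

The paper's Models 2 and 3 would replace this fixed-`sigma` likelihood with a time-dependent or multi-normal one; the acceptance rule itself is unchanged, and the modularization variant would additionally down-weight the highest flows when inferring `theta`.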

  15. About the standard solar model

    International Nuclear Information System (INIS)

    Cahen, S.

    1986-07-01

    A discussion of the still controversial solar helium content is presented, based on a comparison of recent standard solar models. Our latest model yields a helium mass fraction of ∼0.276, 6.4 SNU for ^37Cl and 126 SNU for ^71Ga

  16. Establishing the isolated Standard Model

    International Nuclear Information System (INIS)

    Wells, James D.; Zhang, Zhengkang; Zhao, Yue

    2017-02-01

    The goal of this article is to initiate a discussion on what it takes to claim ''there is no new physics at the weak scale,'' namely that the Standard Model (SM) is ''isolated.'' The lack of discovery of beyond the SM (BSM) physics suggests that this may be the case. But to truly establish this statement requires proving all ''connected'' BSM theories are false, which presents a significant challenge. We propose a general approach to quantitatively assess the current status and future prospects of establishing the isolated SM (ISM), which we give a reasonable definition of. We consider broad elements of BSM theories, and show many examples where current experimental results are not sufficient to verify the ISM. In some cases, there is a clear roadmap for the future experimental program, which we outline, while in other cases, further efforts - both theoretical and experimental - are needed in order to robustly claim the establishment of the ISM in the absence of new physics discoveries.

  17. A test-bed modeling study for wave resource assessment

    Science.gov (United States)

    Yang, Z.; Neary, V. S.; Wang, T.; Gunawan, B.; Dallman, A.

    2016-02-01

    Hindcasts from phase-averaged wave models are commonly used to estimate the standard statistics used in wave energy resource assessments. However, the research community and the wave energy converter (WEC) industry lack a well-documented and consistent modeling approach for conducting these resource assessments at different phases of WEC project development and at different spatial scales, e.g., from a small-scale pilot study to a large-scale commercial deployment. It is therefore necessary to evaluate current wave model codes, as well as their limitations and knowledge gaps for predicting sea states, in order to establish best wave modeling practices and to identify future research needs that would improve wave prediction for resource assessment. This paper presents the first phase of an on-going modeling study to address these concerns. The modeling study is being conducted at a test-bed site off the Central Oregon Coast using two of the most widely used third-generation wave models - WaveWatchIII and SWAN. A nested-grid modeling approach, with domain dimensions ranging from global to regional scales, was used to provide the wave spectral boundary condition to a local-scale model domain of roughly 60 km by 60 km with a grid resolution of 250-300 m. Model results simulated by WaveWatchIII and SWAN in a structured-grid framework are compared to NOAA wave buoy data for six wave parameters: omnidirectional wave power, significant wave height, energy period, spectral width, direction of maximum directionally resolved wave power, and directionality coefficient. Model performance and computational efficiency are evaluated, and the best practices for wave resource assessments are discussed, based on a set of standard error statistics and model run times.
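The "standard error statistics" used to score a hindcast against buoy records are conventional and easy to sketch: bias, root-mean-square error, scatter index, and linear correlation. The synthetic buoy and model series below are illustrative assumptions, not the study's data.

```python
import numpy as np

# Synthetic stand-ins for observed vs. modeled significant wave height (m);
# the 5% bias and 0.15 m noise are assumed values for illustration only.
rng = np.random.default_rng(0)
buoy = rng.gamma(4.0, 0.5, size=500)             # "observed" Hs at a buoy
model = buoy * 1.05 + rng.normal(0, 0.15, 500)   # biased, noisy "hindcast"

def error_stats(obs, sim):
    """Standard point-wise error statistics for model validation."""
    bias = float(np.mean(sim - obs))                 # mean error
    rmse = float(np.sqrt(np.mean((sim - obs) ** 2)))  # root-mean-square error
    si = rmse / float(np.mean(obs))                  # scatter index (normalized RMSE)
    r = float(np.corrcoef(obs, sim)[0, 1])           # linear correlation
    return {"bias": bias, "rmse": rmse, "scatter_index": si, "corr": r}

stats = error_stats(buoy, model)
print({k: round(v, 3) for k, v in stats.items()})
```

The same function applies unchanged to any of the six parameters listed in the abstract, computed from collocated model and buoy time series.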

  18. Almost-commutative geometries beyond the standard model

    International Nuclear Information System (INIS)

    Stephan, Christoph A

    2006-01-01

    In Iochum et al (2004 J. Math. Phys. 45 5003), Jureit and Stephan (2005 J. Math. Phys. 46 043512), Schuecker T (2005 Preprint hep-th/0501181) and Jureit et al (2005 J. Math. Phys. 46 072303), a conjecture is presented that almost-commutative geometries, with respect to sensible physical constraints, allow only the standard model of particle physics and electro-strong models as Yang-Mills-Higgs theories. In this paper, a counter-example will be given. The corresponding almost-commutative geometry leads to a Yang-Mills-Higgs model which consists of the standard model of particle physics and two new fermions of opposite electro-magnetic charge. This is the second Yang-Mills-Higgs model within noncommutative geometry, after the standard model, which could be compatible with experiments. Combined to a hydrogen-like composite particle, these new particles provide a novel dark matter candidate

  19. Gauge coupling unification in superstring derived standard-like models

    International Nuclear Information System (INIS)

    Faraggi, A.E.

    1992-11-01

    I discuss gauge coupling unification in a class of superstring standard-like models, which are derived in the free fermionic formulation. Recent calculations indicate that the superstring unification scale is at O(10^18 GeV) while the minimal supersymmetric standard model is consistent with LEP data if the unification scale is at O(10^16 GeV). A generic feature of the superstring standard-like models is the appearance of extra color triplets (D,D), and electroweak doublets (l,l), in vector-like representations, beyond the supersymmetric standard model. I show that the gauge coupling unification at O(10^18 GeV) in the superstring standard-like models can be consistent with LEP data. I present an explicit standard-like model that can realize superstring gauge coupling unification. (author)

  20. The standard model

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1994-03-01

    In these lectures, my aim is to provide a survey of the standard model with emphasis on its renormalizability and electroweak radiative corrections. Since this is a school, I will try to be somewhat pedagogical by providing examples of loop calculations. In that way, I hope to illustrate some of the commonly employed tools of particle physics. With those goals in mind, I have organized my presentations as follows: In Section 2, renormalization is discussed from an applied perspective. The technique of dimensional regularization is described and used to define running couplings and masses. The utility of the renormalization group for computing leading logs is illustrated for the muon anomalous magnetic moment. In Section 3 electroweak radiative corrections are discussed. Standard model predictions are surveyed and used to constrain the top quark mass. The S, T, and U parameters are introduced and employed to probe for ''new physics''. The effect of Z' bosons on low energy phenomenology is described. In Section 4, a detailed illustration of electroweak radiative corrections is given for atomic parity violation. Finally, in Section 5, I conclude with an outlook for the future

  1. Constructing Assessment Model of Primary and Secondary Educational Quality with Talent Quality as the Core Standard

    Science.gov (United States)

    Chen, Benyou

    2014-01-01

    Quality is the core of education, and it is important to the standardized construction of primary and secondary education in urban (U) and rural (R) areas. The ultimate goal of the integration of urban and rural education is to pursue quality urban and rural education. Based on analysing the related policy basis and the existing assessment models…

  2. Physical Activity Stories: Assessing the "Meaning Standard" in Physical Education

    Science.gov (United States)

    Johnson, Tyler G.

    2016-01-01

    The presence of the "meaning standard" in both national and state content standards suggests that professionals consider it an important outcome of a quality physical education program. However, only 10 percent of states require an assessment to examine whether students achieve this standard. The purpose of this article is to introduce…

  3. Big bang nucleosynthesis - The standard model and alternatives

    Science.gov (United States)

    Schramm, David N.

    1991-01-01

    The standard homogeneous-isotropic calculation of the big bang cosmological model is reviewed, and alternate models are discussed. The standard model is shown to agree with the light element abundances for He-4, H-2, He-3, and Li-7 that are available. Improved observational data from recent LEP collider and SLC results are discussed. The data agree with the standard model in terms of the number of neutrinos, and provide improved information regarding neutron lifetimes. Alternate models are reviewed which describe different scenarios for decaying matter or quark-hadron induced inhomogeneities. The baryonic density relative to the critical density in the alternate models is similar to that of the standard model when they are made to fit the abundances. This reinforces the conclusion that the baryonic density relative to critical density is about 0.06, and also reinforces the need for both nonbaryonic dark matter and dark baryonic matter.

  4. Teacher Assessment Literacy: A Review of International Standards and Measures

    Science.gov (United States)

    DeLuca, Christopher; LaPointe-McEwan, Danielle; Luhanga, Ulemu

    2016-01-01

    Assessment literacy is a core professional requirement across educational systems. Hence, measuring and supporting teachers' assessment literacy have been a primary focus over the past two decades. At present, there are a multitude of assessment standards across the world and numerous assessment literacy measures that represent different…

  5. Assessing model-based reasoning using evidence-centered design a suite of research-based design patterns

    CERN Document Server

    Mislevy, Robert J; Riconscente, Michelle; Wise Rutstein, Daisy; Ziker, Cindy

    2017-01-01

    This Springer Brief provides theory, practical guidance, and support tools to help designers create complex, valid assessment tasks for hard-to-measure, yet crucial, science education standards. Understanding, exploring, and interacting with the world through models characterizes science in all its branches and at all levels of education. Model-based reasoning is central to science education and thus science assessment. Current interest in developing and using models has increased with the release of the Next Generation Science Standards, which identified this as one of the eight practices of science and engineering. However, the interactive, complex, and often technology-based tasks that are needed to assess model-based reasoning in its fullest forms are difficult to develop. Building on research in assessment, science education, and learning science, this Brief describes a suite of design patterns that can help assessment designers, researchers, and teachers create tasks for assessing aspects of model-based...

  6. A model for assessing information technology effectiveness in the business environment

    Directory of Open Access Journals (Sweden)

    Sandra Cristina Riascos Erazo

    2008-05-01

    Full Text Available The impact of technology on administrative processes has improved business strategies (especially regarding the effect of information technology - IT), often leading to organisational success. Its effectiveness in this environment was thus modelled due to such importance; this paper describes studying a series of models aimed at assessing IT, its advantages and disadvantages. A model is proposed involving different aspects for an integral assessment of IT effectiveness and considering administrative activities’ particular characteristics. This analytical study provides guidelines for identifying IT effectiveness in a business environment and current key strategies in technological innovation. This study was based on ISO 9126, ISO 9001, ISO 15939 and ISO 25000 standards as well as COBIT and CMM standards.

  7. Transformative Shifts in Art History Teaching: The Impact of Standards-Based Assessment

    Science.gov (United States)

    Ormond, Barbara

    2011-01-01

    This article examines pedagogical shifts in art history teaching that have developed as a response to the implementation of a standards-based assessment regime. The specific characteristics of art history standards-based assessment in the context of New Zealand secondary schools are explained to demonstrate how an exacting form of assessment has…

  8. Standard Model physics

    CERN Multimedia

    Altarelli, Guido

    1999-01-01

    Introduction. Structure of gauge theories. The QED and QCD examples. Chiral theories. The electroweak theory. Spontaneous symmetry breaking. The Higgs mechanism. Gauge boson and fermion masses. Yukawa couplings. Charged current couplings. The Cabibbo-Kobayashi-Maskawa matrix and CP violation. Neutral current couplings. The Glashow-Iliopoulos-Maiani mechanism. Gauge boson and Higgs couplings. Radiative corrections and loops. Cancellation of the chiral anomaly. Limits on the Higgs mass. Problems of the Standard Model. Outlook.

  9. Assessing changes in drought characteristics with standardized indices

    Science.gov (United States)

    Vidal, Jean-Philippe; Najac, Julien; Martin, Eric; Franchistéguy, Laurent; Soubeyroux, Jean-Michel

    2010-05-01

    Standardized drought indices like the Standardized Precipitation Index (SPI) are increasingly adopted for drought reconstruction, monitoring and forecasting, and the SPI has recently been recommended by the World Meteorological Organization to characterize meteorological droughts. Such indices are based on the statistical distribution of a hydrometeorological variable (e.g., precipitation) in a given reference climate, and a drought event is defined as a period with continuously negative index values. Because of the way these indices are constructed, some issues may arise when using them in a non-stationary climate. This work thus aims at highlighting such issues and demonstrating the different ways these indices may - or may not - be applied and interpreted in the context of an anthropogenic climate change. Three major points are detailed through examples taken from both a high-resolution gridded reanalysis dataset over France and transient projections from the ARPEGE general circulation model downscaled over France. The first point deals with the choice of the reference climate, and more specifically its type (from observations/reanalysis or from present-day modelled climate) and its record period. Second, the interpretation of actual changes is closely linked with the type of drought feature selected over a future period: mean index value, under-threshold frequency, or drought event characteristics (number, mean duration and magnitude, seasonality, etc.). Finally, applicable approaches as well as related uncertainties depend on the availability of data from a future climate, whether in the form of a fully transient time series from the present day or only a future time slice. The projected evolution of drought characteristics under climate change must inform present decisions on long-term water resources planning. An assessment of changes in drought characteristics should therefore provide water managers with appropriate information that can help
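The index construction this abstract relies on (fit a distribution to a reference climate, map each value through its CDF to a standard-normal quantile, and call any run of continuously negative values a drought event) can be sketched as follows. The gamma distribution, the synthetic monthly series, and the whole-record reference period are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic monthly precipitation totals (mm); the whole record is
# used as the reference climate in this sketch.
precip = rng.gamma(shape=2.0, scale=40.0, size=600)

# Fit a gamma distribution to the reference period, then map each value
# through its CDF to a standard-normal quantile: an SPI-like index.
shape, loc, scale = stats.gamma.fit(precip, floc=0)
spi = stats.norm.ppf(stats.gamma.cdf(precip, shape, loc=loc, scale=scale))

# A drought event is a maximal run of negative index values.
dry = spi < 0
padded = np.r_[False, dry, False]
starts = np.flatnonzero(padded[1:] & ~padded[:-1])   # run start indices
ends = np.flatnonzero(~padded[1:] & padded[:-1])     # run end indices (exclusive)
durations = ends - starts                            # event durations (months)
print(len(durations), int(durations.max()))
```

Changing the reference period here (e.g., fitting the gamma parameters on a present-day slice but applying them to a future slice) is exactly the choice the first of the three points above concerns, and it changes every downstream event statistic.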

  10. Assessing the Genetics Content in the Next Generation Science Standards.

    Science.gov (United States)

    Lontok, Katherine S; Zhang, Hubert; Dougherty, Michael J

    2015-01-01

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  11. Assessing the Genetics Content in the Next Generation Science Standards.

    Directory of Open Access Journals (Sweden)

    Katherine S Lontok

    Full Text Available Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  12. Modeling in the Common Core State Standards

    Science.gov (United States)

    Tam, Kai Chung

    2011-01-01

    The inclusion of modeling and applications into the mathematics curriculum has proven to be a challenging task over the last fifty years. The Common Core State Standards (CCSS) has made mathematical modeling both one of its Standards for Mathematical Practice and one of its Conceptual Categories. This article discusses the need for mathematical…

  13. Peer Review of Assessment Network: Supporting Comparability of Standards

    Science.gov (United States)

    Booth, Sara; Beckett, Jeff; Saunders, Cassandra

    2016-01-01

    Purpose: This paper aims to test the need in the Australian higher education (HE) sector for a national network for the peer review of assessment in response to the proposed HE standards framework and propose a sector-wide framework for calibrating and assuring achievement standards, both within and across disciplines, through the establishment of…

  14. Beyond the Standard Model

    International Nuclear Information System (INIS)

    Ross, G.G.

    1995-01-01

    The attempts to develop models beyond the Standard Model are briefly reviewed paying particular regard to the mechanisms responsible for symmetry breaking and mass generation. A comparison is made of the theoretical expectations with recent precision measurements for theories with composite Higgs and for supersymmetric theories with elementary Higgs boson(s). The implications of a heavy top quark and the origin of the light quark and lepton masses and mixing angles are considered within these frameworks. ((orig.))

  15. Searches for non-Standard Model Higgs bosons

    CERN Document Server

    Dumitriu, Ana Elena; The ATLAS collaboration

    2018-01-01

    This presentation focuses on searches for non-Standard Model Higgs bosons using 36.1 fb⁻¹ of data collected by the ATLAS experiment. Several theoretical models with an extended Higgs sector are considered: two-Higgs-doublet models (2HDM); supersymmetry (SUSY), which brings along super-partners of the SM particles, including the Minimal Supersymmetric Standard Model (MSSM), whose Higgs sector is equivalent to that of a constrained 2HDM of type II, and the next-to-MSSM (NMSSM); as well as general searches and searches for an invisibly decaying Higgs boson.

  16. Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments

    Science.gov (United States)

    Manning, Ted A.; Lawrence, Scott L.

    2014-01-01

    As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.

  17. 42 CFR 493.1299 - Standard: Postanalytic systems quality assessment.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Postanalytic systems quality assessment. 493.1299 Section 493.1299 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH....1291. (b) The postanalytic systems quality assessment must include a review of the effectiveness of...

  18. 42 CFR 493.1249 - Standard: Preanalytic systems quality assessment.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Preanalytic systems quality assessment. 493.1249 Section 493.1249 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH....1241 through 493.1242. (b) The preanalytic systems quality assessment must include a review of the...

  19. Big Bang nucleosynthesis: The standard model

    International Nuclear Information System (INIS)

    Steigman, G.

    1989-01-01

    Current observational data on the abundances of deuterium, helium-3, helium-4 and lithium-7 are reviewed and these data are used to infer (or to bound) the primordial abundances of these elements. The physics of primordial nucleosynthesis in the context of the ''standard'' (isotropic, homogeneous,...) hot big bang model is outlined and the primordial abundances predicted within the context of this model are presented. The theoretical predictions are then confronted with the observational data. This confrontation reveals the remarkable consistency of the standard model, constrains the nucleon abundance to lie within a narrow range and permits the existence of no more than one additional flavor of light neutrinos

  20. Extensions of the standard model

    International Nuclear Information System (INIS)

    Ramond, P.

    1983-01-01

    In these lectures we focus on several issues that arise in theoretical extensions of the standard model. First we describe the kinds of fermions that can be added to the standard model without affecting known phenomenology. We focus in particular on three types: the vector-like completion of the existing fermions as would be predicted by a Kaluza-Klein type theory, which we find cannot be realistically achieved without some chiral symmetry; fermions which are vector-like by themselves, such as do appear in supersymmetric extensions, and finally anomaly-free chiral sets of fermions. We note that a chiral symmetry, such as the Peccei-Quinn symmetry can be used to produce a vector-like theory which, at scales less than M_W, appears to be chiral. Next, we turn to the analysis of the second hierarchy problem which arises in Grand Unified extensions of the standard model, and plays a crucial role in proton decay of supersymmetric extensions. We review the known mechanisms for avoiding this problem and present a new one which seems to lead to the (family) triplication of the gauge group. Finally, this being a summer school, we present a list of homework problems. 44 references

  1. Risk Management Model from the Perspective of the Implementing ISO 9001:2015 Standard Within Financial Services Companies

    Directory of Open Access Journals (Sweden)

    Cătălina Sitnikov

    2017-11-01

    Full Text Available In its new form, the ISO 9001:2015 standard activates and utilizes a thought pattern based on risk assessment, functioning in parallel with the implementation of the quality management system. We therefore strive to identify the risks and opportunities associated with the processes and products needed to create and implement a quality management system based on the ISO 9001:2015 standard. This standard is defined by a strong client-based orientation, motivation and managerial involvement from the higher levels, a process-based approach, and a commitment to continual improvement. By implementing the requirements of the new version of the standard, the organisation needs to determine all the processes necessary to the quality management system and to identify those which include activities dealing with risks and opportunities. Considering the importance and impact of these requirements, and starting from theoretical concepts and a set of research vectors, a model of financial risk assessment has been devised. The model is based on the correlations that can be established among the components of the new standard structure (Annex SL), elements of an approach derived from process risk patterns, and the risk types assessed from the perspective of financial services companies.

  2. Upgrade of internal events PSA model using the AESJ level-1 PSA standard for operating state

    International Nuclear Information System (INIS)

    Sato, Teruyoshi; Yoneyama, Mitsuru; Hirokawa, Naoki; Sato, Chikahiro; Sato, Eisuke; Tomizawa, Shigeatsu

    2009-01-01

    In 2003, the Atomic Energy Society of Japan (AESJ) started to develop the Level-1 Probabilistic Safety Assessment (PSA) standard of internal events for the operating state (AESJ standard). The AESJ standard has been completed and released for public comment. Using the AESJ standard (draft version), the authors have upgraded the PSA model for a Tokyo Electric Power Company (TEPCO) BWR-5 plant, not only to reflect the latest knowledge but also to ensure the high quality of the PSA model (not yet peer-reviewed), for the purpose of better operation and maintenance management of TEPCO BWR plants. For example, the categorization of structures, systems and components (SSCs) will be performed to improve nuclear reactor safety using risk importance information. (author)

  3. 42 CFR 493.1289 - Standard: Analytic systems quality assessment.

    Science.gov (United States)

    2010-10-01

    ... Standard: Analytic systems quality assessment. 493.1289 Section 493.1289 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... through 493.1283. (b) The analytic systems quality assessment must include a review of the effectiveness...

  4. AUTHENTIC SELF-ASSESSMENT MODEL FOR THE DEVELOPMENT OF EMPLOYABILITY SKILLS OF HIGHER VOCATIONAL EDUCATION STUDENTS

    Directory of Open Access Journals (Sweden)

    I Made Suarta

    2015-06-01

    AUTHENTIC SELF-ASSESSMENT MODEL FOR DEVELOPING EMPLOYABILITY SKILLS OF STUDENTS IN HIGHER VOCATIONAL EDUCATION Abstract The purpose of this research is to develop assessment tools to evaluate the achievement of employability skills which are integrated in the learning of database applications. The assessment model developed is a combination of self-assessment and authentic assessment, proposed as an authentic self-assessment model. The steps of developing the authentic self-assessment model include: identifying the standards, selecting an authentic task, identifying the criteria for the task, and creating the rubric. The resulting assessment tools include: (1) a problem-solving skills assessment model, (2) a self-management skills assessment model, and (3) a database applications competence assessment model. This model can be used to assess cognitive, affective, and psychomotor achievement. The results indicate that achievement of problem-solving and self-management skills was in the good category, and competence in designing conceptual and logical databases was in the high category. The model also meets the basic principles of assessment, i.e. validity, reliability, focus on competencies, comprehensiveness, objectivity, and the principle of educating. Keywords: authentic assessment, self-assessment, problem solving skills, self-management skills, vocational education
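The rubric step described above can be sketched as a simple weighted scorer; the criteria, weights, and performance levels below are invented for illustration and are not taken from the study's instruments.

```python
# A minimal sketch of rubric-based scoring: each criterion has a weight and a
# maximum performance level; a student's levels are normalized and weighted.
# All names and numbers here are hypothetical examples.
rubric = {                      # criterion: (weight, max level)
    "problem solving":  (0.4, 4),
    "self-management":  (0.3, 4),
    "database design":  (0.3, 4),
}
scores = {"problem solving": 3, "self-management": 4, "database design": 3}

total = sum(w * scores[c] / top for c, (w, top) in rubric.items())
print(f"weighted score: {total:.2f} of 1.00")
```

Normalizing each level by its maximum before weighting keeps the total on a 0-1 scale regardless of how many levels a given rubric uses.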

  5. Assessing the Effects of Corporate Social Responsibility Standards in Global Value Chains

    DEFF Research Database (Denmark)

    Lund-Thomsen, Peter

    This paper considers the issue of corporate social responsibility (CSR) standard impact assessment in global value chains. CSR standards have proliferated in recent years, and several studies have attempted to assess their effects on local producers, workers, and the environment in developing countries. However, much less attention has been paid to the "dark side" of impact assessment - the ethical and political dilemmas that arise in the process of carrying out impact studies. This paper addresses this gap in the literature, arguing that impact assessments of CSR standards may do more harm than good to the intended beneficiaries - developing country firms, farmers, workers, and communities - unless these ethical and political dilemmas are given serious consideration.

  6. Environmental assessment. Energy efficiency standards for consumer products

    Energy Technology Data Exchange (ETDEWEB)

    McSwain, Berah

    1980-06-01

    The Energy Policy and Conservation Act of 1975 requires DOE to prescribe energy efficiency standards for 13 consumer products. The Consumer Products Efficiency Standards (CPES) program covers: refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners (cooling and heat pumps), furnaces, dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers. This Environmental Assessment evaluates the potential environmental and socioeconomic impacts expected as a result of setting efficiency standards for all of the consumer products covered by the CPES program. DOE has proposed standards for eight of the products covered by the Program in a Notice of Proposed Rulemaking (NOPR). DOE expects to propose standards for home heating equipment, central air conditioners (heat pumps only), dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers in 1981. No significant adverse environmental or socioeconomic impacts have been found to result from instituting the CPES.

  7. Guide for developing conceptual models for ecological risk assessments

    International Nuclear Information System (INIS)

    Suter, G.W., II.

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, media, exposure routes, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs
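The flow-chart structure described above can be represented as a small directed graph from sources through media and routes to receptors; the node names below are hypothetical examples, not taken from the guide.

```python
# A toy conceptual site model as a directed graph, assuming invented
# sources, media, exposure routes, and endpoint receptors.
# Edges: contaminant source -> transport medium -> exposure route -> receptor
edges = {
    "waste pond":     ["surface water"],
    "buried drums":   ["groundwater"],
    "surface water":  ["ingestion", "direct contact"],
    "groundwater":    ["ingestion"],
    "ingestion":      ["fish", "herons"],
    "direct contact": ["benthic invertebrates"],
}

def pathways(node, graph):
    """Enumerate all complete source-to-receptor exposure pathways."""
    nexts = graph.get(node)
    if not nexts:                      # terminal node = endpoint receptor
        return [[node]]
    return [[node] + rest for n in nexts for rest in pathways(n, graph)]

all_paths = [p for src in ("waste pond", "buried drums") for p in pathways(src, edges)]
for p in all_paths:
    print(" -> ".join(p))
```

Enumerating the paths of such a graph is one way to check that every source is connected to at least one endpoint receptor before the risk analysis proceeds.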

  8. Standard setting and quality of assessment: A conceptual approach ...

    African Journals Online (AJOL)

    Quality performance standards and the effect of assessment outcomes are important in the educational milieu, as assessment remains the representative ... not be seen as a methodological process of setting pass/fail cut-off points only, but as a powerful catalyst for quality improvements in HPE by promoting excellence in ...

  9. Primordial nucleosynthesis: Beyond the standard model

    International Nuclear Information System (INIS)

    Malaney, R.A.

    1991-01-01

    Non-standard primordial nucleosynthesis merits continued study for several reasons. First and foremost are the important implications determined from primordial nucleosynthesis regarding the composition of the matter in the universe. Second, the production and the subsequent observation of the primordial isotopes is the most direct experimental link with the early (t ≲ 1 s) universe. Third, studies of primordial nucleosynthesis allow for important, and otherwise unattainable, constraints on many aspects of particle physics. Finally, there is tentative evidence which suggests that the Standard Big Bang (SBB) model is incorrect in that it cannot reproduce the inferred primordial abundances for a single value of the baryon-to-photon ratio. Reviewed here are some aspects of non-standard primordial nucleosynthesis which mostly overlap with the author's own interests. He begins with a short discussion of the SBB nucleosynthesis theory, highlighting some recent related developments. Next he discusses how recent observations of helium and lithium abundances may indicate looming problems for the SBB model. He then discusses how the QCD phase transition, neutrinos, and cosmic strings can influence primordial nucleosynthesis. He concludes with a short discussion of the multitude of other non-standard nucleosynthesis models found in the literature, and makes some comments on possible progress in the future. 58 refs., 7 figs., 2 tabs

  10. A solar neutrino loophole: standard solar models

    Energy Technology Data Exchange (ETDEWEB)

    Rouse, C A [General Atomic Co., San Diego, Calif. (USA)

    1975-11-01

    The salient aspects of the existence theorem for a unique solution to a system of linear or nonlinear first-order ordinary differential equations are given and applied to the equilibrium stellar structure equations. It is shown that values of pressure, temperature, mass and luminosity are needed at one point - and for the sun, the logical point is the solar radius. It is concluded that since standard solar model calculations use split boundary conditions, a solar neutrino loophole still remains: solar model calculations that seek to satisfy the necessary condition for a unique solution to the solar structure equations suggest a solar interior quite different from that deduced in standard models. This, in turn, suggests a theory of formation and solar evolution significantly different from the standard theory.

  11. Establishing the isolated Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Wells, James D.; Zhang, Zhengkang [Michigan Univ., Ann Arbor, MI (United States). Michigan Center for Theoretical Physics; Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Zhao, Yue [Michigan Univ., Ann Arbor, MI (United States). Michigan Center for Theoretical Physics

    2017-02-15

    The goal of this article is to initiate a discussion on what it takes to claim "there is no new physics at the weak scale," namely that the Standard Model (SM) is "isolated." The lack of discovery of beyond the SM (BSM) physics suggests that this may be the case. But to truly establish this statement requires proving all "connected" BSM theories are false, which presents a significant challenge. We propose a general approach to quantitatively assess the current status and future prospects of establishing the isolated SM (ISM), which we give a reasonable definition of. We consider broad elements of BSM theories, and show many examples where current experimental results are not sufficient to verify the ISM. In some cases, there is a clear roadmap for the future experimental program, which we outline, while in other cases, further efforts - both theoretical and experimental - are needed in order to robustly claim the establishment of the ISM in the absence of new physics discoveries.

  12. Beyond the Standard Model

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future. Supersymmetry, grand unification, extra dimensions and string theory will be presented.

  13. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    Science.gov (United States)

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for the individual assessment of the hepatic insulin degradation (HID) process from standard frequently sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and the containment of computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was warranted by intraclass correlation coefficients ≥0.73, no significant differences between corresponding mean parameter estimates and predictions of HID rate, and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of CV% for the parameter worst-estimated by SAAM II and in the containment of all model-parameter CV%. The MATLAB-based procedure was suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
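The alternating Gauss-Newton / Levenberg-Marquardt idea can be sketched on a toy mono-exponential model y = A·exp(-k·t); the actual HID model and the paper's MATLAB code are not reproduced here, so the model, data, and parameter values below are all illustrative assumptions.

```python
# A hedged sketch of damped Gauss-Newton (Levenberg-Marquardt) fitting for a
# two-parameter exponential decay. On success the damping shrinks (toward pure
# Gauss-Newton); on failure it grows (toward a safe gradient-like step).
import math

def residuals(params, ts, ys):
    A, k = params
    return [A * math.exp(-k * t) - y for t, y in zip(ts, ys)]

def jacobian(params, ts):
    A, k = params
    # dr/dA = exp(-kt);  dr/dk = -A*t*exp(-kt)
    return [(math.exp(-k * t), -A * t * math.exp(-k * t)) for t in ts]

def lm_step(params, ts, ys, lam):
    """One damped normal-equations step: solve (J^T J + lam*I) d = -J^T r."""
    r = residuals(params, ts, ys)
    J = jacobian(params, ts)
    a11 = sum(j[0] * j[0] for j in J) + lam
    a12 = sum(j[0] * j[1] for j in J)
    a22 = sum(j[1] * j[1] for j in J) + lam
    b1 = -sum(j[0] * ri for j, ri in zip(J, r))
    b2 = -sum(j[1] * ri for j, ri in zip(J, r))
    det = a11 * a22 - a12 * a12
    return (params[0] + (b1 * a22 - b2 * a12) / det,
            params[1] + (a11 * b2 - a12 * b1) / det)

def fit(ts, ys, params=(1.0, 0.1), lam=1e-3, iters=50):
    sse = sum(ri * ri for ri in residuals(params, ts, ys))
    for _ in range(iters):
        trial = lm_step(params, ts, ys, lam)
        trial_sse = sum(ri * ri for ri in residuals(trial, ts, ys))
        if trial_sse < sse:          # success: move toward pure Gauss-Newton
            params, sse, lam = trial, trial_sse, lam / 10
        else:                        # failure: increase damping and retry
            lam *= 10
    return params

ts = [0, 2, 4, 8, 16, 32]
ys = [5.0 * math.exp(-0.2 * t) for t in ts]      # noise-free synthetic data
A, k = fit(ts, ys)
print(round(A, 3), round(k, 3))    # recovers A ≈ 5.0, k ≈ 0.2
```

The accept/reject rule makes the sum of squared errors decrease monotonically, which is the sense in which such alternating schemes "assure full convergence" on well-behaved problems.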

  14. Development of an accident consequence assessment code for evaluating site suitability of light- and heavy-water reactors based on the Korean Technical standards

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Won Tae; Jeong, Hae Sung; Jeong, Hyo Joon; Kil, A Reum; Kim, Eun Han; Han, Moon Hee [Nuclear Environment Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-12-15

    Methodologies for radiological consequence assessment differ distinctly according to the design principles of the original nuclear suppliers and the technical standards to be imposed. This is due to the uncertainties in the accidental source term, radionuclide behavior in the environment, and the subsequent radiological dose. Both PWR and PHWR reactor types are operated in Korea, but the technical standards for evaluating atmospheric dispersion have been enacted based on the U.S. NRC's positions regardless of reactor type. For this reason, controversy might arise between the licensor and licensee of a nuclear power plant. An integrated accident consequence assessment code, ACCESS (Accident Consequence Assessment Code for Evaluating Site Suitability), was developed by taking into account the unique regulatory positions for each reactor type under the framework of the current Korean technical standards. The dispersion model was developed under the framework of NRC Regulatory Guide 1.145 for light-water reactors, while reflecting the features of heavy-water reactors as specified in the Canadian National Standard and modelling features of MACCS2, such as the atmospheric diffusion coefficient, ground deposition, surface roughness, radioactive plume depletion, and exposure from ground deposition. Field tracer experiments and hand calculations have been carried out for validation and verification of the models. The modelling approaches of ACCESS and its features are introduced, and its results for a hypothetical accident scenario are comprehensively discussed. In the applicative study, the total doses predicted by the light-water reactor assessment model were higher than those predicted by the other models.
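As a rough illustration of the kind of atmospheric dispersion relation such codes build on, a ground-level Gaussian-plume relative concentration (χ/Q) can be sketched as below; the σy and σz values are illustrative placeholders, not the regulatory dispersion curves of RG 1.145 or the actual ACCESS models.

```python
# A minimal Gaussian-plume sketch of ground-level relative concentration
# chi/Q. The dispersion parameters here are invented constants; real codes
# derive sigma_y and sigma_z from stability class and downwind distance.
import math

def chi_over_Q(u, sigma_y, sigma_z, H=0.0, y=0.0):
    """Ground-level chi/Q [s/m^3] at crosswind offset y for release height H."""
    return (1.0 / (math.pi * sigma_y * sigma_z * u)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-H**2 / (2 * sigma_z**2)))

# Illustrative: low wind speed, ground-level release, plume centerline.
x = chi_over_Q(u=1.0, sigma_y=10.0, sigma_z=5.0)
print(f"{x:.2e} s/m^3")    # 6.37e-03 s/m^3 for these inputs
```

Multiplying χ/Q by the release rate Q of a nuclide gives its air concentration at the receptor, which is then the input to the dose pathways (inhalation, ground deposition, and so on).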

  15. Assessment of liquefaction-induced hazards using Bayesian networks based on standard penetration test data

    Science.gov (United States)

    Tang, Xiao-Wei; Bai, Xu; Hu, Ji-Lei; Qiu, Jiang-Nan

    2018-05-01

    Liquefaction-induced hazards such as sand boils, ground cracks, settlement, and lateral spreading are responsible for considerable damage to engineering structures during major earthquakes. Presently, there is no effective empirical approach that can assess different liquefaction-induced hazards in one model. This is because of the uncertainties and complexity of the factors related to seismic liquefaction and liquefaction-induced hazards. In this study, Bayesian networks (BNs) are used to integrate multiple factors related to seismic liquefaction, sand boils, ground cracks, settlement, and lateral spreading into a model based on standard penetration test data. The constructed BN model can assess four different liquefaction-induced hazards together. In a case study, the BN method outperforms an artificial neural network and Ishihara and Yoshimine's simplified method in terms of accuracy, Brier score, recall, precision, and area under the curve (AUC) of the receiver operating characteristic (ROC). This demonstrates that the BN method is a good alternative tool for the risk assessment of liquefaction-induced hazards. Furthermore, the performance of the BN model in estimating liquefaction-induced hazards in Japan's 2011 Tōhoku earthquake confirms its correctness and reliability compared with the liquefaction potential index approach. The proposed BN model can also predict whether the soil becomes liquefied after an earthquake and can deduce the chain reaction process of liquefaction-induced hazards and perform backward reasoning. The assessment results from the proposed model provide informative guidelines for decision-makers to detect the damage state of a field following liquefaction.
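The forward and backward reasoning described above can be illustrated with a toy Bayesian network for the chain "seismic load → liquefaction → sand boils"; the structure and all conditional probabilities below are invented, not the paper's SPT-based tables.

```python
# A toy Bayesian network with brute-force enumeration inference. Shaking and
# loose soil (low SPT blow count) drive liquefaction; liquefaction drives the
# sand-boil hazard. All probabilities are made up for illustration.
from itertools import product

P_shaking = {True: 0.3, False: 0.7}               # strong ground shaking?
P_loose   = {True: 0.4, False: 0.6}               # loose soil (low SPT N)?
P_liq = {(True, True): 0.8, (True, False): 0.25,  # P(liquefaction | shaking, loose)
         (False, True): 0.1, (False, False): 0.01}
P_boil = {True: 0.6, False: 0.05}                 # P(sand boil | liquefaction)

def joint(s, l, q, b):
    pq = P_liq[(s, l)] if q else 1 - P_liq[(s, l)]
    pb = P_boil[q] if b else 1 - P_boil[q]
    return P_shaking[s] * P_loose[l] * pq * pb

def posterior_liq_given_boil():
    """Backward reasoning: P(liquefaction | sand boils observed)."""
    num = sum(joint(s, l, True, True) for s, l in product([True, False], repeat=2))
    den = sum(joint(s, l, q, True) for s, l, q in product([True, False], repeat=3))
    return num / den

print(round(posterior_liq_given_boil(), 3))
```

The same joint distribution answers both directions: summing over causes given an observed hazard is the "backward reasoning" the abstract mentions, while conditioning on shaking and soil state predicts hazards forward.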

  16. Assessing Quality of Data Standards: Framework and Illustration Using XBRL GAAP Taxonomy

    Science.gov (United States)

    Zhu, Hongwei; Wu, Harris

    The primary purpose of data standards or metadata schemas is to improve the interoperability of data created by multiple standard users. Given the high cost of developing data standards, it is desirable to assess the quality of data standards. We develop a set of metrics and a framework for assessing data standard quality. The metrics include completeness and relevancy. Standard quality can also be indirectly measured by assessing interoperability of data instances. We evaluate the framework using data from the financial sector: the XBRL (eXtensible Business Reporting Language) GAAP (Generally Accepted Accounting Principles) taxonomy and US Securities and Exchange Commission (SEC) filings produced using the taxonomy by approximately 500 companies. The results show that the framework is useful and effective. Our analysis also reveals quality issues of the GAAP taxonomy and provides useful feedback to taxonomy users. The SEC has mandated that all publicly listed companies must submit their filings using XBRL. Our findings are timely and have practical implications that will ultimately help improve the quality of financial data.
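The completeness and relevancy idea can be sketched by comparing a standard's element set against the elements its users actually need; the element names below are invented, not drawn from the XBRL GAAP taxonomy.

```python
# A hedged sketch of data-standard quality metrics: completeness (how much of
# what filers need the standard covers) and relevancy (how much of the
# standard filers actually use). Names are hypothetical.
taxonomy = {"Assets", "Liabilities", "Revenues", "NetIncome", "ObscureItem"}
filings = [
    {"Assets", "Liabilities", "Revenues"},
    {"Assets", "NetIncome", "CustomExtension1"},   # extension: concept the standard lacks
    {"Revenues", "NetIncome", "CustomExtension2"},
]

used = set().union(*filings)
extensions = used - taxonomy     # needed but missing -> hurts completeness
unused = taxonomy - used         # defined but never used -> hurts relevancy

completeness = len(used & taxonomy) / len(used)
relevancy = len(used & taxonomy) / len(taxonomy)
print(f"completeness={completeness:.2f} relevancy={relevancy:.2f}")
```

In this toy example the two custom extensions lower completeness while the never-used `ObscureItem` lowers relevancy, mirroring how heavy use of filer extensions signals gaps in a taxonomy.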

  17. The standard model and beyond

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1989-05-01

    The field of elementary particle, or high energy, physics seeks to identify the most elementary constituents of nature and to study the forces that govern their interactions. Increasing the energy of a probe in a laboratory experiment increases its power as an effective microscope for discerning increasingly smaller structures of matter. Thus we have learned that matter is composed of molecules that are in turn composed of atoms, that the atom consists of a nucleus surrounded by a cloud of electrons, and that the atomic nucleus is a collection of protons and neutrons. The more powerful probes provided by high energy particle accelerators have taught us that a nucleon is itself made of objects called quarks. The forces among quarks and electrons are understood within a general theoretical framework called the "standard model," that accounts for all interactions observed in high energy laboratory experiments to date. These are commonly categorized as the "strong," "weak" and "electromagnetic" interactions. In this lecture I will describe the standard model, and point out some of its limitations. Probing for deeper structures in quarks and electrons defines the present frontier of particle physics. I will discuss some speculative ideas about extensions of the standard model and/or yet more fundamental forces that may underlie our present picture. 11 figs., 1 tab

  18. Inflation in the standard cosmological model

    Science.gov (United States)

    Uzan, Jean-Philippe

    2015-12-01

    The inflationary paradigm is now part of the standard cosmological model as a description of its primordial phase. While its original motivation was to solve the standard problems of the hot big bang model, it was soon understood that it offers a natural theory for the origin of the large-scale structure of the universe. Most models rely on a slow-rolling scalar field and enjoy very generic predictions. Besides, all the matter of the universe is produced by the decay of the inflaton field at the end of inflation during a phase of reheating. These predictions can be (and are) tested from their imprint of the large-scale structure and in particular the cosmic microwave background. Inflation stands as a window in physics where both general relativity and quantum field theory are at work and which can be observationally studied. It connects cosmology with high-energy physics. Today most models are constructed within extensions of the standard model, such as supersymmetry or string theory. Inflation also disrupts our vision of the universe, in particular with the ideas of chaotic inflation and eternal inflation that tend to promote the image of a very inhomogeneous universe with fractal structure on a large scale. This idea is also at the heart of further speculations, such as the multiverse. This introduction summarizes the connections between inflation and the hot big bang model and details the basics of its dynamics and predictions.

  19. A comparison of radiological risk assessment models: Risk assessment models used by the BEIR V Committee, UNSCEAR, ICRP, and EPA (for NESHAP)

    International Nuclear Information System (INIS)

    Wahl, L.E.

    1994-03-01

    Radiological risk assessments and resulting risk estimates have been developed by numerous national and international organizations, including the National Research Council's fifth Committee on the Biological Effects of Ionizing Radiations (BEIR V), the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), and the International Commission on Radiological Protection (ICRP). A fourth organization, the Environmental Protection Agency (EPA), has also performed a risk assessment as a basis for the National Emission Standards for Hazardous Air Pollutants (NESHAP). This paper compares the EPA's model of risk assessment with the models used by the BEIR V Committee, UNSCEAR, and ICRP. Comparison is made of the values chosen by each organization for several model parameters: populations used in studies and population transfer coefficients, dose-response curves and dose-rate effects, risk projection methods, and risk estimates. This comparison suggests that the EPA has based its risk assessment on outdated information and that the organization should consider adopting the method used by the BEIR V Committee, UNSCEAR, or ICRP

  20. Simple standard problem for the Preisach moving model

    International Nuclear Information System (INIS)

    Morentin, F.J.; Alejos, O.; Francisco, C. de; Munoz, J.M.; Hernandez-Gomez, P.; Torres, C.

    2004-01-01

    The present work proposes a simple magnetic system as a candidate for a Standard Problem for Preisach-based models. The system consists of a regular square array of magnetic particles totally oriented along the direction of application of an external magnetic field. The behavior of such a system was numerically simulated for different values of the interaction between particles and of the standard deviation of the critical fields of the particles. The characteristic parameters of the Preisach moving model were worked out during simulations, i.e., the mean value and the standard deviation of the interaction field. For this system, results reveal that the mean interaction field depends linearly on the system magnetization, as the Preisach moving model predicts. Nevertheless, the standard deviation cannot be considered independent of the magnetization. In fact, the standard deviation shows a maximum at demagnetization and two minima at magnetization saturation. Furthermore, not all the demagnetization states are equivalent. The plot of standard deviation vs. magnetization is a multi-valued curve when the system undergoes an AC demagnetization procedure. In this way, the standard deviation increases as the system goes from coercivity to the AC demagnetized state
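The moving-model ingredient described above can be sketched as an array of square hysterons whose effective field includes a mean interaction term proportional to the magnetization, H_eff = H + αM; the particle count, α, and the Gaussian critical-field distribution below are illustrative choices, not the paper's parameters.

```python
# A toy Preisach moving-model sweep: symmetric hysterons with Gaussian
# critical fields, relaxed self-consistently under H_eff = H + alpha*M.
# All parameter values are invented for illustration.
import random

random.seed(1)
N, alpha = 1000, 0.05
hc = [abs(random.gauss(1.0, 0.2)) for _ in range(N)]   # hysteron critical fields
state = [-1.0] * N                                     # start negatively saturated

def magnetization(H):
    """Relax hysteron states self-consistently under applied field H."""
    for _ in range(100):                 # fixed-point iteration on M
        M = sum(state) / N
        changed = False
        for i in range(N):
            Heff = H + alpha * M         # moving-model interaction term
            if state[i] < 0 and Heff > hc[i]:
                state[i], changed = 1.0, True
            elif state[i] > 0 and Heff < -hc[i]:
                state[i], changed = -1.0, True
        if not changed:
            break
    return sum(state) / N

# Sweep the field up and back down to trace the major hysteresis loop.
Hs_up = [i * 0.1 for i in range(-30, 31)]
M_up = [magnetization(H) for H in Hs_up]
M_down = [magnetization(H) for H in reversed(Hs_up)]
print(M_up[30], M_down[30])   # H = 0 on each branch: virgin vs. remanent state
```

Because the hysteron states persist between field steps, the two branches differ at H = 0, which is the loop-memory behavior a Preisach standard problem is meant to exercise.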

  1. Assessing and improving the quality of modeling : a series of empirical studies about the UML

    NARCIS (Netherlands)

    Lange, C.F.J.

    2007-01-01

    Assessing and Improving the Quality of Modeling: A Series of Empirical Studies about the UML. This thesis addresses the assessment and improvement of the quality of modeling in software engineering. In particular, we focus on the Unified Modeling Language (UML), which is the de facto standard modeling language in software engineering.

  2. NASA Standard for Models and Simulations (M and S): Development Process and Rationale

    Science.gov (United States)

    Zang, Thomas A.; Blattnig, Steve R.; Green, Lawrence L.; Hemsch, Michael J.; Luckring, James M.; Morison, Joseph H.; Tripathi, Ram K.

    2009-01-01

    After the Columbia Accident Investigation Board (CAIB) report, the NASA Administrator at that time chartered an executive team (known as the Diaz Team) to identify the CAIB report elements with Agency-wide applicability and to develop corrective measures to address each element. This report documents the chronological development and release of an Agency-wide Standard for Models and Simulations (M&S) (NASA Standard 7009) in response to Action #4 from the report "A Renewed Commitment to Excellence: An Assessment of the NASA Agency-wide Applicability of the Columbia Accident Investigation Board Report, January 30, 2004".

  3. The Standard Model with one universal extra dimension

    Indian Academy of Sciences (India)

    An exhaustive list of the explicit expressions for all physical couplings induced by the ... the standard Green's functions, which implies that the Standard Model observables do ... renormalizability of standard Green's functions is implicit in this.

  4. Planning Model of Physics Learning In Senior High School To Develop Problem Solving Creativity Based On National Standard Of Education

    Science.gov (United States)

    Putra, A.; Masril, M.; Yurnetti, Y.

    2018-04-01

    One of the causes of students' low achievement of competence in physics learning in senior high school is that the learning process has not been able to develop students' creativity in problem solving. This indicates that teachers' learning plans are not in accordance with the National Standard of Education. This study aims to produce a reconstructed model of physics learning that fulfils the competency standards, content standards, and assessment standards in accordance with the applicable curriculum. The development process follows: needs analysis, product design, product development, implementation, and product evaluation. The research process involved two peer judgments, four expert judgments, and two study groups of senior high school students in Padang. The data obtained, in the form of qualitative and quantitative data, were collected through documentation, observation, questionnaires, and tests. The research has reached the product development stage, which produced a physics learning plan model that meets content validity and construct validity in terms of the fulfillment of Basic Competences, Content Standards, Process Standards, and Assessment Standards.

  5. Creating NDA working standards through high-fidelity spent fuel modeling

    International Nuclear Information System (INIS)

    Skutnik, Steven E.; Gauld, Ian C.; Romano, Catherine E.; Trellue, Holly

    2012-01-01

    The Next Generation Safeguards Initiative (NGSI) is developing advanced non-destructive assay (NDA) techniques for spent nuclear fuel assemblies to advance the state-of-the-art in safeguards measurements. These measurements aim beyond the capabilities of existing methods to include the evaluation of plutonium and fissile material inventory, independent of operator declarations. Testing and evaluation of advanced NDA performance will require reference assemblies with well-characterized compositions to serve as working standards against which the NDA methods can be benchmarked and for uncertainty quantification. To support the development of standards for the NGSI spent fuel NDA project, high-fidelity modeling of irradiated fuel assemblies is being performed to characterize fuel compositions and radiation emission data. The assembly depletion simulations apply detailed operating history information and core simulation data as it is available to perform high fidelity axial and pin-by-pin fuel characterization for more than 1600 nuclides. The resulting pin-by-pin isotopic inventories are used to optimize the NDA measurements and provide information necessary to unfold and interpret the measurement data, e.g., passive gamma emitters, neutron emitters, neutron absorbers, and fissile content. A key requirement of this study is the analysis of uncertainties associated with the calculated compositions and signatures for the standard assemblies; uncertainties introduced by the calculation methods, nuclear data, and operating information. An integral part of this assessment involves the application of experimental data from destructive radiochemical assay to assess the uncertainty and bias in computed inventories, the impact of parameters such as assembly burnup gradients and burnable poisons, and the influence of neighboring assemblies on periphery rods. 
This paper will present the results of high fidelity assembly depletion modeling and uncertainty analysis from independent

  6. Designing Standardized Patient Assessments to Measure SBIRT Skills

    Science.gov (United States)

    Wamsley, Maria A.; Julian, Katherine A.; O'Sullivan, Patricia; Satterfield, Jason M.; Satre, Derek D.; McCance-Katz, Elinore; Batki, Steven L.

    2013-01-01

    Objectives: Resident physicians report insufficient experience caring for patients with substance use disorders (SUDs). Resident training in Screening, Brief Intervention, and Referral to Treatment (SBIRT) has been recommended. We describe the development of a standardized patient (SP) assessment to measure SBIRT skills, resident perceptions of…

  7. Primordial lithium and the standard model(s)

    International Nuclear Information System (INIS)

    Deliyannis, C.P.; Demarque, P.; Kawaler, S.D.; Krauss, L.M.; Romanelli, P.

    1989-01-01

    We present the results of new theoretical work on surface ⁷Li and ⁶Li evolution in the oldest halo stars, along with a new and refined analysis of the predicted primordial lithium abundance resulting from big-bang nucleosynthesis. This allows us to determine the constraints which can be imposed upon cosmology by a consideration of primordial lithium using both standard big-bang and standard stellar-evolution models. Such considerations lead to a constraint on the baryon density today of 0.0044 < Ω_B h² < 0.025 (where the Hubble constant is 100h km s⁻¹ Mpc⁻¹), and impose limitations on alternative nucleosynthesis scenarios
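The arithmetic behind a bound of this kind is a one-line conversion: the abstract constrains the combination Ω_B h², so the allowed baryon density Ω_B depends on the chosen Hubble parameter. A hedged sketch follows; the choice h = 0.7 is an illustrative assumption, not a value from the paper.

```python
# Convert bounds on Omega_B * h^2 into bounds on Omega_B for a chosen h.
# h = 0.7 is an illustrative assumption; the bounds are those quoted above.
h = 0.7
lower, upper = 0.0044, 0.025                  # bounds on Omega_B * h^2
omega_b_range = (lower / h**2, upper / h**2)
# For h = 0.7 the allowed baryon density is roughly 0.9% to 5.1% of critical.
```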

  8. Training in Vocational Assessment: Preparing Rehabilitation Counselors and Meeting the Requirements of the CORE Standards

    Science.gov (United States)

    Tansey, Timothy N.

    2008-01-01

    Assessment represents a foundational component of rehabilitation counseling services. The revised Council on Rehabilitation Education (CORE) standards implemented in 2004 resulted in the redesign of the knowledge and outcomes under the Assessment standard. The author reviews the current CORE standard for training in assessment within the context…

  9. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  10. Evaluation of HVS models in the application of medical image quality assessment

    Science.gov (United States)

    Zhang, L.; Cavaro-Menard, C.; Le Callet, P.

    2012-03-01

    In this study, four of the most widely used Human Visual System (HVS) models are applied to Magnetic Resonance (MR) images for a signal detection task. Their performances are evaluated against a gold standard derived from radiologists' majority decision. Task-based image quality assessment requires taking into account the specificities of human perception, for which various HVS models have been proposed. To our knowledge, however, no work has been conducted to evaluate and compare the suitability of these models for the assessment of medical image quality. This pioneering study investigates the performance of different HVS models on medical images in terms of approximation to radiologist performance. We propose to score the performance of each HVS model using the AUC (Area Under the receiver operating characteristic Curve) and its variance estimate as the figure of merit. The radiologists' majority decision is used as the gold standard, so that the estimated AUC measures the distance between the HVS model and radiologist perception. To calculate the variance estimate of the AUC, we adopted the one-shot method, which is independent of the HVS model's output range. The results of this study will help provide arguments for the application of an HVS model in our future medical image quality assessment metric.
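The scoring procedure described here can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the AUC is computed as a rank statistic against the radiologists' majority decision, and the well-known Hanley-McNeil formula stands in for the paper's one-shot variance method. All scores are invented.

```python
# Score an HVS model's outputs against a gold standard with AUC and a variance
# estimate. Hanley-McNeil is used here as a stand-in for the one-shot method.

def auc_mann_whitney(pos_scores, neg_scores):
    """AUC as the probability that a signal-present score outranks a
    signal-absent score (ties count 1/2)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

def hanley_mcneil_variance(auc, n_pos, n_neg):
    """Classic parametric variance estimate for an AUC."""
    q1 = auc / (2.0 - auc)
    q2 = 2.0 * auc * auc / (1.0 + auc)
    return (auc * (1 - auc) + (n_pos - 1) * (q1 - auc**2)
            + (n_neg - 1) * (q2 - auc**2)) / (n_pos * n_neg)

# Model outputs on images the radiologists' majority labeled present/absent:
pos = [0.9, 0.8, 0.75, 0.6]
neg = [0.4, 0.3, 0.65, 0.2]
auc = auc_mann_whitney(pos, neg)
var = hanley_mcneil_variance(auc, len(pos), len(neg))
```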

  11. Assessment of hospital performance with a case-mix standardized mortality model using an existing administrative database in Japan

    Directory of Open Access Journals (Sweden)

    Fushimi Kiyohide

    2010-05-01

    Full Text Available Abstract Background Few studies have examined whether risk adjustment is evenly applicable to hospitals with various characteristics and case-mix. In this study, we applied a generic prediction model to nationwide discharge data from hospitals with various characteristics. Method We used standardized data of 1,878,767 discharged patients provided by 469 hospitals from July 1 to October 31, 2006. We generated and validated a case-mix in-hospital mortality prediction model using 50/50 split-sample validation. We classified hospitals into two groups based on c-index value (hospitals with c-index ≥ 0.8; hospitals with c-index < 0.8). Results The model demonstrated excellent discrimination, as indicated by the high average c-index and small standard deviation (c-index = 0.88 ± 0.04). Expected mortality rate of each hospital was highly correlated with observed mortality rate (r = 0.693). Conclusion The model fits well to a group of hospitals with a wide variety of acute care events, though model fit is less satisfactory for specialized hospitals and those with convalescent wards. Further sophistication of the generic prediction model would be recommended to obtain indices optimally adapted to region-specific conditions.
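The c-index used as the figure of merit above can be computed directly from predicted risks and observed outcomes; for binary in-hospital mortality it coincides with the AUC. A minimal sketch with invented numbers, not the study's data:

```python
# c-index of a mortality prediction model for one hospital (toy data).

def c_index(predictions, outcomes):
    """Probability that a patient who died received a higher predicted risk
    than a patient who survived (ties count 1/2)."""
    died = [p for p, y in zip(predictions, outcomes) if y == 1]
    survived = [p for p, y in zip(predictions, outcomes) if y == 0]
    concordant = sum(1.0 if d > s else 0.5 if d == s else 0.0
                     for d in died for s in survived)
    return concordant / (len(died) * len(survived))

# Predicted in-hospital mortality risks and observed outcomes (1 = died):
risk = [0.02, 0.10, 0.40, 0.45, 0.70, 0.15]
dead = [0,    0,    1,    0,    1,    0]
c = c_index(risk, dead)
```

A c-index of 0.5 means the model ranks patients no better than chance; values near the paper's 0.88 indicate strong discrimination.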

  12. Physics beyond the standard model in the non-perturbative unification scheme

    International Nuclear Information System (INIS)

    Kapetanakis, D.; Zoupanos, G.

    1990-01-01

    The non-perturbative unification scenario predicts reasonably well the low energy gauge couplings of the standard model. Agreement with the measured low energy couplings is obtained by assuming a certain kind of physics beyond the standard model. A number of possibilities for physics beyond the standard model are examined. The best candidates so far are the standard model with eight fermionic families and a similar number of Higgs doublets, and the supersymmetric standard model with five families. (author)

  13. Control system architecture: The standard and non-standard models

    International Nuclear Information System (INIS)

    Thuot, M.E.; Dalesio, L.R.

    1993-01-01

    Control system architecture development has followed the advances in computer technology through mainframes to minicomputers to micros and workstations. This technology advance and increasingly challenging accelerator data acquisition and automation requirements have driven control system architecture development. In summarizing the progress of control system architecture at the last International Conference on Accelerator and Large Experimental Physics Control Systems (ICALEPCS), B. Kuiper asserted that the system architecture issue was resolved and presented a "standard model". The "standard model" consists of a local area network (Ethernet or FDDI) providing communication between front-end microcomputers, connected to the accelerator, and workstations, providing the operator interface and computational support. Although this model represents many present designs, there are exceptions, including reflected memory and hierarchical architectures driven by requirements for widely dispersed, large channel count or tightly coupled systems. This paper describes the performance characteristics and features of the "standard model" to determine if the requirements of "non-standard" architectures can be met. Several possible extensions to the "standard model" are suggested, including software as well as hardware architectural features

  14. Conformal standard model with an extended scalar sector

    Energy Technology Data Exchange (ETDEWEB)

    Latosiński, Adam [Max-Planck-Institut für Gravitationsphysik (Albert-Einstein-Institut),Mühlenberg 1, D-14476 Potsdam (Germany); Lewandowski, Adrian; Meissner, Krzysztof A. [Faculty of Physics, University of Warsaw,Pasteura 5, 02-093 Warsaw (Poland); Nicolai, Hermann [Max-Planck-Institut für Gravitationsphysik (Albert-Einstein-Institut),Mühlenberg 1, D-14476 Potsdam (Germany)

    2015-10-26

    We present an extended version of the Conformal Standard Model (characterized by the absence of any new intermediate scales between the electroweak scale and the Planck scale) with an enlarged scalar sector coupling to right-chiral neutrinos. The scalar potential and the Yukawa couplings involving only right-chiral neutrinos are invariant under a new global symmetry SU(3)_N that complements the standard U(1)_{B−L} symmetry, and is broken explicitly only by the Yukawa interaction, of order O(10⁻⁶), coupling right-chiral neutrinos and the electroweak lepton doublets. We point out four main advantages of this enlargement, namely: (1) the economy of the (non-supersymmetric) Standard Model, and thus its observational success, is preserved; (2) thanks to the enlarged scalar sector, the RG-improved one-loop effective potential is everywhere positive with a stable global minimum, thereby avoiding the notorious instability of the Standard Model vacuum; (3) the pseudo-Goldstone bosons resulting from spontaneous breaking of the SU(3)_N symmetry are natural Dark Matter candidates with calculable small masses and couplings; and (4) the Majorana Yukawa coupling matrix acquires a form naturally adapted to leptogenesis. The model is made perturbatively consistent up to the Planck scale by imposing the vanishing of quadratic divergences at the Planck scale ('softly broken conformal symmetry'). Observable consequences of the model occur mainly via the mixing of the new scalars and the Standard Model Higgs boson.

  15. Standard model Higgs physics at colliders

    International Nuclear Information System (INIS)

    Rosca, A.

    2007-01-01

    In this report we briefly review the experimental status and prospects to verify the Higgs mechanism of spontaneous symmetry breaking. The focus is on the most relevant aspects of the phenomenology of the Standard Model Higgs boson at current (Tevatron) and future (Large Hadron Collider, LHC and International Linear Collider, ILC) particle colliders. We review the Standard Model searches: searches at the Tevatron, the program planned at the LHC and prospects at the ILC. Emphasis is put on what follows after a candidate discovery at the LHC: the various measurements which are necessary to precisely determine what the properties of this Higgs candidate are. (author)

  16. DYNAMO-HIA--a Dynamic Modeling tool for generic Health Impact Assessments.

    Directory of Open Access Journals (Sweden)

    Stefan K Lhachimi

    Full Text Available BACKGROUND: Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. METHODS AND RESULTS: DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures--e.g. life expectancy and disease-free life expectancy--and detailed data--e.g. prevalences and mortality/survival rates--by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking.
CONCLUSION: By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based
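A minimal sketch, with invented rates, of the kind of Markov bookkeeping such a tool performs: a cohort distributed over explicit risk-factor states with state-specific mortality, projected year by year for a reference and an intervention scenario. This is not DYNAMO-HIA's implementation.

```python
# Markov-style projection over explicit risk-factor states (all rates invented).

def project(cohort, mortality, years):
    """cohort: {state: persons alive}; mortality: {state: annual death rate}.
    Returns total person-years lived, a building block of life expectancy."""
    person_years = 0.0
    for _ in range(years):
        for state, alive in cohort.items():
            survivors = alive * (1.0 - mortality[state])
            person_years += survivors
            cohort[state] = survivors
    return person_years

reference    = {"smoker": 300.0, "non_smoker": 700.0}
intervention = {"smoker": 200.0, "non_smoker": 800.0}  # policy shifts 100 people
mortality = {"smoker": 0.02, "non_smoker": 0.01}

py_ref = project(dict(reference), mortality, years=10)
py_int = project(dict(intervention), mortality, years=10)
# The intervention scenario accumulates more person-years than the reference.
```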

  17. Standardization of milk mid-infrared spectrometers for the transfer and use of multiple models.

    Science.gov (United States)

    Grelet, C; Pierna, J A Fernández; Dardenne, P; Soyeurt, H; Vanlierde, A; Colinet, F; Bastin, C; Gengler, N; Baeten, V; Dehareng, F

    2017-10-01

    An increasing number of models are being developed to provide information from milk Fourier transform mid-infrared (FT-MIR) spectra on fine milk composition, technological properties of milk, or even cows' physiological status. In this context, and to take advantage of these existing models, the purpose of this work was to evaluate whether a spectral standardization method can enable the use of multiple equations within a network of different FT-MIR spectrometers. The piecewise direct standardization method was used, matching "slave" instruments to a common reference, the "master." The effect of standardization on network reproducibility was assessed on 66 instruments from 3 different brands by comparing the spectral variability of the slaves and the master with and without standardization. With standardization, the global Mahalanobis distance from the slave spectra to the master spectra was reduced on average from 2,655.9 to 14.3, representing a significant reduction of noninformative spectral variability. The transfer of models from instrument to instrument was tested using 3 FT-MIR models predicting (1) the quantity of daily methane emitted by dairy cows, (2) the concentration of polyunsaturated fatty acids in milk, and (3) the fresh cheese yield. The differences, in terms of root mean squared error, between master predictions and slave predictions were reduced after standardization on average from 103 to 17 g/d, from 0.0315 to 0.0045 g/100 mL of milk, and from 2.55 to 0.49 g of curd/100 g of milk, respectively. For all the models, standard deviations of predictions among all the instruments were also reduced by 5.11 times for methane, 5.01 times for polyunsaturated fatty acids, and 7.05 times for fresh cheese yield, showing an improvement of prediction reproducibility within the network. 
Regarding the results obtained, spectral standardization allows the transfer and use of multiple models on all instruments as well as the improvement of spectral and prediction
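The piecewise direct standardization step named above can be sketched as follows: each master-instrument wavelength is modeled from a small window of slave-instrument wavelengths by least squares on spectra of the same samples. The windowed least-squares form is the textbook PDS recipe; the data, window size, and instrument distortion here are all synthetic.

```python
import numpy as np

# Piecewise direct standardization (PDS) sketch on synthetic spectra.

def pds_fit(master, slave, window=2):
    """master, slave: (n_samples, n_wavelengths) spectra of the same samples
    measured on both instruments. Returns one local model per wavelength."""
    n_wl = master.shape[1]
    model = []
    for j in range(n_wl):
        lo, hi = max(0, j - window), min(n_wl, j + window + 1)
        beta, *_ = np.linalg.lstsq(slave[:, lo:hi], master[:, j], rcond=None)
        model.append((lo, hi, beta))
    return model

def pds_apply(model, spectrum):
    """Map one slave spectrum onto the master instrument's response."""
    return np.array([spectrum[lo:hi] @ beta for lo, hi, beta in model])

rng = np.random.default_rng(0)
master = rng.normal(size=(30, 12))
# Slave instrument: a gain error plus leakage from the neighboring wavelength.
slave = 0.9 * master + 0.1 * np.roll(master, 1, axis=1)

model = pds_fit(master, slave)
standardized = np.vstack([pds_apply(model, s) for s in slave])
err_before = np.abs(slave - master).mean()
err_after = np.abs(standardized - master).mean()
```

After standardization the spectral distance to the master shrinks sharply, mirroring the Mahalanobis-distance reduction reported in the abstract.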

  18. Standard Model measurements with the ATLAS detector

    Directory of Open Access Journals (Sweden)

    Hassani Samira

    2015-01-01

    Full Text Available Various Standard Model measurements have been performed in proton-proton collisions at a centre-of-mass energy of √s = 7 and 8 TeV using the ATLAS detector at the Large Hadron Collider. A review of a selection of the latest results of electroweak measurements, W/Z production in association with jets, jet physics and soft QCD is given. Measurements are in general found to be well described by the Standard Model predictions.

  19. Integration of nursing assessment concepts into the medical entities dictionary using the LOINC semantic structure as a terminology model.

    Science.gov (United States)

    Cieslowski, B J; Wajngurt, D; Cimino, J J; Bakken, S

    2001-01-01

    Recent investigations have tested the applicability of various terminology models for representing nursing concepts, including those related to nursing diagnoses, nursing interventions, and standardized nursing assessments, as a prerequisite for building a reference terminology that supports the nursing domain. We used the semantic structure of Clinical LOINC (Logical Observations, Identifiers, Names, and Codes) as a reference terminology model to support the integration of standardized assessment terms from two nursing terminologies into the Medical Entities Dictionary (MED), the concept-oriented, metadata dictionary at New York Presbyterian Hospital. Although the LOINC semantic structure was used previously to represent laboratory terms in the MED, selected hierarchies and semantic slots required revisions in order to incorporate the nursing assessment concepts. This project was an initial step in integrating nursing assessment concepts into the MED in a manner consistent with evolving standards for reference terminology models. Moreover, the revisions provide the foundation for adding other types of standardized assessments to the MED.
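As a hedged illustration of the terminology model in question: Clinical LOINC decomposes each observation term along six semantic axes (component, property, time aspect, system, scale type, method type). The sketch below models that structure as a record type; the example entry and its field values are invented, not drawn from the MED.

```python
# A LOINC-style six-axis concept record (illustrative; example values invented).
from dataclasses import dataclass

@dataclass(frozen=True)
class LoincStyleConcept:
    component: str   # what is observed
    property: str    # kind of quantity or attribute
    timing: str      # point in time vs. interval
    system: str      # specimen or body system
    scale: str       # quantitative, ordinal, nominal, ...
    method: str      # how the observation was made

pain_assessment = LoincStyleConcept(
    component="Pain severity",
    property="Score",
    timing="Pt",                # point in time
    system="^Patient",
    scale="Ord",
    method="Standardized nursing assessment")
```

Fixing the slots this way is what lets assessment terms from different nursing terminologies be integrated into one dictionary: two source terms that fill the six axes identically map to the same concept.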

  20. The effective Standard Model after LHC Run I

    International Nuclear Information System (INIS)

    Ellis, John; Sanz, Verónica; You, Tevong

    2015-01-01

    We treat the Standard Model as the low-energy limit of an effective field theory that incorporates higher-dimensional operators to capture the effects of decoupled new physics. We consider the constraints imposed on the coefficients of dimension-6 operators by electroweak precision tests (EWPTs), applying a framework for the effects of dimension-6 operators on electroweak precision tests that is more general than the standard S,T formalism, and use measurements of Higgs couplings and the kinematics of associated Higgs production at the Tevatron and LHC, as well as triple-gauge couplings at the LHC. We highlight the complementarity between EWPTs, Tevatron and LHC measurements in obtaining model-independent limits on the effective Standard Model after LHC Run 1. We illustrate the combined constraints with the example of the two-Higgs doublet model.

  1. Minimal extension of the standard model scalar sector

    International Nuclear Information System (INIS)

    O'Connell, Donal; Wise, Mark B.; Ramsey-Musolf, Michael J.

    2007-01-01

    The minimal extension of the scalar sector of the standard model contains an additional real scalar field with no gauge quantum numbers. Such a field does not couple to the quarks and leptons directly but rather through its mixing with the standard model Higgs field. We examine the phenomenology of this model focusing on the region of parameter space where the new scalar particle is significantly lighter than the usual Higgs scalar and has small mixing with it. In this region of parameter space most of the properties of the additional scalar particle are independent of the details of the scalar potential. Furthermore the properties of the scalar that is mostly the standard model Higgs can be drastically modified since its dominant branching ratio may be to a pair of the new lighter scalars

  2. The Effective Standard Model after LHC Run I

    CERN Document Server

    Ellis, John; You, Tevong

    2015-01-01

    We treat the Standard Model as the low-energy limit of an effective field theory that incorporates higher-dimensional operators to capture the effects of decoupled new physics. We consider the constraints imposed on the coefficients of dimension-6 operators by electroweak precision tests (EWPTs), applying a framework for the effects of dimension-6 operators on electroweak precision tests that is more general than the standard $S,T$ formalism, and use measurements of Higgs couplings and the kinematics of associated Higgs production at the Tevatron and LHC, as well as triple-gauge couplings at the LHC. We highlight the complementarity between EWPTs, Tevatron and LHC measurements in obtaining model-independent limits on the effective Standard Model after LHC Run~1. We illustrate the combined constraints with the example of the two-Higgs doublet model.

  3. From the standard model to dark matter

    International Nuclear Information System (INIS)

    Wilczek, F.

    1995-01-01

    The standard model of particle physics is marvelously successful. However, it is obviously not a complete or final theory. I shall argue here that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Taking these hints seriously, one is led to predict the existence of new types of very weakly interacting matter, stable on cosmological time scales and produced with cosmologically interesting densities--that is, "dark matter". copyright 1995 American Institute of Physics

  4. Standardized reporting for rapid relative effectiveness assessments of pharmaceuticals.

    Science.gov (United States)

    Kleijnen, Sarah; Pasternack, Iris; Van de Casteele, Marc; Rossi, Bernardette; Cangini, Agnese; Di Bidino, Rossella; Jelenc, Marjetka; Abrishami, Payam; Autti-Rämö, Ilona; Seyfried, Hans; Wildbacher, Ingrid; Goettsch, Wim G

    2014-11-01

    Many European countries perform rapid assessments of the relative effectiveness (RE) of pharmaceuticals as part of the reimbursement decision making process. Increased sharing of information on RE across countries may save costs and reduce duplication of work. The objective of this article is to describe the development of a tool for rapid assessment of RE of new pharmaceuticals that enter the market, the HTA Core Model® for Rapid Relative Effectiveness Assessment (REA) of Pharmaceuticals. Eighteen member organisations of the European Network of Health Technology Assessment (EUnetHTA) participated in the development of the model. Different versions of the model were developed and piloted in this collaboration and adjusted accordingly based on feedback on the content and feasibility of the model. The final model deviates from the traditional HTA Core Model® used for assessing other types of technologies. This is due to the limited scope (strong focus on RE), the timing of the assessment (just after market authorisation), and strict timelines (e.g. 90 days) required for performing the assessment. The number of domains and assessment elements was limited and it was decided that the primary information sources should preferably be a submission file provided by the marketing authorisation holder and the European Public Assessment Report. The HTA Core Model® for Rapid REA (version 3.0) was developed to produce standardised transparent RE information of pharmaceuticals. Further piloting can provide input for possible improvements, such as further refining the assessment elements and new methodological guidance on relevant areas.

  5. Enhancements to ASHRAE Standard 90.1 Prototype Building Models

    Energy Technology Data Exchange (ETDEWEB)

    Goel, Supriya; Athalye, Rahul A.; Wang, Weimin; Zhang, Jian; Rosenberg, Michael I.; Xie, YuLong; Hart, Philip R.; Mendon, Vrushali V.

    2014-04-16

    This report focuses on enhancements to prototype building models used to determine the energy impact of various versions of ANSI/ASHRAE/IES Standard 90.1. Since the last publication of the prototype building models, PNNL has made numerous enhancements to the original prototype models compliant with the 2004, 2007, and 2010 editions of Standard 90.1. Those enhancements are described here and were made for several reasons: (1) to change or improve prototype design assumptions; (2) to improve the simulation accuracy; (3) to improve the simulation infrastructure; and (4) to add additional detail to the models needed to capture certain energy impacts from Standard 90.1 improvements. These enhancements impact simulated prototype energy use, and consequently impact the savings estimated from edition to edition of Standard 90.1.

  6. Consistent Conformal Extensions of the Standard Model arXiv

    CERN Document Server

    Loebbert, Florian; Plefka, Jan

    The question of whether classically conformal modifications of the standard model are consistent with experimental observations has recently been subject to renewed interest. The method of Gildener and Weinberg provides a natural framework for the study of the effective potential of the resulting multi-scalar standard model extensions. This approach relies on the assumption of the ordinary loop hierarchy $\lambda_\text{s} \sim g^2_\text{g}$ of scalar and gauge couplings. On the other hand, Andreassen, Frost and Schwartz recently argued that in the (single-scalar) standard model, gauge-invariant results require the consistent scaling $\lambda_\text{s} \sim g^4_\text{g}$. In the present paper we contrast these two hierarchy assumptions and illustrate the differences in the phenomenological predictions of minimal conformal extensions of the standard model.

  7. A 'theory of everything'? [Extending the Standard Model

    International Nuclear Information System (INIS)

    Ross, G.G.

    1993-01-01

    The Standard Model provides us with an amazingly successful theory of the strong, weak and electromagnetic interactions. Despite this, many physicists believe it represents only a step towards understanding the ultimate "theory of everything". In this article we describe why the Standard Model is thought to be incomplete and some of the suggestions for its extension. (Author)

  8. Convergence or Divergence: Alignment of Standards, Assessment, and Issues of Diversity.

    Science.gov (United States)

    Carter, Norvella, Ed.

    In this report, teacher educators scrutinize the relationships between the standards and assessment movement in education and the United States' increasingly multicultural population. The papers include: "Foreword" (Jacqueline Jordan Irvine); (1) "Diversity and Standards: Defining the Issues" (Norvella P. Carter); (2) "Accountability and…

  9. Towards LHC physics with nonlocal Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Biswas, Tirthabir, E-mail: tbiswas@loyno.edu [Department of Physics, Loyola University, 6363 St. Charles Avenue, Box 92, New Orleans, LA 70118 (United States); Okada, Nobuchika, E-mail: okadan@ua.edu [Department of Physics and Astronomy, University of Alabama, Tuscaloosa, AL 35487-0324 (United States)

    2015-09-15

    We take a few steps towards constructing a string-inspired nonlocal extension of the Standard Model. We start by illustrating how quantum loop calculations can be performed in nonlocal scalar field theory. In particular, we show the potential to address the hierarchy problem in the nonlocal framework. Next, we construct a nonlocal abelian gauge model and derive modifications of the gauge interaction vertex and field propagators. We apply the modifications to a toy version of the nonlocal Standard Model and investigate collider phenomenology. We find the lower bound on the scale of nonlocality from the 8 TeV LHC data to be 2.5–3 TeV.

  10. Noncommutative geometry and the standard model vacuum

    International Nuclear Information System (INIS)

    Barrett, John W.; Dawe Martins, Rachel A.

    2006-01-01

    The space of Dirac operators for the Connes-Chamseddine spectral action for the standard model of particle physics coupled to gravity is studied. The model is extended by including right-handed neutrino states, and the S⁰-reality axiom is not assumed. The possibility of allowing more general fluctuations than the inner fluctuations of the vacuum is proposed. The maximal case of all possible fluctuations is studied by considering the equations of motion for the vacuum. While there are interesting nontrivial vacua with Majorana-type mass terms for the leptons, the conclusion is that the equations are too restrictive to allow solutions with the standard model mass matrix

  11. Beyond the standard model in many directions

    Energy Technology Data Exchange (ETDEWEB)

    Chris Quigg

    2004-04-28

    These four lectures constitute a gentle introduction to what may lie beyond the standard model of quarks and leptons interacting through SU(3)_c ⊗ SU(2)_L ⊗ U(1)_Y gauge bosons, prepared for an audience of graduate students in experimental particle physics. In the first lecture, I introduce a novel graphical representation of the particles and interactions, the double simplex, to elicit questions that motivate our interest in physics beyond the standard model, without recourse to equations and formalism. Lecture 2 is devoted to a short review of the current status of the standard model, especially the electroweak theory, which serves as the point of departure for our explorations. The third lecture is concerned with unified theories of the strong, weak, and electromagnetic interactions. In the fourth lecture, I survey some attempts to extend and complete the electroweak theory, emphasizing some of the promise and challenges of supersymmetry. A short concluding section looks forward.

  12. Discrete symmetry breaking beyond the standard model

    NARCIS (Netherlands)

    Dekens, Wouter Gerard

    2015-01-01

    The current knowledge of elementary particles and their interactions is summarized in the Standard Model of particle physics. Practically all the predictions of this model, that have been tested, were confirmed experimentally. Nonetheless, there are phenomena which the model cannot explain. For

  13. Comparing Panelists' Understanding of Standard Setting across Multiple Levels of an Alternate Science Assessment

    Science.gov (United States)

    Hansen, Mary A.; Lyon, Steven R.; Heh, Peter; Zigmond, Naomi

    2013-01-01

    Large-scale assessment programs, including alternate assessments based on alternate achievement standards (AA-AAS), must provide evidence of technical quality and validity. This study provides information about the technical quality of one AA-AAS by evaluating the standard setting for the science component. The assessment was designed to have…

  14. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  15. Assessing flood risk at the global scale: model setup, results, and sensitivity

    International Nuclear Information System (INIS)

    Ward, Philip J; Jongman, Brenden; Weiland, Frederiek Sperna; Winsemius, Hessel C; Bouwman, Arno; Ligtvoet, Willem; Van Beek, Rens; Bierkens, Marc F P

    2013-01-01

    Globally, economic losses from flooding exceeded $19 billion in 2012, and are rising rapidly. Hence, there is an increasing need for global-scale flood risk assessments, also within the context of integrated global assessments. We have developed and validated a model cascade for producing global flood risk maps, based on numerous flood return-periods. Validation results indicate that the model simulates interannual fluctuations in flood impacts well. The cascade involves: hydrological and hydraulic modelling; extreme value statistics; inundation modelling; flood impact modelling; and estimating annual expected impacts. The initial results estimate global impacts for several indicators, for example annual expected exposed population (169 million); and annual expected exposed GDP ($1383 billion). These results are relatively insensitive to the extreme value distribution employed to estimate low frequency flood volumes. However, they are extremely sensitive to the assumed flood protection standard; developing a database of such standards should be a research priority. Also, results are sensitive to the use of two different climate forcing datasets. The impact model can easily accommodate new, user-defined, impact indicators. We envisage several applications, for example: identifying risk hotspots; calculating macro-scale risk for the insurance industry and large companies; and assessing potential benefits (and costs) of adaptation measures. (letter)
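The annual expected impacts reported above come from integrating impact over annual flood exceedance probability across return periods. A minimal sketch of that integration (trapezoidal rule over the risk curve; the impact figures are made up for illustration, not taken from the study):

```python
def expected_annual_impact(return_periods, impacts):
    """Integrate impact over annual exceedance probability (trapezoidal rule).

    return_periods: flood return periods in years (e.g. 2, 10, 100).
    impacts: impact (e.g. exposed GDP) at each return period.
    """
    # Convert return periods to annual exceedance probabilities.
    probs = [1.0 / t for t in return_periods]
    # Order by decreasing probability (increasing return period).
    pairs = sorted(zip(probs, impacts), reverse=True)
    ead = 0.0
    for (p1, i1), (p2, i2) in zip(pairs, pairs[1:]):
        ead += 0.5 * (i1 + i2) * (p1 - p2)  # trapezoid between the two points
    return ead

# Illustrative (hypothetical) impacts at 2-, 10- and 100-year return periods:
print(expected_annual_impact([2, 10, 100], [100.0, 400.0, 900.0]))
```

The result is sensitive to the assumed flood protection standard because protected return periods are simply truncated from the low end of the curve before integrating.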

  16. Standard-model bundles

    CERN Document Server

    Donagi, Ron; Pantev, Tony; Waldram, Dan; Donagi, Ron; Ovrut, Burt; Pantev, Tony; Waldram, Dan

    2002-01-01

    We describe a family of genus one fibered Calabi-Yau threefolds with fundamental group ${\\mathbb Z}/2$. On each Calabi-Yau $Z$ in the family we exhibit a positive dimensional family of Mumford stable bundles whose symmetry group is the Standard Model group $SU(3)\\times SU(2)\\times U(1)$ and which have $c_{3} = 6$. We also show that for each bundle $V$ in our family, $c_{2}(Z) - c_{2}(V)$ is the class of an effective curve on $Z$. These conditions ensure that $Z$ and $V$ can be used for a phenomenologically relevant compactification of Heterotic M-theory.

  17. Xpand chest drain: assessing equivalence to current standard ...

    African Journals Online (AJOL)

    leakage from 'open to air' system or breakage of glass bottle (with associated risk to ... and an air-leak detection system. It is connected to a ... need to add water. Xpand chest drain: assessing equivalence to current standard therapy – a randomised controlled trial. CHARL COOPER, M.B. CH.B. TIMOTHY HARDCASTLE ...

  18. Multislice computed tomography: angiographic emulation versus standard assessment for detection of coronary stenoses

    Energy Technology Data Exchange (ETDEWEB)

    Schnapauff, Dirk; Hamm, Bernd; Dewey, Marc [Humboldt-Universitaet zu Berlin, Department of Radiology, Charite - Universitaetsmedizin Berlin, Chariteplatz 1, P.O. Box 10098, Berlin (Germany); Duebel, Hans-Peter; Baumann, Gert [Charite - Universitaetsmedizin Berlin, Department of Cardiology, Berlin (Germany); Scholze, Juergen [Charite - Universitaetsmedizin Berlin, Charite Outpatient Centre, Berlin (Germany)

    2007-07-15

    The present study investigated angiographic emulation of multislice computed tomography (MSCT) (catheter-like visualization) as an alternative approach to analyzing and visualizing findings in comparison with standard assessment. Thirty patients (120 coronary arteries) were randomly selected from 90 prospectively investigated patients with suspected coronary artery disease who underwent MSCT (16-slice scanner, 0.5 mm collimation, 400 ms rotation time) prior to conventional coronary angiography for comparison of both approaches. Sensitivity and specificity of angiographic emulation [81% (26/32) and 93% (82/88)] were not significantly different from those of standard assessment [88% (28/32) and 99% (87/88)], while the per-case analysis time was significantly shorter for angiographic emulation than for standard assessment (3.4 ± 1.5 vs 7.0 ± 2.5 min, P < 0.001). Both interventional and referring cardiologists preferred angiographic emulation over standard curved multiplanar reformations of MSCT coronary angiography for illustration, mainly because of improved overall lucidity and depiction of side branches (P < 0.001). In conclusion, angiographic emulation of MSCT reduces analysis time, yields a diagnostic accuracy comparable to that of standard assessment, and is preferred by cardiologists for visualization of results. (orig.)
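The per-artery figures quoted above follow directly from the detection counts. A small sketch reproducing the sensitivity/specificity arithmetic from the counts reported in the abstract:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from per-artery stenosis detection counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Counts from the abstract: angiographic emulation detected 26 of 32 stenosed
# and 82 of 88 non-stenosed arteries; standard assessment 28/32 and 87/88.
emulation = sens_spec(26, 32 - 26, 82, 88 - 82)
standard = sens_spec(28, 32 - 28, 87, 88 - 87)
print(f"emulation: sens {emulation[0]:.0%}, spec {emulation[1]:.0%}")
print(f"standard:  sens {standard[0]:.0%}, spec {standard[1]:.0%}")
```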

  19. CP violation in the standard model and beyond

    International Nuclear Information System (INIS)

    Buras, A.J.

    1984-01-01

    The present status of CP violation in the standard six quark model is reviewed and a combined analysis with B-meson decays is presented. The theoretical uncertainties in the analysis are discussed and the resulting KM weak mixing angles, the phase delta and the ratio epsilon'/epsilon are given as functions of τ_B, Γ(b → u)/Γ(b → c), m_t and the B parameter. For certain ranges of the values of these parameters the standard model is not capable of reproducing the experimental values for the epsilon' and epsilon parameters. Anticipating possible difficulties we discuss various alternatives to the standard explanation of CP violation such as horizontal interactions, left-right symmetric models and supersymmetry. CP violation outside the kaon system is also briefly discussed. (orig.)

  20. DIFFERENCES IN MANAGER ASSESSMENTS OF ISO 14000 STANDARD IMPLEMENTATION IN TURKEY

    Directory of Open Access Journals (Sweden)

    Sıtkı Gözlü

    2005-12-01

    Full Text Available This study reports the results of a survey about the improvements achieved as a result of ISO 14000 Environmental Management System (EMS) standard implementation and the differences in improvements with respect to firm characteristics. A survey has been conducted in order to explain the improvements related to the environmental management process and overall firm performance. The survey involved sixty-six enterprises implementing the ISO 14000 EMS standard in Turkey. In order to assess improvements obtained from ISO 14000 EMS implementation, statements related to environmental management process and overall firm performance indicators have been prepared. The statements in this study are relevant to previous research. A factor analysis was employed to determine the factors of the variables explaining improvements. Nine factors have been identified related to achieved improvements, such as establishment of a pro-active environmental management system, effectiveness in resource utilization, effectiveness of process control, relationships with industry and government, meeting expectations of stakeholders, demonstration of social responsibility, profitability, productivity, and competitiveness. Then, a T-test was conducted to determine the differences of managers’ assessments with respect to certain firm characteristics. The findings have shown that there are differences in the assessments of improvements achieved as a result of ISO 14000 EMS standard implementation with respect to sales volume, foreign-capital possession, and ISO 14000 EMS standard implementation. On the other hand, industrial sector, age of establishment, and export orientation are not statistically significant for the differences in the assessments of improvements.

  1. Standardization of figures and assessment procedures for DTM vertical accuracy

    Directory of Open Access Journals (Sweden)

    Vittorio Casella

    2015-07-01

    Full Text Available Digital Terrain Models (DTMs) are widely used in many sectors. They play a key role in hydrological risk prevention, risk mitigation and numeric simulations. This paper deals with two questions: (i) when it is stated that a DTM has a given vertical accuracy, is this assertion univocal? (ii) when DTM vertical accuracy is assessed by means of checkpoints, does their location influence results? First, the paper illustrates that two vertical accuracy definitions are conceivable: Vertical Accuracy at the Nodes (VAN), the average vertical distance between the model and the terrain, evaluated at the DTM's nodes, and Vertical Accuracy at the interpolated Points (VAP), in which the vertical distance is evaluated at generic points. These two quantities are not coincident and, when they are calculated for the same DTM, different numeric values are reached. Unfortunately, the two quantities are often interchanged, but this is misleading. Second, the paper shows a simulated example of a DTM vertical accuracy assessment, highlighting that the checkpoints’ location plays a key role: when checkpoints coincide with the DTM nodes, VAN is estimated; when checkpoints are randomly located, VAP is estimated, instead. Third, an in-depth, theoretical characterization of the two considered quantities is performed, based on symbolic computation, and suitable standardization coefficients are proposed. Finally, our discussion has a well-defined frame: it doesn't deal with all the items of the DTM vertical accuracy budget, which would require a much longer essay, but only with one, usually called fundamental vertical accuracy.
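The VAN/VAP distinction can be illustrated numerically: sample a synthetic terrain at DTM nodes with noise, then measure vertical RMSE at the nodes themselves (VAN) versus at randomly located checkpoints evaluated through the interpolated model (VAP). A minimal 1-D sketch under assumed (hypothetical) terrain and noise parameters:

```python
import math
import random

def terrain(x):
    """Hypothetical 'true' terrain profile."""
    return math.sin(x)

# Build a 1-D DTM: noisy samples of the terrain at regularly spaced nodes.
random.seed(0)
nodes = [i * 0.5 for i in range(21)]                     # abscissas on [0, 10]
dtm = [terrain(x) + random.gauss(0, 0.1) for x in nodes]  # noisy node heights

def dtm_height(x):
    """Linear interpolation of the DTM between adjacent nodes."""
    i = min(int(x / 0.5), len(nodes) - 2)
    t = (x - nodes[i]) / 0.5
    return (1 - t) * dtm[i] + t * dtm[i + 1]

def rmse(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# VAN: vertical accuracy evaluated at the DTM nodes themselves.
van = rmse([dtm[i] - terrain(x) for i, x in enumerate(nodes)])
# VAP: vertical accuracy at randomly located checkpoints between nodes.
pts = [random.uniform(0, 10) for _ in range(2000)]
vap = rmse([dtm_height(x) - terrain(x) for x in pts])
print(f"VAN = {van:.3f}, VAP = {vap:.3f}")  # the two figures differ
```

VAP mixes interpolation error with (partially averaged) node noise, which is why the two figures are not interchangeable.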

  2. Beyond the standard model with B and K physics

    International Nuclear Information System (INIS)

    Grossman, Y

    2003-01-01

    In the first part of the talk the flavor physics input to models beyond the standard model is described. One specific example of such a new physics model is given: a model with bulk fermions in a non-factorizable extra dimension. In the second part of the talk we discuss several observables that are sensitive to new physics. We explain what type of new physics can produce deviations from the standard model predictions in each of these observables

  3. Standardization and Scaling of a Community-Based Palliative Care Model.

    Science.gov (United States)

    Bull, Janet; Kamal, Arif H; Harker, Matthew; Taylor, Donald H; Bonsignore, Lindsay; Morris, John; Massie, Lisa; Singh Bhullar, Parampal; Howell, Mary; Hendrix, Mark; Bennett, Deeana; Abernethy, Amy

    2017-11-01

    Although limited, the descriptions of Community-Based Palliative Care (CBPC) demonstrate variability in team structures, eligibility, and standardization across care settings. In 2014, Four Seasons Compassion for Life, a nonprofit hospice and palliative care (PC) organization in Western North Carolina (WNC), was awarded a Centers for Medicare and Medicaid Services Health Care Innovation (CMMI) Award to expand upon their existing innovative model to implement, evaluate, and demonstrate CBPC in the United States. The objective of this article is to describe the processes and challenges of scaling and standardizing the CBPC model. Four Seasons' CBPC model serves patients in both inpatient and outpatient settings using an interdisciplinary team to address symptom management, psychosocial/spiritual care, advance care planning, and patient/family education. Medicare beneficiaries who are ≥65 years of age with a life-limiting illness were eligible for the CMMI project. The CBPC model was scaled across numerous counties in WNC and Upstate South Carolina. Over the first two years of the project, scaling occurred into 21 counties with the addition of 2 large hospitals, 52 nursing facilities, and 2 new clinics. To improve efficiency and effectiveness, a PC screening referral guide and a risk stratification approach were developed and implemented. Care processes, including patient referral and initial visit, were mapped. This article describes an interdisciplinary CBPC model in all care settings for individuals with life-limiting illness and offers guidance for risk stratification assessments and mapping care processes that may help PC programs as they develop and work to improve efficiencies.

  4. Searches for Beyond Standard Model Physics with ATLAS and CMS

    CERN Document Server

    Rompotis, Nikolaos; The ATLAS collaboration

    2017-01-01

    The exploration of the high energy frontier with ATLAS and CMS experiments provides one of the best opportunities to look for physics beyond the Standard Model. In this talk, I review the motivation, the strategy and some recent results related to beyond Standard Model physics from these experiments. The review will cover beyond Standard Model Higgs boson searches, supersymmetry and searches for exotic particles.

  5. Standard model and beyond

    International Nuclear Information System (INIS)

    Quigg, C.

    1984-09-01

    The SU(3)_c × SU(2)_L × U(1)_Y gauge theory of interactions among quarks and leptons is briefly described, and some recent notable successes of the theory are mentioned. Some shortcomings in our ability to apply the theory are noted, and the incompleteness of the standard model is exhibited. Experimental hints that Nature may be richer in structure than the minimal theory are discussed. 23 references

  6. Higgs triplets in the standard model

    International Nuclear Information System (INIS)

    Gunion, J.F.; Vega, R.; Wudka, J.

    1990-01-01

    Even though the standard model of the strong and electroweak interactions has proven enormously successful, it need not be the case that a single Higgs-doublet field is responsible for giving masses to the weakly interacting vector bosons and the fermions. In this paper we explore the phenomenology of a Higgs sector for the standard model which contains both doublet and triplet fields [under SU(2) L ]. The resulting Higgs bosons have many exotic features and surprising experimental signatures. Since a critical task of future accelerators will be to either discover or establish the nonexistence of Higgs bosons with mass below the TeV scale, it will be important to keep in mind the alternative possibilities characteristic of this and other nonminimal Higgs sectors

  7. Safety standards for near surface disposal and the safety case and supporting safety assessment for demonstrating compliance with the standards

    International Nuclear Information System (INIS)

    Metcalf, P.

    2003-01-01

    The report presents the safety standards for near surface disposal (ICRP guidance and IAEA standards) and the safety case and supporting safety assessment for demonstrating compliance with the standards. Special attention is paid to the recommendations for disposal of long-lived solid radioactive waste. The requirements are based on the principle that future individuals be afforded the same level of protection as the current generation. Two types of exposure are considered, human intrusion and natural processes, and protection measures are discussed. Safety requirements for near surface disposal are discussed, including requirements for protection of human health and the environment, requirements for safety assessments, waste acceptance requirements, etc.

  8. Tests of the standard electroweak model in beta decay

    Energy Technology Data Exchange (ETDEWEB)

    Severijns, N.; Beck, M. [Universite Catholique de Louvain (UCL), Louvain-la-Neuve (Belgium); Naviliat-Cuncic, O. [Caen Univ., CNRS-ENSI, 14 (France). Lab. de Physique Corpusculaire

    2006-05-15

    We review the current status of precision measurements in allowed nuclear beta decay, including neutron decay, with emphasis on their potential to look for new physics beyond the standard electroweak model. The experimental results are interpreted in the framework of phenomenological model-independent descriptions of nuclear beta decay as well as in some specific extensions of the standard model. The values of the standard couplings and the constraints on the exotic couplings of the general beta decay Hamiltonian are updated. For the ratio between the axial and the vector couplings we obtain C_A/C_V = -1.26992(69) under the standard model assumptions. Particular attention is devoted to the discussion of the sensitivity and complementarity of different precision experiments in direct beta decay. The prospects and the impact of recent developments of precision tools and of high intensity low energy beams are also addressed. (author)

  9. Tests of the standard electroweak model in beta decay

    International Nuclear Information System (INIS)

    Severijns, N.; Beck, M.; Naviliat-Cuncic, O.

    2006-05-01

    We review the current status of precision measurements in allowed nuclear beta decay, including neutron decay, with emphasis on their potential to look for new physics beyond the standard electroweak model. The experimental results are interpreted in the framework of phenomenological model-independent descriptions of nuclear beta decay as well as in some specific extensions of the standard model. The values of the standard couplings and the constraints on the exotic couplings of the general beta decay Hamiltonian are updated. For the ratio between the axial and the vector couplings we obtain C_A/C_V = -1.26992(69) under the standard model assumptions. Particular attention is devoted to the discussion of the sensitivity and complementarity of different precision experiments in direct beta decay. The prospects and the impact of recent developments of precision tools and of high intensity low energy beams are also addressed. (author)

  10. Assessing the reliability of the borderline regression method as a standard setting procedure for objective structured clinical examination

    Directory of Open Access Journals (Sweden)

    Sara Mortaz Hejri

    2013-01-01

    Full Text Available Background: One of the methods used for standard setting is the borderline regression method (BRM). This study aims to assess the reliability of BRM when the pass-fail standard in an objective structured clinical examination (OSCE) was calculated by averaging the BRM standards obtained for each station separately. Materials and Methods: In nine stations of the OSCE with direct observation the examiners gave each student a checklist score and a global score. Using a linear regression model for each station, we calculated the checklist score cut-off from the regression equation at a global scale cut-off of 2. The OSCE pass-fail standard was defined as the average of all stations' standards. To determine the reliability, the root mean square error (RMSE) was calculated. The R2 coefficient and the inter-grade discrimination were calculated to assess the quality of the OSCE. Results: The mean total test score was 60.78. The OSCE pass-fail standard and its RMSE were 47.37 and 0.55, respectively. The R2 coefficients ranged from 0.44 to 0.79. The inter-grade discrimination score varied greatly among stations. Conclusion: The RMSE of the standard was very small, indicating that BRM is a reliable method of setting a standard for an OSCE, which has the advantage of providing data for quality assurance.
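The station-level computation described above — regress checklist scores on global ratings, read the checklist cut-off at the borderline global grade, then average the cut-offs across stations — can be sketched as follows (the data are hypothetical, not from the study):

```python
def brm_cutoff(checklist, global_ratings, pass_grade=2.0):
    """Borderline regression method for one station: ordinary least squares
    of checklist score on global rating, evaluated at the borderline grade."""
    n = len(checklist)
    mx = sum(global_ratings) / n
    my = sum(checklist) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(global_ratings, checklist))
    sxx = sum((x - mx) ** 2 for x in global_ratings)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept + slope * pass_grade

# Hypothetical (checklist scores, global ratings) for two stations; the OSCE
# pass-fail standard is the average of the station-level cut-offs.
stations = [
    ([12, 15, 18, 22, 25], [1, 2, 2, 3, 4]),
    ([8, 11, 14, 17], [1, 1, 2, 3]),
]
cutoffs = [brm_cutoff(c, g) for c, g in stations]
osce_standard = sum(cutoffs) / len(cutoffs)
print(cutoffs, osce_standard)
```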

  11. Health impact assessment in the United States: Has practice followed standards?

    International Nuclear Information System (INIS)

    Schuchter, Joseph; Bhatia, Rajiv; Corburn, Jason; Seto, Edmund

    2014-01-01

    As an emerging practice, Health Impact Assessment is heterogeneous in purpose, form, and scope and applied in a wide range of decision contexts. This heterogeneity challenges efforts to evaluate the quality and impact of practice. We examined whether information in completed HIA reports reflected objectively-evaluable criteria proposed by the North American HIA Practice Standards Working Group in 2009. From publicly-available reports of HIAs conducted in the U.S. and published from 2009 to 2011, we excluded those that were components of, or comment letters on, Environmental Impact Assessments (5) or were demonstration projects or student exercises (8). For the remaining 23 reports, we used practice standards as a template to abstract data on the steps of HIA, including details on the rationale, authorship, funding, decision and decision-makers, participation, pathways and methods, quality of evidence, and recommendations. Most reports described screening, scoping, and assessment processes, but there was substantial variation in the extent of these processes and the degree of stakeholder participation. Community stakeholders participated in screening or scoping in just two-thirds of the HIAs (16). On average, these HIAs analyzed 5.5 determinants related to 10.6 health impacts. Most HIA reports did not include evaluation or monitoring plans. This study identifies issues for field development and improvement. The standards might be adapted to better account for variability in resources, produce fit-for-purpose HIAs, and facilitate innovation guided by the principles. - Highlights: • Our study examined reported HIAs in the U.S. against published practice standards. • Most HIAs used some screening, scoping and assessment elements from the standards. • The extent of these processes and stakeholder participation varied widely. • The average HIA considered multiple health determinants and impacts. • Evaluation or monitoring plans were generally not included in the reports.

  12. Health impact assessment in the United States: Has practice followed standards?

    Energy Technology Data Exchange (ETDEWEB)

    Schuchter, Joseph, E-mail: jws@berkeley.edu [University of California, Berkeley, School of Public Health, Department of Environmental Health Sciences, 50 University Hall, Berkeley, CA 94720-7360 (United States); Bhatia, Rajiv [University of California, Berkeley, Institute of Urban and Regional Development (United States); Corburn, Jason [University of California, Berkeley, College of Environmental Design, Department of City and Regional Planning (United States); Seto, Edmund [University of Washington, School of Public Health, Department of Environmental and Occupational Health (United States)

    2014-07-01

    As an emerging practice, Health Impact Assessment is heterogeneous in purpose, form, and scope and applied in a wide range of decision contexts. This heterogeneity challenges efforts to evaluate the quality and impact of practice. We examined whether information in completed HIA reports reflected objectively-evaluable criteria proposed by the North American HIA Practice Standards Working Group in 2009. From publicly-available reports of HIAs conducted in the U.S. and published from 2009 to 2011, we excluded those that were components of, or comment letters on, Environmental Impact Assessments (5) or were demonstration projects or student exercises (8). For the remaining 23 reports, we used practice standards as a template to abstract data on the steps of HIA, including details on the rationale, authorship, funding, decision and decision-makers, participation, pathways and methods, quality of evidence, and recommendations. Most reports described screening, scoping, and assessment processes, but there was substantial variation in the extent of these processes and the degree of stakeholder participation. Community stakeholders participated in screening or scoping in just two-thirds of the HIAs (16). On average, these HIAs analyzed 5.5 determinants related to 10.6 health impacts. Most HIA reports did not include evaluation or monitoring plans. This study identifies issues for field development and improvement. The standards might be adapted to better account for variability in resources, produce fit-for-purpose HIAs, and facilitate innovation guided by the principles. - Highlights: • Our study examined reported HIAs in the U.S. against published practice standards. • Most HIAs used some screening, scoping and assessment elements from the standards. • The extent of these processes and stakeholder participation varied widely. • The average HIA considered multiple health determinants and impacts. • Evaluation or monitoring plans were generally not included in the reports.

  13. Zebrafish as a correlative and predictive model for assessing biomaterial nanotoxicity.

    Science.gov (United States)

    Fako, Valerie E; Furgeson, Darin Y

    2009-06-21

    The lack of correlative and predictive models to assess acute and chronic toxicities limits the rapid pre-clinical development of new therapeutics. This barrier is due in part to the exponential growth of nanotechnology and nanotherapeutics, coupled with the lack of rigorous and robust screening assays and putative standards. It is a fairly simple and cost-effective process to initially screen the toxicity of a nanomaterial by using in vitro cell cultures; unfortunately it is nearly impossible to imitate a complementary in vivo system. Small mammalian models are the most common method used to assess possible toxicities and biodistribution of nanomaterials in humans. Alternatively, Danio rerio, commonly known as zebrafish, are proving to be a quick, cheap, and facile model to conservatively assess toxicity of nanomaterials.

  14. OSHA's approach to risk assessment for setting a revised occupational exposure standard for 1,3-butadiene.

    Science.gov (United States)

    Grossman, E A; Martonik, J

    1990-01-01

    In its 1980 benzene decision [Industrial Union Department, AFL-CIO v. American Petroleum Institute, 448 U.S. 607 (1980)], the Supreme Court ruled that "before he can promulgate any permanent health or safety standard, the Secretary [of Labor] is required to make a threshold finding that a place of employment is unsafe--in the sense that significant risks are present and can be lessened by a change in practices" (448 U.S. at 642). The Occupational Safety and Health Administration (OSHA) has interpreted this to mean that whenever possible, it must quantify the risk associated with occupational exposure to a toxic substance at the current permissible exposure limit (PEL). If OSHA determines that there is significant risk to workers' health at its current standard, then it must quantify the risk associated with a variety of alternative standards to determine at what level, if any, occupational exposure to a substance no longer poses a significant risk. For rulemaking on occupational exposure to 1,3-butadiene, there are two studies that are suitable for quantitative risk assessment. One is a mouse inhalation bioassay conducted by the National Toxicology Program (NTP), and the other is a rat inhalation bioassay conducted by Hazelton Laboratories Europe. Of the four risk assessments that have been submitted to OSHA, all four have used the mouse and/or rat data with a variety of models to quantify the risk associated with occupational exposure to 1,3-butadiene. In addition, OSHA has performed its own risk assessment using the female mouse and female rat data and the one-hit and multistage models.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:2401254
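As an illustration of one of the dose-response forms mentioned, the one-hit model takes P(d) = 1 − exp(−qd). A sketch fitting the potency q from a single bioassay point and extrapolating to a low occupational exposure; the dose and incidence numbers are hypothetical, not OSHA's:

```python
import math

def one_hit_extra_risk(dose, q):
    """One-hit (linear, no-threshold) dose-response: P(d) = 1 - exp(-q*d)."""
    return 1.0 - math.exp(-q * dose)

def fit_q(dose, extra_risk):
    """Solve for the potency q from one observed (dose, extra risk) point."""
    return -math.log(1.0 - extra_risk) / dose

# Hypothetical bioassay point: 30% extra tumour incidence at 625 ppm.
q = fit_q(625.0, 0.30)
# Extrapolated extra risk at a candidate occupational limit of 2 ppm:
print(f"{one_hit_extra_risk(2.0, q):.2e}")
```

At low doses the one-hit model is nearly linear (risk ≈ q·d), which is why it is a conservative choice for low-dose extrapolation; the multistage model generalizes the exponent to a polynomial in dose.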

  15. Experimentally testing the standard cosmological model

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N. (Chicago Univ., IL (USA) Fermi National Accelerator Lab., Batavia, IL (USA))

    1990-11-01

    The standard model of cosmology, the big bang, is now being tested and confirmed to remarkable accuracy. Recent high precision measurements relate to the microwave background and big bang nucleosynthesis. This paper focuses on the latter since that relates more directly to high energy experiments. In particular, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard cosmology scenario. Discussion is presented on the improved light element observational data as well as the improved neutron lifetime data. Alternate nucleosynthesis scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the standard model conclusion that Ω_b ∼ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible < Ω_b. Recent accelerator constraints on non-baryonic matter are discussed, showing that any massive cold dark matter candidate must now have a mass M_x ≳ 20 GeV and an interaction weaker than the Z⁰ coupling to a neutrino. It is also noted that recent hints regarding the solar neutrino experiments coupled with the see-saw model for ν-masses may imply that the ν_τ is a good hot dark matter candidate. 73 refs., 5 figs.

  16. Experimentally testing the standard cosmological model

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1990-11-01

    The standard model of cosmology, the big bang, is now being tested and confirmed to remarkable accuracy. Recent high precision measurements relate to the microwave background and big bang nucleosynthesis. This paper focuses on the latter since that relates more directly to high energy experiments. In particular, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard cosmology scenario. Discussion is presented on the improved light element observational data as well as the improved neutron lifetime data. Alternate nucleosynthesis scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the standard model conclusion that Ω_b ∼ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible < Ω_b. Recent accelerator constraints on non-baryonic matter are discussed, showing that any massive cold dark matter candidate must now have a mass M_x ≳ 20 GeV and an interaction weaker than the Z⁰ coupling to a neutrino. It is also noted that recent hints regarding the solar neutrino experiments coupled with the see-saw model for ν-masses may imply that the ν_τ is a good hot dark matter candidate. 73 refs., 5 figs

  17. The standard model on non-commutative space-time

    International Nuclear Information System (INIS)

    Calmet, X.; Jurco, B.; Schupp, P.; Wohlgenannt, M.; Wess, J.

    2002-01-01

    We consider the standard model on a non-commutative space and expand the action in the non-commutativity parameter θ^μν. No new particles are introduced; the structure group is SU(3) x SU(2) x U(1). We derive the leading order action. At zeroth order the action coincides with the ordinary standard model. At leading order in θ^μν we find new vertices which are absent in the standard model on commutative space-time. The most striking features are couplings between quarks, gluons and electroweak bosons and many new vertices in the charged and neutral currents. We find that parity is violated in non-commutative QCD. The Higgs mechanism can be applied. QED is not deformed in the minimal version of the NCSM to the order considered. (orig.)

  18. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Kevin M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Brennan T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Witt, Adam M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); DeNeale, Scott T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bevelhimer, Mark S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pries, Jason L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burress, Timothy A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kao, Shih-Chieh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mobley, Miles H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Kyutae [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Curd, Shelaine L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tsakiris, Achilleas [Univ. of Tennessee, Knoxville, TN (United States); Mooneyham, Christian [Univ. of Tennessee, Knoxville, TN (United States); Papanicolaou, Thanos [Univ. of Tennessee, Knoxville, TN (United States); Ekici, Kivanc [Univ. of Tennessee, Knoxville, TN (United States); Whisenant, Matthew J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Welch, Tim [US Department of Energy, Washington, DC (United States); Rabon, Daniel [US Department of Energy, Washington, DC (United States)

    2017-08-01

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  19. The Standard Model is Natural as Magnetic Gauge Theory

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2011-01-01

    matter. The absence of scalars in the electric theory indicates that the associated magnetic theory is free from quadratic divergences. Our novel solution to the Standard Model hierarchy problem leads also to a new insight on the mystery of the observed number of fundamental fermion generations......We suggest that the Standard Model can be viewed as the magnetic dual of a gauge theory featuring only fermionic matter content. We show this by first introducing a Pati-Salam like extension of the Standard Model and then relating it to a possible dual electric theory featuring only fermionic...

  20. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    Science.gov (United States)

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation/assessment is nowadays supported by international standards. There is a tendency to broaden its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. The model is based on well-established written standards such as the EFQM Excellence Model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data, and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to environmental research centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.
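    The aggregation step described above (hierarchies of indicators combined into a single quantitative score) can be sketched as a weighted sum. The indicator names, weights, and scores below are hypothetical, not taken from the SR EST model:

```python
# Illustrative weighted aggregation of hierarchical indicators into one
# quantitative score. All names, weights and scores are hypothetical.
INDICATORS = {
    "ethical management":   (0.25, 70.0),  # (weight, score on a 0-100 scale)
    "environmental impact": (0.30, 80.0),
    "social engagement":    (0.20, 60.0),
    "transparency":         (0.15, 90.0),
    "innovation":           (0.10, 50.0),
}

def overall_score(indicators):
    """Weight each indicator score and sum; the weights must add up to 1."""
    total_weight = sum(w for w, _ in indicators.values())
    assert abs(total_weight - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * s for w, s in indicators.values())

print(overall_score(INDICATORS))
```

    In a dual self-/external-evaluation procedure, the same aggregation could simply be run twice, once on each set of scores, and the results compared.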

  1. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them...... order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions....

  2. Edible safety requirements and assessment standards for agricultural genetically modified organisms.

    Science.gov (United States)

    Deng, Pingjian; Zhou, Xiangyang; Zhou, Peng; Du, Zhong; Hou, Hongli; Yang, Dongyan; Tan, Jianjun; Wu, Xiaojin; Zhang, Jinzhou; Yang, Yongcun; Liu, Jin; Liu, Guihua; Li, Yonghong; Liu, Jianjun; Yu, Lei; Fang, Shisong; Yang, Xiaoke

    2008-05-01

    This paper describes the background, principles, concepts and methods of framing the technical regulation for edible safety requirement and assessment of agricultural genetically modified organisms (agri-GMOs) for Shenzhen Special Economic Zone in the People's Republic of China. It provides a set of systematic criteria for edible safety requirements and the assessment process for agri-GMOs. First, focusing on the degree of risk and impact of different agri-GMOs, we developed hazard grades for toxicity, allergenicity, anti-nutrition effects, and unintended effects and standards for the impact type of genetic manipulation. Second, for assessing edible safety, we developed indexes and standards for different hazard grades of recipient organisms, for the influence of types of genetic manipulation and hazard grades of agri-GMOs. To evaluate the applicability of these criteria and their congruency with other safety assessment systems for GMOs applied by related organizations all over the world, we selected some agri-GMOs (soybean, maize, potato, capsicum and yeast) as cases to put through our new assessment system, and compared our results with the previous assessments. It turned out that the result of each of the cases was congruent with the original assessment.

  3. A Risk Assessment Example for Soil Invertebrates Using Spatially Explicit Agent-Based Models

    DEFF Research Database (Denmark)

    Reed, Melissa; Alvarez, Tania; Chelinho, Sonia

    2016-01-01

    Current risk assessment methods for measuring the toxicity of plant protection products (PPPs) on soil invertebrates use standardized laboratory conditions to determine acute effects on mortality and sublethal effects on reproduction. If an unacceptable risk is identified at the lower tier...... population models for ubiquitous soil invertebrates (collembolans and earthworms) as refinement options in current risk assessment. Both are spatially explicit agent-based models (ABMs), incorporating individual and landscape variability. The models were used to provide refined risk assessments for different...... application scenarios of a hypothetical pesticide applied to potato crops (full-field spray onto the soil surface [termed “overall”], in-furrow, and soil-incorporated pesticide applications). In the refined risk assessment, the population models suggest that soil invertebrate populations would likely recover...

  4. CP violation and electroweak baryogenesis in the Standard Model

    Directory of Open Access Journals (Sweden)

    Brauner Tomáš

    2014-04-01

    One of the major unresolved problems in current physics is understanding the origin of the observed asymmetry between matter and antimatter in the Universe. It has become common lore to claim that the Standard Model of particle physics cannot produce sufficient asymmetry to explain the observation. Our results suggest that this conclusion can be alleviated in the so-called cold electroweak baryogenesis scenario. On the Standard Model side, we continue the program initiated by Smit eight years ago; one derives the effective CP-violating action for the Standard Model bosons and uses the resulting effective theory in numerical simulations. We address a disagreement between two previous computations performed effectively at zero temperature, and demonstrate that it is very important to include temperature effects properly. Our conclusion is that the cold electroweak baryogenesis scenario within the Standard Model is tightly constrained, yet producing enough baryon asymmetry using just known physics still seems possible.

  5. Standard Model Particles from Split Octonions

    Directory of Open Access Journals (Sweden)

    Gogberashvili M.

    2016-01-01

    We model physical signals using elements of the algebra of split octonions over the field of real numbers. Elementary particles correspond to special elements of the algebra that nullify octonionic norms (zero divisors). It is shown that the standard model particle spectrum naturally follows from the classification of the independent primitive zero divisors of split octonions.

  6. Exploring the Standard Model of Particles

    Science.gov (United States)

    Johansson, K. E.; Watkins, P. M.

    2013-01-01

    With the recent discovery of a new particle at the CERN Large Hadron Collider (LHC), the Higgs boson could be about to be discovered. This paper provides a brief summary of the standard model of particle physics and the importance of the Higgs boson and field in that model for non-specialists. The role of Feynman diagrams in making predictions for…

  7. EMBEDding the CEFR in Academic Writing Assessment : A case study in training and standardization

    NARCIS (Netherlands)

    Haines, Kevin; Lowie, Wander; Jansma, Petra; Schmidt, Nicole

    2013-01-01

    The CEFR is increasingly being used as the framework of choice for the assessment of language proficiency at universities across Europe. However, to attain consistent assessment, familiarization and standardization are essential. In this paper we report a case study of embedding a standardization

  8. Assessing MBA Student Teamwork under the AACSB Assurance of Learning Standards

    Science.gov (United States)

    Procino, Matthew C.

    2012-01-01

    Since the 2003 release of the AACSB's Assurance of Learning standards, outcomes assessment has been a required practice for business schools wishing to receive their endorsement. While most accredited institutions had been dabbling with the measurement of student learning, the new standards raised the bar considerably. It is now necessary to…

  9. When standards become business models: Reinterpreting "failure" in the standardization paradigm

    NARCIS (Netherlands)

    Hawkins, R.; Ballon, P.

    2007-01-01

    Purpose - This paper aims to explore the question: 'What is the relationship between standards and business models?' and illustrate the conceptual linkage with reference to developments in the mobile communications industry. Design/methodology/approach - A succinct overview of literature on

  10. Assessing validity of a depression screening instrument in the absence of a gold standard.

    Science.gov (United States)

    Gelaye, Bizu; Tadesse, Mahlet G; Williams, Michelle A; Fann, Jesse R; Vander Stoep, Ann; Andrew Zhou, Xiao-Hua

    2014-07-01

    We evaluated the extent to which use of a hypothesized imperfect gold standard, the Composite International Diagnostic Interview (CIDI), biases the estimates of diagnostic accuracy of the Patient Health Questionnaire-9 (PHQ-9), and how statistical correction can be used to address this bias. The study was conducted among 926 adults; structured interviews were used to collect information about participants' current major depressive disorder with the PHQ-9 and CIDI instruments. First, we evaluated the relative psychometric properties of the PHQ-9 using the CIDI as a gold standard. Next, we used a Bayesian latent class model to correct for the bias. In comparison with the CIDI, the relative sensitivity and specificity of the PHQ-9 for detecting major depressive disorder at a cut point of 10 or more were 53.1% (95% confidence interval, 45.4%-60.8%) and 77.5% (95% confidence interval, 74.5%-80.5%), respectively. Using a Bayesian latent class model to correct for the bias arising from the use of an imperfect gold standard increased the sensitivity and specificity of the PHQ-9 to 79.8% (95% Bayesian credible interval, 64.9%-90.8%) and 79.1% (95% Bayesian credible interval, 74.7%-83.7%), respectively. Our results provide evidence that the diagnostic validity of a mental health screening instrument, where a gold standard is not available, can be assessed using appropriate statistical methods. Copyright © 2014 Elsevier Inc. All rights reserved.
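    The first step of the analysis, the relative sensitivity and specificity of a screener against an imperfect reference, reduces to simple proportions from a 2x2 table. A minimal sketch, with counts chosen only to roughly reproduce the reported relative values (they are not the study's raw data):

```python
# Hypothetical 2x2 counts of screener (PHQ-9 >= 10) against an imperfect
# reference (CIDI). The counts are illustrative, not the study's data.
tp, fn = 85, 75    # reference-positive participants: screener positive / negative
fp, tn = 172, 594  # reference-negative participants: screener positive / negative

sensitivity = tp / (tp + fn)  # P(screener positive | reference positive)
specificity = tn / (tn + fp)  # P(screener negative | reference negative)

print(f"relative sensitivity: {sensitivity:.1%}")
print(f"relative specificity: {specificity:.1%}")
```

    These are only *relative* values; because the CIDI reference is itself imperfect, the abstract's Bayesian latent class model re-estimates both quantities without treating either instrument as error-free.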

  11. Standardization of natural phenomena risk assessment methodology at the Savannah River Plant

    International Nuclear Information System (INIS)

    Huang, J.C.; Hsu, Y.S.

    1985-01-01

    Safety analyses at the Savannah River Plant (SRP) normally require consideration of the risks of incidents caused by natural events such as high-velocity straight winds, tornadic winds, and earthquakes. The probabilities for these events to occur at SRP had been studied independently by several investigators, but the results of their studies were never systematically evaluated. As part of the endeavor to standardize our environmental risk assessment methodology, these independent studies have been thoroughly reviewed and critiqued, and appropriate probability models for these natural events have been selected. The selected probability models for natural phenomena, high-velocity straight winds and tornadic winds in particular, are in agreement with those being used at other DOE sites, and have been adopted as a guide for all safety studies conducted for SRP operations and facilities. 7 references, 3 figures
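    Whatever probability model is selected for a given natural event, safety analyses typically convert an annual occurrence probability into a probability of at least one occurrence over a facility lifetime. A minimal sketch of that standard calculation (all numbers illustrative):

```python
def exceedance_probability(annual_p: float, years: int) -> float:
    """Probability of at least one occurrence in `years` years, assuming
    independent years with a constant annual occurrence probability `annual_p`."""
    return 1.0 - (1.0 - annual_p) ** years

# e.g. an event with a 1-in-10,000 annual probability over a hypothetical
# 50-year facility lifetime:
print(exceedance_probability(1e-4, 50))
```

    For small annual probabilities this is well approximated by annual_p x years, which is why such models are often summarized by an annual frequency alone.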

  12. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage over land use. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the first is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values up to a factor of 4.5 for tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than a half of the amount predicted by the standard SDC methods.
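    A stage-damage curve of the kind described maps water depth to a damage fraction per land-use class, usually by linear interpolation. A minimal sketch, with hypothetical curve values and land-use classes:

```python
# Illustrative stage-damage curves: water depth (m) -> damage fraction of the
# exposed asset value. Curve values and land-use classes are hypothetical.
DEPTHS = [0.0, 0.5, 1.0, 2.0, 4.0]
DAMAGE_FRACTION = {
    "residential": [0.00, 0.15, 0.30, 0.55, 0.85],
    "industrial":  [0.00, 0.10, 0.25, 0.45, 0.75],
}

def flood_damage(land_use: str, depth_m: float, asset_value: float) -> float:
    """Linearly interpolate the depth-damage curve and scale by asset value."""
    fracs = DAMAGE_FRACTION[land_use]
    if depth_m <= DEPTHS[0]:
        return fracs[0] * asset_value
    if depth_m >= DEPTHS[-1]:
        return fracs[-1] * asset_value
    for d0, d1, f0, f1 in zip(DEPTHS, DEPTHS[1:], fracs, fracs[1:]):
        if d0 <= depth_m <= d1:
            frac = f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)
            return frac * asset_value

# e.g. 1.5 m of water over a residential parcel worth 200,000:
print(flood_damage("residential", 1.5, 200_000))
```

    Calibration against ex-post compensation records amounts to adjusting the damage fractions so that modelled losses match observed ones, which is the correction the abstract reports a factor-of-4.5 effect for.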

  13. Using Microsoft Excel to Assess Standards: A "Techtorial". Article #2 in a 6-Part Series

    Science.gov (United States)

    Mears, Derrick

    2009-01-01

    Standards-based assessment is a term currently being used quite often in educational reform discussions. The philosophy behind this initiative is to utilize "standards" or "benchmarks" to focus instruction and assessments of student learning. The National Standards for Physical Education (NASPE, 2004) provide a framework to guide this process for…

  14. Beyond-the-Standard Model Higgs Physics using the ATLAS Experiment

    CERN Document Server

    Madsen, Alexander; The ATLAS collaboration

    2015-01-01

    The discovery of a Higgs boson with a mass of about 125 GeV has prompted the question of whether or not this particle is part of a larger and more complex Higgs sector than that envisioned in the Standard Model. In this talk, the latest results from the ATLAS experiment on Beyond-the-Standard Model (BSM) Higgs searches are outlined. Searches for additional Higgs bosons are presented and interpreted in well-motivated BSM Higgs frameworks, such as two-Higgs-doublet Models and the Minimal and Next to Minimal Supersymmetric Standard Model.

  15. Beyond-the-Standard Model Higgs Physics using the ATLAS Experiment

    CERN Document Server

    Vanadia, Marco; The ATLAS collaboration

    2015-01-01

    The discovery of a Higgs boson with a mass of about 125 GeV has prompted the question of whether or not this particle is part of a larger and more complex Higgs sector than that envisioned in the Standard Model. In this talk, the latest Run 1 results from the ATLAS Experiment on Beyond-the-Standard Model (BSM) Higgs searches are outlined. Searches for additional Higgs bosons are presented and interpreted in well-motivated BSM Higgs frameworks, including the two-Higgs-doublet Models and the Minimal and Next to Minimal Supersymmetric Standard Model.

  16. Beyond-the-Standard Model Higgs Physics using the ATLAS Experiment

    CERN Document Server

    Scutti, Federico; The ATLAS collaboration

    2015-01-01

    The discovery of a Higgs boson with a mass of about 125 GeV has prompted the question of whether or not this particle is part of a larger and more complex Higgs sector than that envisioned in the Standard Model. In this talk, the current results from the ATLAS experiment on Beyond-the-Standard Model (BSM) Higgs searches are summarized. Searches for additional Higgs bosons are presented and interpreted in well-motivated BSM Higgs frameworks, such as two-Higgs-doublet Models and the Minimal and Next to Minimal Supersymmetric Standard Model.

  17. Beyond-the-Standard Model Higgs Physics using the ATLAS Experiment

    CERN Document Server

    Nagata, Kazuki; The ATLAS collaboration

    2014-01-01

    The discovery of a Higgs boson with a mass of about 125 GeV has prompted the question of whether or not this particle is part of a larger and more complex Higgs sector than that envisioned in the Standard Model. In this talk, the current results from the ATLAS experiment on Beyond-the-Standard Model (BSM) Higgs searches are outlined. Searches for additional Higgs bosons are presented and interpreted in well-motivated BSM Higgs frameworks, such as two-Higgs-doublet Models and the Minimal and Next to Minimal Supersymmetric Standard Model.

  18. Beyond-the-Standard Model Higgs physics using the ATLAS experiment

    CERN Document Server

    Ernis, G; The ATLAS collaboration

    2014-01-01

    The discovery of a Higgs boson with a mass of about 125 GeV has prompted the question of whether or not this particle is part of a larger and more complex Higgs sector than that envisioned in the Standard Model. In this talk, the current results from the ATLAS experiment on Beyond-the-Standard Model (BSM) Higgs searches are outlined. Searches for additional Higgs bosons are presented and interpreted in well-motivated BSM Higgs frameworks, such as two-Higgs-doublet Models and the Minimal and Next to Minimal Supersymmetric Standard Model.

  19. The standard model on non-commutative space-time

    Energy Technology Data Exchange (ETDEWEB)

    Calmet, X.; Jurco, B.; Schupp, P.; Wohlgenannt, M. [Sektion Physik, Universitaet Muenchen (Germany); Wess, J. [Sektion Physik, Universitaet Muenchen (Germany); Max-Planck-Institut fuer Physik, Muenchen (Germany)

    2002-03-01

    We consider the standard model on a non-commutative space and expand the action in the non-commutativity parameter θ^{μν}. No new particles are introduced; the structure group is SU(3) x SU(2) x U(1). We derive the leading order action. At zeroth order the action coincides with the ordinary standard model. At leading order in θ^{μν} we find new vertices which are absent in the standard model on commutative space-time. The most striking features are couplings between quarks, gluons and electroweak bosons and many new vertices in the charged and neutral currents. We find that parity is violated in non-commutative QCD. The Higgs mechanism can be applied. QED is not deformed in the minimal version of the NCSM to the order considered. (orig.)
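    The θ-expansion referred to in the abstract is conventionally organized through the Moyal-Weyl star product, whose leading terms are:

```latex
(f \star g)(x) = f(x)\,g(x)
  + \frac{i}{2}\,\theta^{\mu\nu}\,\partial_{\mu}f(x)\,\partial_{\nu}g(x)
  + \mathcal{O}(\theta^{2}),
\qquad
[x^{\mu} \stackrel{\star}{,}\, x^{\nu}] = i\,\theta^{\mu\nu}.
```

    At zeroth order in θ^{μν} the star product reduces to the ordinary pointwise product, which is why the action coincides with the ordinary standard model at that order.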

  20. Genetic Programming and Standardization in Water Temperature Modelling

    Directory of Open Access Journals (Sweden)

    Maritza Arganis

    2009-01-01

    An application of Genetic Programming (an evolutionary computational tool), with and without standardization of the data, is presented with the aim of modeling the behavior of the water temperature in a river in terms of meteorological variables that are easily measured, to explore their explanatory power and to emphasize the utility of standardizing variables in order to reduce the effect of those with large variance. Recorded data on the water temperature of the Ebro River, Spain, are used as the analysis case, showing a performance improvement in the developed model when the data are standardized. This improvement is reflected in a reduction of the mean square error. Finally, the models obtained in this document were applied to estimate the water temperature in 2004, in order to provide evidence of their applicability for forecasting purposes.
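    The standardization step credited with reducing the mean square error is ordinary z-scoring of each input variable before model fitting. A minimal sketch; the meteorological values below are hypothetical:

```python
import statistics

def standardize(values):
    """Z-score each value (subtract the mean, divide by the population standard
    deviation) so variables with large variance do not dominate the fitted model."""
    mu = statistics.fmean(values)
    sigma = statistics.pstdev(values)
    return [(v - mu) / sigma for v in values]

def mean_square_error(observed, predicted):
    """The error measure the abstract uses to compare models."""
    return statistics.fmean((o - p) ** 2 for o, p in zip(observed, predicted))

# Hypothetical daily mean air temperatures (deg C) driving a water-temperature model:
air_temp = [12.0, 18.0, 25.0, 9.0, 21.0]
z = standardize(air_temp)
print(z)  # zero mean, unit variance
```

    The genetic-programming search itself then evolves candidate expressions over the standardized inputs; only the preprocessing step is sketched here.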

  1. Assessment and improvement of condensation model in RELAP5/MOD3

    Energy Technology Data Exchange (ETDEWEB)

    Rho, Hui Cheon; Choi, Kee Yong; Park, Hyeon Sik; Kim, Sang Jae [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of); Lee, Sang Il [Korea Power Engineering Co., Inc., Seoul (Korea, Republic of)

    1997-07-15

    The objective of this research is to remove the uncertainty of the condensation model through the assessment and improvement of the various heat transfer correlations used in the RELAP5/MOD3 code. The condensation model of the standard RELAP5/MOD3 code is systematically arranged and analyzed. A condensation heat transfer database is constructed from the previous experimental data on various condensation phenomena. Based on the constructed database, the condensation models in the code are assessed and improved. An experiment on the reflux condensation in a tube of steam generator in the presence of noncondensable gases is planned to acquire the experimental data.

  2. The Beyond the standard model working group: Summary report

    Energy Technology Data Exchange (ETDEWEB)

    G. Azuelos et al.

    2004-03-18

    dimensions. There, we considered: constraints on Kaluza Klein (KK) excitations of the SM gauge bosons from existing data (part XIII) and the corresponding projected LHC reach (part XIV); techniques for discovering and studying the radion field which is generic in most extra-dimensional scenarios (part XV); the impact of mixing between the radion and the Higgs sector, a fully generic possibility in extra-dimensional models (part XVI); production rates and signatures of universal extra dimensions at hadron colliders (part XVII); black hole production at hadron colliders, which would lead to truly spectacular events (part XVIII). The above contributions represent a tremendous amount of work on the part of the individuals involved and represent the state of the art for many of the currently most important phenomenological research avenues. Of course, much more remains to be done. For example, one should continue to work on assessing the extent to which the discovery reach will be extended if one goes beyond the LHC to the super-high-luminosity LHC (SLHC) or to a very large hadron collider (VLHC) with √s ≈ 40 TeV. Overall, we believe our work shows that the LHC and future hadronic colliders will play a pivotal role in the discovery and study of any kind of new physics beyond the Standard Model. They provide tremendous potential for incredibly exciting new discoveries.

  3. Newly graduated doctors' competence in managing cardiopulmonary arrests assessed using a standardized Advanced Life Support (ALS) assessment

    DEFF Research Database (Denmark)

    Jensen, Morten Lind; Hesselfeldt, Rasmus; Rasmussen, Maria Birkvad

    2008-01-01

    Aim of the study: Several studies using a variety of assessment approaches have demonstrated that young doctors possess insufficient resuscitation competence. The aims of this study were to assess newly graduated doctors’ resuscitation competence against an internationally recognised standard and...

  4. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  5. The Universal Thermal Climate Index UTCI compared to ergonomics standards for assessing the thermal environment.

    Science.gov (United States)

    Bröde, Peter; Błazejczyk, Krzysztof; Fiala, Dusan; Havenith, George; Holmér, Ingvar; Jendritzky, Gerd; Kuklane, Kalev; Kampmann, Bernhard

    2013-01-01

    The growing need for valid assessment procedures of the outdoor thermal environment in the fields of public weather services, public health systems, urban planning, tourism & recreation and climate impact research raised the idea to develop the Universal Thermal Climate Index UTCI based on the most recent scientific progress both in thermo-physiology and in heat exchange theory. Following extensive validation of accessible models of human thermoregulation, the advanced multi-node 'Fiala' model was selected to form the basis of UTCI. This model was coupled with an adaptive clothing model which considers clothing habits by the general urban population and behavioral changes in clothing insulation related to actual environmental temperature. UTCI was developed conceptually as an equivalent temperature. Thus, for any combination of air temperature, wind, radiation, and humidity, UTCI is defined as the air temperature in the reference condition which would elicit the same dynamic response of the physiological model. This review analyses the sensitivity of UTCI to humidity and radiation in the heat and to wind in the cold and compares the results with observational studies and internationally standardized assessment procedures. The capabilities, restrictions and potential future extensions of UTCI are discussed.

  6. Beyond the Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future, at LHC and elsewhere. Supersymmetry, grand unification, extra dimensions and a glimpse of string theory will be presented.

  7. 25 CFR 36.12 - Standard III-Program needs assessment.

    Science.gov (United States)

    2010-04-01

    Excerpt from 25 CFR 36.12 (Bureau of Indian Affairs, Department of the Interior; Education Minimum Academic Standards and Program Evaluation), Standard III — Program needs assessment. This assessment shall include at least the following: (a) A clear statement of student ... including (1) perceptions of the parents, tribes, educators, and the students with regard to the relevance and ...

  8. Beyond the Standard Model (2/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  9. Beyond the Standard Model (5/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  10. Beyond the Standard Model (3/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  11. Beyond the Standard Model (4/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  12. Beyond the Standard Model (1/5)

    CERN Multimedia

    CERN. Geneva

    2000-01-01

    After a critical discussion of the questions left unanswered by the Standard Model, I will review the main attempts to construct new theories. In particular, I will discuss grand unification, supersymmetry, technicolour, and theories with extra dimensions.

  13. ATLAS Z Excess in Minimal Supersymmetric Standard Model

    International Nuclear Information System (INIS)

    Lu, Xiaochuan; Terada, Takahiro

    2015-06-01

    Recently the ATLAS collaboration reported a 3 sigma excess in the search for events containing a dilepton pair from a Z boson and large missing transverse energy. Although the excess is not yet sufficiently significant, it is quite tempting to explain it with a well-motivated model beyond the standard model. In this paper we study whether the minimal supersymmetric standard model (MSSM) can account for this excess. In particular, we focus on the MSSM spectrum where the sfermions are heavier than the gauginos and Higgsinos. We show that the excess can be explained by a reasonable MSSM mass spectrum.

  14. Search for Higgs bosons beyond the Standard Model

    Directory of Open Access Journals (Sweden)

    Mankel Rainer

    2015-01-01

    While the existence of a Higgs boson with a mass near 125 GeV has been clearly established, the detailed structure of the entire Higgs sector is yet unclear. Beyond the standard model interpretation, various scenarios for extended Higgs sectors are being considered. Such options include the minimal and next-to-minimal supersymmetric extensions (MSSM and NMSSM) of the standard model, more generic Two-Higgs Doublet models (2HDM), as well as truly exotic Higgs bosons decaying e.g. into totally invisible final states. This article presents recent results from the CMS experiment.

  15. Standard model baryogenesis

    CERN Document Server

    Gavela, M.B.; Orloff, J.; Pene, O

    1994-01-01

    Simply on CP arguments, we argue against a Standard Model explanation of baryogenesis via the charge transport mechanism. A CP-asymmetry is found in the reflection coefficients of quarks hitting the electroweak phase boundary created during a first order phase transition. The problem is analyzed both in an academic zero temperature case and in the realistic finite temperature one. At finite temperature, a crucial role is played by the damping rate of quasi-quarks in a hot plasma, which induces loss of spatial coherence and suppresses reflection on the boundary even at tree-level. The resulting baryon asymmetry is many orders of magnitude below what observation requires. We comment as well on related works.

  16. Background and derivation of ANS-5.4 standard fission product release model. Technical report

    International Nuclear Information System (INIS)

    1982-01-01

    ANS Working Group 5.4 was established in 1974 to examine fission product releases from UO2 fuel. The scope of ANS-5.4 was narrowly defined to include the following: (1) review available experimental data on release of volatile fission products from UO2 and mixed-oxide fuel; (2) survey existing analytical models currently being applied to light-water reactors; and (3) develop a standard analytical model for volatile fission product release to the fuel rod void space, with emphasis on obtaining a model for radioactive fission product releases to be used in assessing radiological consequences of postulated accidents.

  17. Loop Corrections to Standard Model fields in inflation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics,60 Garden Street, Cambridge, MA 02138 (United States); Department of Physics, The University of Texas at Dallas,800 W Campbell Rd, Richardson, TX 75080 (United States); Wang, Yi [Department of Physics, The Hong Kong University of Science and Technology,Clear Water Bay, Kowloon, Hong Kong (China); Xianyu, Zhong-Zhi [Center of Mathematical Sciences and Applications, Harvard University,20 Garden Street, Cambridge, MA 02138 (United States)

    2016-08-08

    We calculate 1-loop corrections to the Schwinger-Keldysh propagators of Standard-Model-like fields of spin-0, 1/2, and 1, with all renormalizable interactions during inflation. We pay special attention to the late-time divergences of loop corrections, and show that the divergences can be resummed into finite results in the late-time limit using the dynamical renormalization group method. This is our first step toward studying both the Standard Model and new physics in the primordial universe.

  18. Beyond-the-Standard Model Higgs Physics using the ATLAS Experiment

    CERN Document Server

    Vanadia, Marco; The ATLAS collaboration

    2015-01-01

    The discovery of a Higgs boson with a mass of about 125 GeV/$\\rm{c^2}$ has prompted the question of whether or not this particle is part of a larger and more complex Higgs sector than that envisioned in the Standard Model. In this report, the latest Run 1 results from the ATLAS Experiment on Beyond-the-Standard-Model (BSM) Higgs searches are outlined. Searches for additional Higgs bosons are presented and interpreted in well-motivated BSM Higgs frameworks, including two-Higgs-doublet models and the Minimal and Next-to-Minimal Supersymmetric Standard Models.

  19. Assessing ballast treatment standards for effect on rate of establishment using a stochastic model of the green crab

    Directory of Open Access Journals (Sweden)

    Cynthia Cooper

    2012-03-01

    Full Text Available This paper describes a stochastic model used to characterize the probability/risk of NIS establishment from ships' ballast water discharges. Establishment is defined as the existence of a sufficient number of individuals of a species to provide for a sustained population of the organism. The inherent variability in population dynamics of organisms in their native or established environments is generally difficult to quantify. Much qualitative information is known about organism life cycles and biotic and abiotic environmental pressures on the population, but generally little quantitative data exist to develop a mechanistic model of populations in such complex environments. Moreover, there is little quantitative data to characterize the stochastic fluctuations of population size over time even without accounting for systematic responses to biotic and abiotic pressures. This research applies an approach using life-stage density and fecundity measures reported in research to determine a stochastic model of an organism's population dynamics. The model is illustrated with data from research studies on the green crab that span a range of habitats of the established organism and were collected over some years to represent a range of time-varying biotic and abiotic conditions that are expected to exist in many receiving environments. This model is applied to introductions of NIS at the IMO D-2 and the U.S. ballast water discharge standard levels designated as Phase Two in the United States Coast Guard's Notice of Proposed Rulemaking. Under a representative range of ballast volumes discharged at U.S. ports, the average rate of establishment of green crabs for ballast waters treated to the IMO D-2 concentration standard (less than 10 organisms/m3) is predicted to be reduced to about a third of the average rate from untreated ballast water discharge. The longevity of populations from the untreated ballast water discharges is expected to be reduced by about 90% by
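
    The treated-versus-untreated comparison described above can be sketched with a toy Monte Carlo simulation. All numeric parameters below (survival probability, establishment threshold, discharge volume) are invented for illustration; the paper's actual model is calibrated from green crab life-stage density and fecundity data.

```python
import random

def establishment_prob(concentration, volume_m3, p_survive=0.02,
                       threshold=50, trials=500, seed=7):
    """Monte Carlo estimate of the chance that a single ballast discharge
    releases enough surviving organisms to found a sustained population.
    Parameter values are illustrative, not taken from the paper."""
    rng = random.Random(seed)
    n_released = int(concentration * volume_m3)
    established = 0
    for _ in range(trials):
        # each released organism independently survives with probability p_survive
        survivors = sum(1 for _ in range(n_released) if rng.random() < p_survive)
        if survivors >= threshold:  # crude proxy for a self-sustaining population
            established += 1
    return established / trials

untreated = establishment_prob(concentration=100, volume_m3=50)  # untreated ballast
treated = establishment_prob(concentration=10, volume_m3=50)     # IMO D-2: <10 viable organisms/m^3
```

    With any such monotone survival model, lowering the discharge concentration to the D-2 level reduces the estimated establishment probability, which is the qualitative effect the paper quantifies.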

  20. Constraints on Nc in extensions of the standard model

    International Nuclear Information System (INIS)

    Shrock, Robert

    2007-01-01

    We consider a class of theories involving an extension of the standard model gauge group to an a priori arbitrary number of colors, N_c, and derive constraints on N_c. One motivation for this is the string theory landscape. For two natural classes of embeddings of this N_c-extended standard model in a supersymmetric grand unified theory, we show that requiring unbroken electromagnetic gauge invariance, asymptotic freedom of color, and three generations of quarks and leptons forces one to choose N_c = 3. Similarly, we show that for a theory combining the N_c-extended standard model with a one-family SU(2)_TC technicolor theory, only the value N_c = 3 is allowed.

  1. Towards a standard design model for quad-rotors: A review of current models, their accuracy and a novel simplified model

    Science.gov (United States)

    Amezquita-Brooks, Luis; Liceaga-Castro, Eduardo; Gonzalez-Sanchez, Mario; Garcia-Salazar, Octavio; Martinez-Vazquez, Daniel

    2017-11-01

    Applications based on quad-rotor vehicles (QRVs) are becoming increasingly widespread. Many of these applications require accurate mathematical representations for control design, simulation and estimation. However, there is no consensus on a standardized model for these purposes. In this article a review of the most common elements included in QRV models reported in the literature is presented. This survey shows that some elements are recurrent for typical non-aerobatic QRV applications; in particular, for control design and high-performance simulation. By synthesising the common features of the reviewed models a standard generic model (SGM) is proposed. The SGM is cast as a typical state-space model without memory-less transformations, a structure which is useful for simulation and controller design. The survey also shows that many QRV applications use simplified representations, which may be considered simplifications of the SGM here proposed. In order to assess the effectiveness of the simplified models, a comprehensive comparison based on digital simulations is presented. With this comparison, it is possible to determine the accuracy of each model under particular operating ranges. Such information is useful for the selection of a model according to a particular application. In addition to the models found in the literature, in this article a novel simplified model is derived. The main characteristics of this model are that its inner dynamics are linear, it has low complexity and it has a high level of accuracy in all the studied operating ranges, a characteristic found only in more complex representations. To complement the article, the main elements of the SGM are evaluated with the aid of experimental data and the computational complexity of all surveyed models is briefly analysed. Finally, the article presents a discussion on how the structural characteristics of the models are useful to suggest particular QRV control structures.
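
    To give a concrete flavor of the simplified-model idea, the sketch below integrates only the altitude channel of a hovering quad-rotor, z'' = thrust/mass - g, with forward Euler. It is a deliberately minimal, drag-free toy in the spirit of the simplified representations the survey compares; it is not the SGM itself, and the mass and thrust values are arbitrary.

```python
def simulate_altitude(thrust_n, mass_kg=1.0, g=9.81, dt=0.01, steps=100):
    """Euler-integrate the altitude channel z'' = thrust/mass - g.
    A minimal, drag-free sketch; parameter values are illustrative."""
    z, vz = 0.0, 0.0
    for _ in range(steps):
        az = thrust_n / mass_kg - g  # net vertical acceleration
        vz += az * dt
        z += vz * dt
    return z, vz

z_hover, vz_hover = simulate_altitude(thrust_n=9.81)  # thrust = m*g: vehicle holds altitude
z_climb, vz_climb = simulate_altitude(thrust_n=12.0)  # excess thrust: vehicle climbs
```

    Even this single-channel toy illustrates the trade-off the article studies: a linear model is trivially cheap to simulate but is only accurate in a narrow operating range around hover.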

  2. Standard Model Higgs boson searches with the ATLAS detector

    Indian Academy of Sciences (India)

    The experimental results on the search of the Standard Model Higgs boson with 1 to 2 fb-1 of proton–proton collision data at √s = 7 TeV recorded by the ATLAS detector are presented and discussed. No significant excess of events is found with respect to the expectations from Standard Model processes, and the production ...

  3. Standard Model at the LHC 2017

    CERN Document Server

    2017-01-01

    The SM@LHC 2017 conference will be held May 2-5, 2017 at Nikhef, Amsterdam. The meeting aims to bring together experimentalists and theorists to discuss the phenomenology, observational results and theoretical tools for Standard Model physics at the LHC.

  4. Conformal Extensions of the Standard Model with Veltman Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Mojaza, Matin; Sannino, Francesco

    2014-01-01

    Using the renormalisation group framework we classify different extensions of the standard model according to their degree of naturalness. A new relevant class of perturbative models involving elementary scalars is the one in which the theory simultaneously satisfies the Veltman conditions and is conformal at the classical level. We term these extensions perturbative natural conformal (PNC) theories. We show that PNC models are very constrained and thus highly predictive. Among the several PNC examples that we exhibit, we discover a remarkably simple PNC extension of the standard model in which...

  5. The Wada Test: contributions to standardization of the stimulus for language and memory assessment

    Directory of Open Access Journals (Sweden)

    Mäder Maria Joana

    2004-01-01

    Full Text Available The Wada Test (WT) is part of the presurgical evaluation for refractory epilepsy. The WT is not standardized and the protocols differ in important ways, including the type of stimulus material presented for memory testing, the timing of presentations and the methods of assessment. The aim of this study was to contribute to establishing parameters for a WT for the Brazilian population by investigating the performance of 100 normal subjects, without medication. Two parallel models were used based on the Montreal Procedure adapted from Gail Risse's (MEG-MN, EUA) protocol. The proportions of correct responses of normal subjects submitted to the two parallel WT models were investigated and the two models were compared. The results showed that the two models are similar but significant differences among the stimulus types were observed. The results suggest that the stimulus type may influence the results of the WT and should be considered when constructing models and comparing different protocols.

  6. Flavour alignment in physics beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Braeuninger, Carolin Barbara

    2012-11-21

    There are numerous reasons to think that the Standard Model of physics is not the ultimate theory of nature on very small scales. However, attempts to construct theories that go beyond the Standard Model generically lead to high rates of flavour changing neutral processes that are in conflict with experiment: Quarks are the fundamental constituents of protons and neutrons. Together with electrons they form the visible matter of the universe. They come in three generations or "flavours". In interactions, quarks of different generations can mix, i.e. a quark of one flavour can transform into a quark of another flavour. In the Standard Model, at first order in perturbation theory, such processes occur only via the exchange of a charged particle. Flavour changing neutral processes can only arise in processes involving loops of charged particles. This is due to the fact that all couplings of two quarks to a neutral particle are diagonal in the basis of the mass eigenstates of the quarks. There is thus no mixing of quarks of different flavour at first order. Since the loop processes are suppressed by a loop factor, the Standard Model predicts very low rates for neutral processes that change the flavour of quarks. So far, this is in agreement with experiment. In extensions of the Standard Model, new couplings to the quarks are usually introduced. In general there is no reason why the new coupling matrices should be diagonal in the mass basis of the quarks. These models therefore predict high rates for processes that mix quarks of different flavour. Extensions of the Standard Model must therefore have a non-trivial flavour structure. A possibility to avoid flavour violation is to assume that the new couplings are aligned with the mass matrices of the quarks, i.e. diagonal in the same basis. This alignment could be due to a flavour symmetry. In this thesis, two extensions of the Standard Model with alignment are studied. The first is a simple

  8. Standard model Higgs boson-inflaton and dark matter

    International Nuclear Information System (INIS)

    Clark, T. E.; Liu Boyang; Love, S. T.; Veldhuis, T. ter

    2009-01-01

    The standard model Higgs boson can serve as the inflaton field of slow-roll inflationary models provided it exhibits a large nonminimal coupling with the gravitational scalar curvature. The Higgs boson self-interactions and its couplings with a standard model singlet scalar serving as the source of dark matter are then subject to cosmological constraints. These bounds, which can be more stringent than those arising from vacuum stability and perturbative triviality alone, still allow values for the Higgs boson mass which should be accessible at the LHC. As the Higgs boson coupling to the dark matter strengthens, lower values of the Higgs boson mass consistent with the cosmological data are allowed.

  9. STRUCTURE OF MODELS FOR AGGREGATE ASSESSMENT OF FINANCIAL RISK COMMERCIAL BANKS

    OpenAIRE

    G. Kryshtal

    2016-01-01

    Conceptual approaches use a structural model for the assessment of financial risk in commercial banks, namely risk measurement in combination: a comparison of the bank's capital, calculated based on the standardized approach of Basel II, the advanced approaches of Basel II, and the structural model. The application of the model in an economic crisis situation, such as for the capital adequacy of commercial banks, is analysed. The paper deals with a unified approach to the choice of risk measure and its parameters to measur...

  10. The Measurement of Quality of Semantic Standards: the Application of a Quality Model on the SETU standard for eGovernment

    NARCIS (Netherlands)

    Folmer, Erwin; van Bekkum, Michael; Oude Luttighuis, Paul; van Hillegersberg, Jos

    2011-01-01

    eGovernment interoperability should be dealt with using high-quality standards. A quality model for standards is presented based on knowledge from the software engineering domain. In the tradition of action research the model is used on the SETU standard, a standard that is mandatory in the public

  11. Beyond the standard model at Tevatron

    International Nuclear Information System (INIS)

    Pagliarone, C.

    2000-01-01

    Tevatron experiments performed extensive searches for physics beyond the Standard Model. No positive results have been found so far, showing that the data are consistent with the SM expectations. CDF and D0 continue the analysis of Run I data, placing limits on new physics, including Supersymmetry, large space-time dimensions and leptoquark models. With the Run II upgrades, providing a higher acceptance and higher luminosity, it will be possible to make important progress in the search for new phenomena as well as in setting limits on a larger variety of theoretical models.

  12. Review of Current Standard Model Results in ATLAS

    CERN Document Server

    Brandt, Gerhard; The ATLAS collaboration

    2018-01-01

    This talk highlights results selected from the Standard Model research programme of the ATLAS Collaboration at the Large Hadron Collider. Results using data from $p-p$ collisions at $\\sqrt{s}=7,8$~TeV in LHC Run-1 as well as results using data at $\\sqrt{s}=13$~TeV in LHC Run-2 are covered. The status of cross section measurements from soft QCD processes, jet production and photon production is presented. The presentation extends to vector boson production with associated jets. Precision measurements of the production of $W$ and $Z$ bosons, including a first measurement of the mass of the $W$ boson, $m_W$, are discussed. The programme to measure electroweak processes with di-boson and tri-boson final states is outlined. All presented measurements are compatible with Standard Model descriptions and allow it to be further constrained. In addition, they probe new physics which would manifest itself through extra gauge couplings or Standard Model gauge couplings deviating from their predicted values.

  13. Stress-testing the Standard Model at the LHC

    CERN Document Server

    2016-01-01

    With the high-energy run of the LHC now underway, and clear manifestations of beyond-Standard-Model physics not yet seen in data from the previous run, the search for new physics at the LHC may be a quest for small deviations with big consequences. If clear signals are present, precise predictions and measurements will again be crucial for extracting the maximum information from the data, as in the case of the Higgs boson. Precision will therefore remain a key theme for particle physics research in the coming years. The conference will provide a forum for experimentalists and theorists to identify the challenges and refine the tools for high-precision tests of the Standard Model and searches for signals of new physics at Run II of the LHC. Topics to be discussed include: pinning down Standard Model corrections to key LHC processes; combining fixed-order QCD calculations with all-order resummations and parton showers; new developments in jet physics concerning jet substructure, associated jets and boosted je...

  14. To Which Degree Does Sector Specific Standardization Make Life Cycle Assessments Comparable?—The Case of Global Warming Potential of Smartphones

    Directory of Open Access Journals (Sweden)

    Anders S. G. Andrae

    2014-11-01

    Full Text Available Here attributional life cycle assessments (LCAs) for the same smartphone model, carried out by two different organizations (Orange, OGE and Huawei, HuW), are presented and the effect of the different modeling approaches is analyzed. A difference of around 32% (29.6 kg and 39.2 kg) between the CO2e baseline scores is found using the same study object and the same sector-specific LCA standard, but different metrics, emission intensities, and LCA software programs. The CO2e difference is reduced to 12% (29.9 kg and 33.5 kg) when OGE uses HuW metrics for use-phase power consumption and total mass, and HuW uses OGE metrics for gold mass and silicon die area. Further, a probability test confirms that the present baseline climate change results, for one specific study object modeled with two largely different and independent LCA modeling approaches, are comparable if both use the European Telecommunications Standards Institute (ETSI) LCA standard. The general conclusion is that the ETSI LCA standard strongly facilitates comparable climate change results for technically comparable smartphone models. Moreover, thanks to the reporting requirements of the ETSI LCA standard, a clear understanding of the differences between the LCA modeling approaches is obtained. The research also discusses the magnitude of the CO2e reduction potential in the life cycle of smartphones.
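
    The roughly 32% and 12% gaps quoted above can be reproduced directly from the reported kilogram scores, assuming the gap is expressed relative to the smaller of the two scores:

```python
def relative_difference(a, b):
    """Gap between two footprint scores, relative to the smaller one."""
    lo, hi = sorted((a, b))
    return (hi - lo) / lo

baseline = relative_difference(29.6, 39.2)    # OGE vs HuW baseline CO2e scores (kg)
harmonized = relative_difference(29.9, 33.5)  # after swapping the four key metrics
print(round(baseline * 100), round(harmonized * 100))  # -> 32 12
```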

  15. Modeling and Stability Assessment of Single-Phase Grid Synchronization Techniques

    DEFF Research Database (Denmark)

    Golestan, Saeed; Guerrero, Josep M.; Vasquez, Juan

    2018-01-01

    (GSTs) is of vital importance. This task is most often based on obtaining a linear time-invariant (LTI) model for the GST and applying standard stability tests to it. Another option is modeling and dynamics/stability assessment of GSTs in the linear time-periodic (LTP) framework, which has received very little attention. In this letter, the procedure of deriving the LTP model for single-phase GSTs is first demonstrated. The accuracy of the LTP model in predicting the GST dynamic behavior and stability is then evaluated and compared with that of the LTI one. Two well-known single-phase GSTs, i...
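
    The LTI route mentioned in the abstract amounts to linearizing the synchronization loop and checking pole locations. As a generic illustration only (not the specific GSTs analyzed in the letter), a textbook phase-locked loop with a PI loop filter has the characteristic polynomial s^2 + kp*s + ki, and the standard stability test is that all poles lie in the left half-plane; the gains below are invented:

```python
import cmath

def pll_closed_loop_poles(kp, ki):
    """Poles of a generic linearized (LTI) PLL model with a PI loop filter:
    roots of the characteristic polynomial s^2 + kp*s + ki."""
    disc = cmath.sqrt(kp * kp - 4.0 * ki)
    return (-kp + disc) / 2.0, (-kp - disc) / 2.0

p1, p2 = pll_closed_loop_poles(kp=92.0, ki=4232.0)  # illustrative PI gains
stable = p1.real < 0 and p2.real < 0  # LTI test: all poles in the left half-plane
```

    The letter's point is that for time-periodic loops this LTI verdict can be misleading, which is what the LTP framework addresses.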

  16. Safety assessment standards for modern plants in the UK

    International Nuclear Information System (INIS)

    Harbison, S.A.; Hannaford, J.

    1993-01-01

    The NII has revised its safety assessment principles (SAPs). This paper discusses the revised SAPs and their links with international standards. It considers the licensing of foreign designs of plant, a matter under active consideration in the UK, and discusses how the SAPs and the licensing process cater for that possibility. (author)

  17. Looking for physics beyond the standard model

    International Nuclear Information System (INIS)

    Binetruy, P.

    2002-01-01

    Motivations for new physics beyond the Standard Model are presented. The most successful and best motivated option, supersymmetry, is described in some detail, and the associated searches performed at LEP are reviewed. These include searches for additional Higgs bosons and for supersymmetric partners of the standard particles. These searches constrain the mass of the lightest supersymmetric particle which could be responsible for the dark matter of the universe. (authors)

  18. A randomized trial of standardized nursing patient assessment using wireless devices.

    Science.gov (United States)

    Dykes, Patricia C; Carroll, Diane L; Benoit, Angela; Coakley, Amanda; Chang, Frank; Empoliti, Joanne; Gallagher, Joan; Lasala, Cynthia; O'Malley, Rosemary; Rath, Greg; Silva, Judy; Li, Qi

    2007-10-11

    A complete and accurate patient assessment database is essential for effective communication, problem identification, planning and evaluation of patient status. When employed consistently for point-of-care documentation, information systems are associated with completeness and quality of documentation. The purpose of this paper is to report on the findings of a randomized, cross-over study conducted to evaluate the adequacy of a standard patient assessment module to support problem identification, care planning and tracking of nursing sensitive patient outcomes. The feasibility of wireless devices to support patient assessment data collection at the point-of-care was evaluated using wireless PDAs and tablet PCs. Seventy-nine (79) nurses from two patient care units at Massachusetts General Hospital (Boston, MA) were recruited into the study and randomized to complete patient assessment using wireless or paper devices. At the end of six weeks, nurses who were randomized to the paper assessment module were assigned to a device and those who used a device were assigned to paper for an additional six weeks. Impact was evaluated with regard to data capture, workflow implications and nurse satisfaction. Findings suggest that a standard patient assessment set promotes patient-sensitive and quality data capture, which is augmented by the use of wireless devices.

  19. Institutional model for supporting standardization

    International Nuclear Information System (INIS)

    Sanford, M.O.; Jackson, K.J.

    1993-01-01

    Restoring the nuclear option for utilities requires standardized designs. This premise is widely accepted by all parties involved in ALWR development activities. Achieving and maintaining standardization, however, demands new perspectives on the roles and responsibilities for the various commercial organizations involved in nuclear power. Some efforts are needed to define a workable model for a long-term support structure that will allow the benefits of standardization to be realized. The Nuclear Power Oversight Committee (NPOC) has developed a strategic plan that lays out the steps necessary to enable the nuclear industry to be in a position to order a new nuclear power plant by the mid 1990's. One of the key elements of the plan is the "industry commitment to standardization through design certification, combined license, first-of-a-kind engineering, construction, operation, and maintenance of nuclear power plants." This commitment is a result of the recognition by utilities of the substantial advantages of standardization. Among these are economic benefits, licensing benefits from being treated as one of a family, sharing risks across a broader ownership group, sharing operating experiences, enhancing public safety, and a more coherent market force. Utilities controlled the construction of the past generation of nuclear units in a largely autonomous fashion, procuring equipment and designs from a vendor, engineering services from an architect/engineer, and construction from a construction management firm. This, in addition to forcing the utility to assume virtually all of the risks associated with the project, typically resulted in highly customized designs based on preferences of the individual utility. However, the benefits of standardization can be realized only through cooperative choices and decision making by the utilities and through working as partners with reactor vendors, architect/engineers, and construction firms.

  20. Using Explanatory Item Response Models to Evaluate Complex Scientific Tasks Designed for the Next Generation Science Standards

    Science.gov (United States)

    Chiu, Tina

    This dissertation includes three studies that analyze a new set of assessment tasks developed by the Learning Progressions in Middle School Science (LPS) Project. These assessment tasks were designed to measure science content knowledge on the structure of matter domain and scientific argumentation, while following the goals from the Next Generation Science Standards (NGSS). The three studies focus on the evidence available for the success of this design and its implementation, generally labelled as "validity" evidence. I use explanatory item response models (EIRMs) as the overarching framework to investigate these assessment tasks. These models can be useful when gathering validity evidence for assessments as they can help explain student learning and group differences. In the first study, I explore the dimensionality of the LPS assessment by comparing the fit of unidimensional, between-item multidimensional, and Rasch testlet models to see which is most appropriate for this data. By applying multidimensional item response models, multiple relationships can be investigated, and in turn, allow for a more substantive look into the assessment tasks. The second study focuses on person predictors through latent regression and differential item functioning (DIF) models. Latent regression models show the influence of certain person characteristics on item responses, while DIF models test whether one group is differentially affected by specific assessment items, after conditioning on latent ability. Finally, the last study applies the linear logistic test model (LLTM) to investigate whether item features can help explain differences in item difficulties.
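
    The two model families the dissertation builds on can be sketched compactly. The Rasch model gives the probability of a correct response from a person ability and an item difficulty, and the LLTM decomposes each item's difficulty into a weighted sum of item-feature effects. The feature codes and effect sizes below are invented for illustration, not taken from the LPS data:

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability of a correct response for a person with
    ability theta on an item with difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def lltm_difficulty(features, effects):
    """LLTM: item difficulty as a weighted sum of item-feature effects
    (feature codes q_ik and effect sizes eta_k are illustrative)."""
    return sum(q * eta for q, eta in zip(features, effects))

b_item = lltm_difficulty(features=[1, 0, 1], effects=[0.4, -0.2, 0.7])  # 0.4 + 0.7 = 1.1
p_correct = rasch_prob(theta=0.5, b=b_item)  # ability below difficulty, so p < 0.5
```

    In an explanatory analysis like the third study, the interest is in whether such feature effects account for the estimated item difficulties rather than in the difficulties themselves.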

  1. The Assessment Of The Level Of Management Control Standards Completion In Treasury Sector

    Directory of Open Access Journals (Sweden)

    Kulińska Ewa

    2015-06-01

    Full Text Available This paper concerns the rules of the functioning of management control standards used in the Treasury Control Office. Its purpose is to present the results of research conducted in the years 2013–2014 in Polish Treasury Control Offices. The results were obtained by applying the author's model for assessing management control implementation. The research was conducted separately for management personnel and for the remaining office employees, and significant discrepancies between these two groups of respondents were found. Based on the results, the areas of deviation from the expected level of management control standards were established, as were the areas requiring control mechanisms: increasing the supervision of the board of directors over managers, providing permanent and efficient supervision of subordinate employees by managers, making the purposes and tasks assigned to the Treasury Control Office for a given year more precise, familiarizing employees with them, and carrying out trainings and a series of other corrective measures.

  2. Theorists reject challenge to standard model

    CERN Multimedia

    Adam, D

    2001-01-01

    Particle physicists are questioning results that appear to violate the Standard Model. There are concerns that there is not sufficient statistical significance and also charges that the comparison is being made with the 'most convenient' theoretical value for the muon's magnetic moment (1 page).

  3. Towards a Standardized e-Assessment System: Motivations, Challenges and First Findings

    Directory of Open Access Journals (Sweden)

    Denis Helic

    2009-10-01

    Full Text Available “Global Learning” with shared learning contents, resources, activities and goals is one of the contributions of globalization. With the capability to use new Information and Communication Technologies (ICT), it is easier to build technology-based learning systems that enable learners to share learning resources and possibilities. As a result, many Learning Management Systems (LMS) have been developed on diverse platforms and with diverse approaches. Consequently, sharing learning resources and components has become a major challenge. E-assessment, as a primary activity of any LMS, faces the same challenges and problems. In order to meet this challenge, people in the field of technology-enhanced learning have recommended that LMS should conform to specific standards. This paper discusses this challenge and the consequences and limitations of standards in modern learning settings. Moreover, it presents a service-oriented framework for assessment which aims to make e-assessment systems flexible and to initiate the term “Global Learning Assessment” with the possibility of sharing e-assessment system components.

  4. Simple standard model extension by heavy charged scalar

    Science.gov (United States)

    Boos, E.; Volobuev, I.

    2018-05-01

    We consider a Standard Model (SM) extension by a heavy charged scalar gauged only under the U(1)_Y weak hypercharge gauge group. Such an extension, being gauge invariant with respect to the SM gauge group, is a simple special case of the well-known Zee model. Since the interactions of the charged scalar with the Standard Model fermions turn out to be significantly suppressed compared to the Standard Model interactions, the charged scalar provides an example of a long-lived charged particle that is interesting to search for at the LHC. We present the pair and single production cross sections of the charged scalar at different colliders and the possible decay widths for various boson masses. It is shown that the current ATLAS and CMS searches at 8 and 13 TeV collision energy lead to bounds on the scalar boson mass of about 300-320 GeV. The limits are expected to be much stronger at higher collision energies and, assuming 15 ab⁻¹ integrated luminosity, reach about 2.7 TeV at a future 27 TeV LHC, thus covering the most interesting mass region.

  5. Assessment of NASA's Physiographic and Meteorological Datasets as Input to HSPF and SWAT Hydrological Models

    Science.gov (United States)

    Alacron, Vladimir J.; Nigro, Joseph D.; McAnally, William H.; OHara, Charles G.; Engman, Edwin Ted; Toll, David

    2011-01-01

    This paper documents the use of simulated Moderate Resolution Imaging Spectroradiometer land use/land cover (MODIS-LULC), NASA-LIS generated precipitation and evapo-transpiration (ET), and Shuttle Radar Topography Mission (SRTM) datasets (in conjunction with standard land use, topographical and meteorological datasets) as input to hydrological models routinely used by the watershed hydrology modeling community. The study is focused on coastal watersheds in the Mississippi Gulf Coast, although one of the test cases focuses on an inland watershed located in the northeastern part of the State of Mississippi, USA. The decision support tools (DSTs) into which the NASA datasets were assimilated were the Soil and Water Assessment Tool (SWAT) and the Hydrological Simulation Program FORTRAN (HSPF). These DSTs are endorsed by several US government agencies (EPA, FEMA, USGS) for water resources management strategies. These models use physiographic and meteorological data extensively. Precipitation gages and USGS gage stations in the region were used to calibrate several HSPF and SWAT model applications. Land use and topographical datasets were swapped to assess model output sensitivities. NASA-LIS meteorological data were introduced into the calibrated model applications to simulate watershed hydrology for a time period in which no weather data were available (1997-2006). The performance of the NASA datasets in the context of hydrological modeling was assessed through comparison of measured and model-simulated hydrographs. Overall, the NASA datasets were as useful as standard land use, topographical, and meteorological datasets. Moreover, the NASA datasets made possible analyses that the standard datasets could not, e.g., the introduction of land use dynamics into hydrological simulations.
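The hydrograph comparison described above is usually quantified with a goodness-of-fit statistic. The abstract does not name one, but the Nash-Sutcliffe efficiency is a common choice in watershed modeling; the sketch below (with hypothetical streamflow values) shows how it scores a simulated hydrograph against observations:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 = perfect fit, 0 = no better than
    predicting the mean of the observations, < 0 = worse than the mean."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Hypothetical daily streamflow (m^3/s): a simulation that tracks the
# observed flood peak closely scores near 1.
observed = [10.0, 12.0, 30.0, 55.0, 40.0, 22.0, 15.0]
simulated = [11.0, 13.0, 28.0, 50.0, 42.0, 24.0, 14.0]
print(round(nash_sutcliffe(observed, simulated), 3))  # → 0.976
```

An efficiency near 1 for the NASA-driven runs would support the paper's conclusion that the NASA forcings perform on par with the standard datasets.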

  6. Modern elementary particle physics explaining and extending the standard model

    CERN Document Server

    Kane, Gordon

    2017-01-01

    This book is written for students and scientists wanting to learn about the Standard Model of particle physics. Only an introductory course knowledge about quantum theory is needed. The text provides a pedagogical description of the theory, and incorporates the recent Higgs boson and top quark discoveries. With its clear and engaging style, this new edition retains its essential simplicity. Long and detailed calculations are replaced by simple approximate ones. It includes introductions to accelerators, colliders, and detectors, and several main experimental tests of the Standard Model are explained. Descriptions of some well-motivated extensions of the Standard Model prepare the reader for new developments. It emphasizes the concepts of gauge theories and Higgs physics, electroweak unification and symmetry breaking, and how force strengths vary with energy, providing a solid foundation for those working in the field, and for those who simply want to learn about the Standard Model.

  7. Variation in assessment and standard setting practices across UK undergraduate medicine and the need for a benchmark.

    Science.gov (United States)

    MacDougall, Margaret

    2015-10-31

    The principal aim of this study is to provide an account of variation in UK undergraduate medical assessment styles and corresponding standard setting approaches with a view to highlighting the importance of a UK national licensing exam in recognizing a common standard. Using a secure online survey system, response data were collected during the period 13 - 30 January 2014 from selected specialists in medical education assessment, who served as representatives for their respective medical schools. Assessment styles and corresponding choices of standard setting methods vary markedly across UK medical schools. While there is considerable consensus on the application of compensatory approaches, individual schools display their own nuances through use of hybrid assessment and standard setting styles, uptake of less popular standard setting techniques and divided views on norm referencing. The extent of variation in assessment and standard setting practices across UK medical schools validates the concern that there is a lack of evidence that UK medical students achieve a common standard on graduation. A national licensing exam is therefore a viable option for benchmarking the performance of all UK undergraduate medical students.
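The compensatory approaches on which the surveyed schools broadly agree contrast with conjunctive standards, where every component must be passed separately. The distinction can be sketched as follows (component names, marks, and cut scores are hypothetical, not taken from the survey):

```python
def passes_compensatory(scores, cut_total):
    """Compensatory standard: a strong component can offset a weak one;
    only the overall total is compared against the cut score."""
    return sum(scores) >= cut_total

def passes_conjunctive(scores, cuts):
    """Conjunctive standard: every component must clear its own cut."""
    return all(s >= c for s, c in zip(scores, cuts))

scores = [72, 48]  # e.g. written paper, clinical exam (hypothetical marks)
print(passes_compensatory(scores, cut_total=115))  # → True (72 + 48 = 120)
print(passes_conjunctive(scores, cuts=[50, 50]))   # → False (48 < 50)
```

The same candidate can pass under one regime and fail under the other, which is why a common benchmark matters when schools mix these approaches.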

  8. 77 FR 23250 - HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations

    Science.gov (United States)

    2012-04-18

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...

  9. 76 FR 25355 - HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations

    Science.gov (United States)

    2011-05-04

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...

  10. 78 FR 29134 - HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations

    Science.gov (United States)

    2013-05-17

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...

  11. Physics at a 100 TeV pp Collider: Standard Model Processes

    Energy Technology Data Exchange (ETDEWEB)

    Mangano, M. L. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Zanderighi, G. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Aguilar Saavedra, J. A. [Univ. of Granada (Spain); Alekhin, S. [Univ. of Hamburg (Germany). Inst. for Theoretical Physics; Inst. for High Energy Physics (IHEP), Moscow (Russian Federation); Badger, S. [Univ. of Edinburgh, Scotland (United Kingdom); Bauer, C. W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Becher, T. [Univ. Bern (Switzerland); Bertone, V. [Univ. of Oxford (United Kingdom); Bonvini, M. [Univ. of Oxford (United Kingdom); Boselli, S. [Univ. of Pavia (Italy); Bothmann, E. [Gottingen Univ. (Germany); Boughezal, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Cacciari, M. [Univ. Paris Diderot (France); Sorbonne Univ., Paris (France); Carloni Calame, C M. [Istituto Nazionale di Fisica Nucleare (INFN), Pavia (Italy); Caola, F. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Campbell, J. M. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Carrazza, S. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Chiesa, M. [Istituto Nazionale di Fisica Nucleare (INFN), Pavia (Italy); Cieri, L. [Univ. of Zurich (Switzerland); Cimaglia, F. [Univ. degli Studi di Milano (Italy); Febres Cordero, F. [Physikalisches Inst., Freiburg (Germany); Ferrarese, P. [Gottingen Univ. (Germany); D' Enterria, D. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Ferrera, G. [Univ. degli Studi di Milano (Italy); Garcia i Tormo, X. [Univ. Bern (Switzerland); Garzelli, M. V. [Univ. of Hamburg (Germany); Germann, E. [Monash Univ., Melbourne, VIC (Australia); Hirschi, V. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Han, T. [Univ. of Pittsburgh, PA (United States); Ita, H. [Physikalisches Inst., Freiburg (Germany); Jager, B. [Univ. 
of Tubingen (Germany); Kallweit, S. [Johannes Gutenberg Univ., Mainz (Germany); Karlberg, A. [Univ. of Oxford (United Kingdom); Kuttimalai, S. [Durham Univ. (United Kingdom); Krauss, F. [Durham Univ. (United Kingdom); Larkoski, A. J. [Harvard Univ., Cambridge, MA (United States); Lindert, J. [Univ. of Zurich (Switzerland); Luisoni, G. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Maierhofer, P. [Univ. of Freiburg (Germany); Mattelaer, O. [Durham Univ. (United Kingdom); Martinez, H. [Univ. of Pavia (Italy); Moch, S. [Univ. of Hamburg (Germany); Montagna, G. [Univ. of Pavia (Italy); Moretti, M. [Univ. of Ferrara (Italy); Nason, P. [Univ. of Milano (Italy); Nicrosini, O. [Istituto Nazionale di Fisica Nucleare (INFN), Pavia (Italy); Oleari, C. [Univ. of Milano (Italy); Pagani, D. [Univ. Catholique de Louvain (Belgium); Papaefstathiou, A. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Petriello, F. [Northwestern Univ., Evanston, IL (United States); Piccinini, F. [Istituto Nazionale di Fisica Nucleare (INFN), Pavia (Italy); Pierini, M. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Pierog, T. [Karlsruhe Inst. of Technology (KIT) (Germany); Pozzorini, S. [Univ. of Zurich (Switzerland); Re, E. [National Centre for Scientific Research (CNRS), Annecy-le-Vieux (France). Lab. of Annecy-le-Vieux for Theoretical Physics (LAPTh); Robens, T. [Technische Universitat Dresden (Germany); Rojo, J. [Univ. of Oxford (United Kingdom); Ruiz, R. [Durham Univ. (United Kingdom); Sakurai, K. [Durham Univ. (United Kingdom); Salam, G. P. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Salfelder, L. [Univ. of Tubingen (Germany); Schonherr, M. [Univ. of Ferrara (Italy); Schulze, M. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Schumann, S. [Univ. Gottingen (Germany); Selvaggi, M. [Univ. Catholique de Louvain (Belgium); Shivaji, A. 
[Istituto Nazionale di Fisica Nucleare (INFN), Pavia (Italy); Siodmok, A. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Polish Academy of Sciences (PAS), Krakow (Poland); Skands, P. [Monash Univ., Melbourne, VIC (Australia); Torrielli, P. [Univ. of Torino (Italy); Tramontano, F. [Univ. of Napoli (Italy); Tsinikos, I. [Univ. Catholique de Louvain (Belgium); Tweedie, B. [Univ. of Pittsburgh, PA (United States); Vicini, A. [Univ. degli Studi di Milano (Italy); Westhoff, S. [Heidelberg Univ. (Germany); Zaro, M. [Sorbonne Univ., Paris (France); Zeppenfeld, D. [Forschungszentrum Karlsruhe (Germany)

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  12. Environmental assessment of solid waste landfilling technologies by means of LCA-modeling

    DEFF Research Database (Denmark)

    Manfredi, Simone; Christensen, Thomas Højlund

    2009-01-01

    By using life cycle assessment (LCA) modeling, this paper compares the environmental performance of six landfilling technologies (open dump, conventional landfill with flares, conventional landfill with energy recovery, standard bioreactor landfill, flushing bioreactor landfill and semi-aerobic landfill) and assesses the influence of the active operations practiced on these performances. The environmental assessments have been performed by means of the LCA-based tool EASEWASTE, whereby the functional unit utilized for the LCA is “landfilling of 1 ton of wet household waste in a 10 m deep landfill” … that it is crucially important to ensure the highest collection efficiency of landfill gas and leachate, since a poor capture compromises the overall environmental performance. Once gas and leachate are collected and treated, the potential impacts in the standard environmental categories and on spoiled groundwater …

  13. Non-generic couplings in supersymmetric standard models

    Directory of Open Access Journals (Sweden)

    Evgeny I. Buchbinder

    2015-09-01

    Full Text Available We study two phases of a heterotic standard model, obtained from a Calabi–Yau compactification of the E8×E8 heterotic string, in the context of the associated four-dimensional effective theories. In the first phase we have a standard model gauge group, an MSSM spectrum, four additional U(1) symmetries and singlet fields. In the second phase, obtained from the first by continuing along the singlet directions, three of the additional U(1) symmetries are spontaneously broken and the remaining one is a B–L symmetry. In this second phase, dimension five operators inducing proton decay are consistent with all symmetries and, as such, they are expected to be present. We show that, contrary to this expectation, these operators are forbidden due to the additional U(1) symmetries present in the first phase of the model. We emphasise that such “unexpected” absences of operators, due to symmetry enhancement at specific loci in the moduli space, can be phenomenologically relevant and, in the present case, protect the model from fast proton decay.
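The mechanism described — operators consistent with the surviving B–L symmetry yet forbidden by the extra U(1) symmetries of the first phase — amounts to checking that an operator's fields carry vanishing total charge under every symmetry. A minimal sketch, with hypothetical charge assignments that are not taken from the paper:

```python
def operator_allowed(operator_fields, charges):
    """An operator is invariant under a U(1) symmetry only if the charges
    of its fields sum to zero; it must pass this test for every U(1)."""
    return all(
        sum(q[field] for field in operator_fields) == 0
        for q in charges.values()
    )

# Hypothetical charges (in units of 1/3, kept as integers to avoid
# floating-point error) for the quark doublet Q and lepton doublet L:
charges = {
    "B-L":        {"Q": 1, "L": -3},  # the B-L of the second phase
    "U(1)_extra": {"Q": 0, "L": 3},   # an extra U(1) of the first phase
}

# A dimension-five proton-decay operator of type QQQL is B-L invariant
# (1+1+1-3 = 0) but carries nonzero charge under the extra U(1):
print(operator_allowed(["Q", "Q", "Q", "L"], charges))  # → False
```

Dropping the extra U(1) from the dictionary makes the same operator pass, which is why the symmetry enhancement at special loci is what protects the model.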

  14. Assessing item fit for unidimensional item response theory models using residuals from estimated item response functions.

    Science.gov (United States)

    Haberman, Shelby J; Sinharay, Sandip; Chon, Kyong Hee

    2013-07-01

    Residual analysis (e.g. Hambleton & Swaminathan, Item response theory: principles and applications, Kluwer Academic, Boston, 1985; Hambleton, Swaminathan, & Rogers, Fundamentals of item response theory, Sage, Newbury Park, 1991) is a popular method to assess fit of item response theory (IRT) models. We suggest a form of residual analysis that may be applied to assess item fit for unidimensional IRT models. The residual analysis consists of a comparison of the maximum-likelihood estimate of the item characteristic curve with an alternative ratio estimate of the item characteristic curve. The large sample distribution of the residual is proved to be standardized normal when the IRT model fits the data. We compare the performance of our suggested residual to the standardized residual of Hambleton et al. (Fundamentals of item response theory, Sage, Newbury Park, 1991) in a detailed simulation study. We then calculate our suggested residuals using data from an operational test. The residuals appear to be useful in assessing the item fit for unidimensional IRT models.
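The paper's residual compares a maximum-likelihood estimate of the item characteristic curve with a ratio estimate. As a simplified stand-in, the sketch below compares the observed proportion correct in an ability group with the proportion predicted by a two-parameter logistic ICC, standardized by the model-implied binomial error (function names and parameter values are illustrative, not the authors'):

```python
import math
import random

def icc_2pl(theta, a, b):
    """Two-parameter logistic item characteristic curve."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def standardized_residual(responses, thetas, a, b):
    """Observed proportion correct minus the model-predicted proportion,
    divided by the binomial standard error implied by the model.
    When the model fits, this is approximately standard normal."""
    n = len(responses)
    p_obs = sum(responses) / n
    p_model = sum(icc_2pl(t, a, b) for t in thetas) / n
    se = math.sqrt(p_model * (1.0 - p_model) / n)
    return (p_obs - p_model) / se

# Data simulated from the model itself should yield a small residual:
random.seed(0)
a, b = 1.2, 0.0
thetas = [random.gauss(0.0, 1.0) for _ in range(2000)]
responses = [1 if random.random() < icc_2pl(t, a, b) else 0 for t in thetas]
print(round(standardized_residual(responses, thetas, a, b), 2))
```

Large absolute residuals across many ability groups would flag the item as misfitting, which is how such statistics are used on operational test data.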

  15. Lattice Gauge Theories Within and Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Gelzer, Zechariah John [Iowa U.

    2017-01-01

    The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving B mesons. Consequently, B-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of B mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of B mesons that are mediated by both charged currents (B → π ℓ …

  16. Standard model beyond the TeV

    International Nuclear Information System (INIS)

    Aurenche, P.

    1987-01-01

    The phenomenology of the standard model in hadronic reactions in the 10 TeV range is described. Since the predictions of the model concerning hadronic cross sections are based on the parton model, we first discuss the behaviour of the structure functions at the low values of x (x > 10⁻⁴) attained at these energies, and we show that the leading-logarithm evolution equations allow us to calculate them. The production of W and Z gauge bosons and of gauge boson pairs is reviewed. Higgs boson production is discussed in detail according to its mass value [fr]

  17. Future Directions in Assessment: Influences of Standards and Implications for Language Learning

    Science.gov (United States)

    Cox, Troy L.; Malone, Margaret E.; Winke, Paula

    2018-01-01

    As "Foreign Language Annals" concludes its 50th anniversary, it is fitting to review the past and peer into the future of standards-based education and assessment. Standards are a common yardstick used by educators and researchers as a powerful framework for conceptualizing teaching and measuring learner success. The impact of standards…

  18. Precision calculations in supersymmetric extensions of the Standard Model

    International Nuclear Information System (INIS)

    Slavich, P.

    2013-01-01

    This dissertation is organized as follows: in the next chapter I will summarize the structure of the supersymmetric extensions of the Standard Model (SM), namely the MSSM (Minimal Supersymmetric Standard Model) and the NMSSM (Next-to-Minimal Supersymmetric Standard Model); I will provide a brief overview of different patterns of SUSY (supersymmetry) breaking and discuss some issues in the renormalization of the input parameters that are common to all calculations of higher-order corrections in SUSY models. In chapter 3 I will review and describe computations on the production of MSSM Higgs bosons in gluon fusion. In chapter 4 I will review results on the radiative corrections to the Higgs boson masses in the NMSSM. In chapter 5 I will review the calculation of BR(B → X_s γ) in the MSSM with Minimal Flavor Violation (MFV). Finally, in chapter 6 I will briefly summarize the outlook of my future research. (author)

  19. Psychosocial Assessment as a Standard of Care in Pediatric Cancer

    NARCIS (Netherlands)

    Kazak, Anne E.; Abrams, Annah N.; Banks, Jaime; Christofferson, Jennifer; DiDonato, Stephen; Grootenhuis, Martha A.; Kabour, Marianne; Madan-Swain, Avi; Patel, Sunita K.; Zadeh, Sima; Kupst, Mary Jo

    2015-01-01

    This paper presents the evidence for a standard of care for psychosocial assessment in pediatric cancer. An interdisciplinary group of investigators utilized EBSCO, PubMed, PsycINFO, Ovid, and Google Scholar search databases, focusing on five areas: youth/family psychosocial adjustment, family

  20. Precision tests of the Standard Model

    International Nuclear Information System (INIS)

    Ol'shevskij, A.G.

    1996-01-01

    The present status of the precision measurements of electroweak observables is discussed with the special emphasis on the results obtained recently. All together these measurements provide the basis for the stringent test of the Standard Model and determination of the SM parameters. 22 refs., 23 figs., 11 tabs

  1. Is the standard model really tested?

    International Nuclear Information System (INIS)

    Takasugi, E.

    1989-01-01

    It is discussed how the standard model is really tested. Among the various tests, I concentrate on CP violation phenomena in the K and B meson systems. In particular, the recent hope of overcoming the theoretical uncertainty in the evaluation of CP violation in the K meson system is discussed. (author)

  2. Non-perturbative effective interactions in the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Arbuzov, Boris A. [Moscow Lomonosov State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2014-07-01

    This monograph is devoted to nonperturbative dynamics in the Standard Model (SM), the basic theory of all fundamental interactions in nature except gravity. The Standard Model is divided into two parts: quantum chromodynamics (QCD) and the electroweak theory (EWT) are well-defined renormalizable theories in which perturbation theory is valid. However, for an adequate description of the real physics, nonperturbative effects are inevitable. This book describes how these nonperturbative effects may be obtained in the framework of spontaneous generation of effective interactions. A well-known example of such an effective interaction is provided by the famous Nambu-Jona-Lasinio effective interaction. The spontaneous generation of this interaction in the framework of QCD is also described and applied as a method for other effective interactions in QCD and EWT. The method is based on N.N. Bogolyubov's conception of compensation equations. As a result, we describe the principal features of the Standard Model, e.g. the Higgs sector, and significant nonperturbative effects, including recent results obtained at the LHC and TEVATRON.

  3. Model-driven Privacy Assessment in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Neureiter, Christian [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-09

    In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications and therefore privacy impacts have to be taken into account. For an effective smart grid, well integrated solutions are crucial and for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  4. Standards for psychological assessment of nuclear facility personnel. Technical report

    International Nuclear Information System (INIS)

    Frank, F.D.; Lindley, B.S.; Cohen, R.A.

    1981-07-01

    The subject of this study was the development of standards for the assessment of emotional instability in applicants for nuclear facility positions. The investigation covered all positions associated with a nuclear facility. Conclusions reached in this investigation focused on the ingredients of an integrated selection system, including the use of personality tests, situational simulations, and the clinical interview; the need for professional standards to ensure quality control; the need for a uniform selection system, as organizations vary considerably in terms of the instruments presently used; and the need for an on-the-job behavioral observation program.

  5. Supersymmetric standard model from the heterotic string (II)

    International Nuclear Information System (INIS)

    Buchmueller, W.; Hamaguchi, K.; Tokyo Univ.; Lebedev, O.; Ratz, M.

    2006-06-01

    We describe in detail a Z6 orbifold compactification of the heterotic E8×E8 string which leads to the (supersymmetric) standard model gauge group and matter content. The quarks and leptons appear as three 16-plets of SO(10), two of which are localized at fixed points with local SO(10) symmetry. The model has supersymmetric vacua without exotics at low energies and is consistent with gauge coupling unification. Supersymmetry can be broken via gaugino condensation in the hidden sector. The model has a large vacuum degeneracy. Certain vacua with approximate B-L symmetry have attractive phenomenological features. The top quark Yukawa coupling arises from gauge interactions and is of the order of the gauge couplings. The other Yukawa couplings are suppressed by powers of standard model singlet fields, similarly to the Froggatt-Nielsen mechanism. (Orig.)

  6. Implications of Higgs searches on the four-generation standard model.

    Science.gov (United States)

    Kuflik, Eric; Nir, Yosef; Volansky, Tomer

    2013-03-01

    Within the four-generation standard model, the Higgs couplings to gluons and to photons deviate in a significant way from the predictions of the three-generation standard model. As a consequence, large departures in several Higgs production and decay channels are expected. Recent Higgs search results, presented by ATLAS, CMS, and CDF, hint at the existence of a Higgs boson with a mass around 125 GeV. Using these results and assuming such a Higgs boson, we derive exclusion limits on the four-generation standard model. For m(H)=125 GeV, the model is excluded above 99.95% confidence level. For 124.5 GeV≤m(H)≤127.5 GeV, an exclusion limit above 99% confidence level is found.
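For orientation, the quoted confidence levels can be related to Gaussian significances. Under a simple Gaussian approximation (an illustration only, not the paper's actual statistical procedure), a 99.95% two-sided confidence level corresponds to roughly a 3.5σ deviation:

```python
import math

def two_sided_cl(z_sigma):
    """Two-sided confidence level covered by a z-sigma Gaussian deviation:
    P(|X| < z) for a standard normal X, computed via the error function."""
    return math.erf(z_sigma / math.sqrt(2.0))

# A 3.5-sigma deviation covers about 99.95% of a Gaussian:
print(round(two_sided_cl(3.5) * 100, 2))  # → 99.95
```

By the same rule of thumb, the 99% limit quoted for the wider mass window corresponds to a bit under 2.6σ.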

  7. MRI assessment of myelination: an age standardization

    Energy Technology Data Exchange (ETDEWEB)

    Staudt, M. (Kinderklinik Dritter Orden, Passau (Germany)); Schropp, C. (Kinderklinik Dritter Orden, Passau (Germany)); Staudt, F. (Kinderklinik Dritter Orden, Passau (Germany)); Obletter, N. (Radiologische Praxis, Klinikum Ingolstadt (Germany)); Bise, K. (Neuropathologisches Inst., Muenchen Univ. (Germany)); Breit, A. (MR Tomographie, Klinikum Passau (Germany)); Weinmann, H.M. (Kinderklinik Schwabing, Muenchen (Germany))

    1994-04-01

    777 cerebral MRI examinations of children aged 3 days to 14 years were staged for myelination to establish an age standardization. Staging was performed using a system proposed in a previous paper, separately ranking 10 different regions of the brain. Interpretation of the results led to the identification of four clinical diagnoses that are frequently associated with delays in myelination: West syndrome, cerebral palsy, developmental retardation, and congenital anomalies. In addition, it was found that assessment of myelination in children with head injuries was not practical, as alterations in MRI signal can simulate earlier stages of myelination. Age limits were therefore calculated from the case material after excluding all children with these conditions. When simplifications of the definition of the stages are applied, these age limits for the various stages of myelination of each of the 10 regions of the brain make the staging system applicable for routine assessment of myelination. (orig.)
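The abstract does not state how the age limits were derived from the case material; one plausible construction, shown purely as an illustration with hypothetical data, is a nearest-rank percentile cutoff over the ages at which each stage was observed:

```python
import math

def age_limit(ages_at_stage, percentile=95):
    """Age by which the given percentage of examined children had reached
    a myelination stage -- a simple nearest-rank percentile (illustrative;
    not necessarily the method used in the paper)."""
    ordered = sorted(ages_at_stage)
    rank = max(1, math.ceil(percentile / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical ages (months) at which one brain region reached a stage:
ages = [3, 4, 5, 5, 6, 7, 8, 9, 10, 12]
print(age_limit(ages))  # → 12
```

A child older than the limit who has not reached the stage would then be flagged as delayed for that region.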

  8. Motivational Effects of Standardized Language Assessment on Chinese Young Learners

    Science.gov (United States)

    Zhao, Chuqiao

    2016-01-01

    This review paper examines how standardized language assessment affects Chinese young learners' motivation for second-language learning. By presenting the historical and contemporary contexts of the testing system in China, this paper seeks to demonstrate the interrelationship among cultural, social, familial, and individual factors, which…

  9. Biosphere Modeling and Analyses in Support of Total System Performance Assessment

    International Nuclear Information System (INIS)

    Tappen, J. J.; Wasiolek, M. A.; Wu, D. W.; Schmitt, J. F.; Smith, A. J.

    2002-01-01

    The Nuclear Waste Policy Act of 1982 established the obligations of and the relationship between the U.S. Environmental Protection Agency (EPA), the U.S. Nuclear Regulatory Commission (NRC), and the U.S. Department of Energy (DOE) for the management and disposal of high-level radioactive wastes. In 1985, the EPA promulgated regulations that included a definition of performance assessment that did not consider potential dose to a member of the general public. This definition would influence the scope of activities conducted by DOE in support of the total system performance assessment program until 1995. The release of a National Academy of Sciences (NAS) report on the technical basis for a Yucca Mountain-specific standard provided the impetus for the DOE to initiate activities that would consider the attributes of the biosphere, i.e. that portion of the earth where living things, including man, exist and interact with the environment around them. The evolution of NRC and EPA Yucca Mountain-specific regulations, originally proposed in 1999, was critical to the development and integration of biosphere modeling and analyses into the total system performance assessment program. These proposed regulations initially differed in the conceptual representation of the receptor of interest to be considered in assessing performance. The publication in 2001 of final regulations, in which the NRC adopted the standard, will permit the continued improvement and refinement of biosphere modeling and analyses activities in support of assessment activities.

  10. Biosphere Modeling and Analyses in Support of Total System Performance Assessment

    International Nuclear Information System (INIS)

    Jeff Tappen; M.A. Wasiolek; D.W. Wu; J.F. Schmitt

    2001-01-01

    The Nuclear Waste Policy Act of 1982 established the obligations of and the relationship between the U.S. Environmental Protection Agency (EPA), the U.S. Nuclear Regulatory Commission (NRC), and the U.S. Department of Energy (DOE) for the management and disposal of high-level radioactive wastes. In 1985, the EPA promulgated regulations that included a definition of performance assessment that did not consider potential dose to a member of the general public. This definition would influence the scope of activities conducted by DOE in support of the total system performance assessment program until 1995. The release of a National Academy of Sciences (NAS) report on the technical basis for a Yucca Mountain-specific standard provided the impetus for the DOE to initiate activities that would consider the attributes of the biosphere, i.e. that portion of the earth where living things, including man, exist and interact with the environment around them. The evolution of NRC and EPA Yucca Mountain-specific regulations, originally proposed in 1999, was critical to the development and integration of biosphere modeling and analyses into the total system performance assessment program. These proposed regulations initially differed in the conceptual representation of the receptor of interest to be considered in assessing performance. The publication in 2001 of final regulations, in which the NRC adopted the standard, will permit the continued improvement and refinement of biosphere modeling and analyses activities in support of assessment activities.

  11. e+e- interactions at very high energy: searching beyond the standard model

    International Nuclear Information System (INIS)

    Dorfan, J.

    1983-04-01

    These lectures discuss e+e- interactions at very high energies, with particular emphasis on searching beyond the standard model, which we take to be SU(3)_color × SU(2) × U(1). The highest e+e- collision energy exploited to date is at PETRA, where data have been taken at 38 GeV. We will consider energies above this to be the very high energy frontier. The lectures will begin with a review of the collision energies which will be available in the upgraded machines of today and the machines planned for tomorrow. Without going into great detail, we will define the essential elements of the standard model. We will remind ourselves that some of these essential elements have not yet been verified and that part of the task of searching beyond the standard model will involve experiments aimed at this verification. For if we find the standard model lacking, then clearly we are forced to find an alternative. So we will investigate how the higher energy e+e- collisions can be used to search for the top quark and the neutral Higgs scalar, provide true verification of the non-Abelian nature of QCD, etc. Having done this we will look at tests of models involving simple extensions of the standard model. Models considered are those without a top quark, those with charged Higgs scalars, with multiple and/or composite vector bosons, with additional generations, and possible alternative explanations for the PETRA three-jet events which don't require gluon bremsstrahlung. From the simple extensions of the standard model we will move to more radical alternatives, alternatives which have arisen from unhappiness with the gauge hierarchy problem of the standard model. Technicolor, supersymmetry and composite models will be discussed. In the final section we will summarize what the future holds in terms of the search beyond the standard model.

  12. Assessment of turbulence models for pulsatile flow inside a heart pump.

    Science.gov (United States)

    Al-Azawy, Mohammed G; Turan, A; Revell, A

    2016-02-01

    Computational fluid dynamics (CFD) is applied to study the unsteady flow inside a pulsatile pump left ventricular assist device, in order to assess the sensitivity to a range of commonly used turbulence models. Levels of strain and wall shear stress are directly relevant to the evaluation of risk from haemolysis and thrombosis, and thus understanding the sensitivity to these turbulence models is important in the assessment of uncertainty in CFD predictions. The study focuses on a positive displacement or pulsatile pump, and the CFD model includes valves and a moving pusher plate. An unstructured dynamic layering method was employed to capture this cyclic motion, and valves were simulated in their fully open position to mimic the natural scenario, with in/outflow triggered at control planes away from the valves. Six turbulence models have been used, comprising three relevant to the low Reynolds number nature of this flow and three more intended to investigate different transport effects. In the first group, we consider the shear stress transport (SST) k-ω model in both its standard and transition-sensitive forms, and the 'laminar' model in which no turbulence model is used. In the second group, we compare the one-equation Spalart-Allmaras model, the standard two-equation k-ε model and the full Reynolds stress model (RSM). Following evaluation of spatial and temporal resolution requirements, results are compared with available experimental data. The model was operated at a systolic duration of 40% of the pumping cycle and a pumping rate of 86 BPM (beats per minute). Contrary to reasonable preconception, the 'transition' model, calibrated to incorporate additional physical modelling specifically for these flow conditions, was not noticeably superior to the standard form of the model. Indeed, observations of turbulent viscosity ratio reveal that the transition model initiates a premature increase of turbulence in this flow, when compared with

  13. PhD study of reliability and validity: One step closer to a standardized music therapy assessment model

    DEFF Research Database (Denmark)

    Jacobsen, Stine Lindahl

    The paper will present a PhD study concerning the reliability and validity of the music therapy assessment model “Assessment of Parenting Competences” (APC) in the area of families with emotionally neglected children. This study had a multiple strategy design with a philosophical base of critical realism...... and pragmatism. The fixed design for this study was a between- and within-groups design testing the APC's reliability and validity. The two different groups were parents with neglected children and parents with non-neglected children. The flexible design had a multiple case study strategy specifically...

  14. Standard Model mass spectrum in inflationary universe

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics,60 Garden Street, Cambridge, MA 02138 (United States); Wang, Yi [Department of Physics, The Hong Kong University of Science and Technology,Clear Water Bay, Kowloon, Hong Kong (China); Xianyu, Zhong-Zhi [Center of Mathematical Sciences and Applications, Harvard University,20 Garden Street, Cambridge, MA 02138 (United States)

    2017-04-11

    We work out the Standard Model (SM) mass spectrum during inflation with quantum corrections, and explore its observable consequences in the squeezed limit of non-Gaussianity. Both non-Higgs and Higgs inflation models are studied in detail. We also illustrate how some inflationary loop diagrams can be computed neatly by Wick-rotating the inflation background to Euclidean signature and by dimensional regularization.

  15. Cross-cultural validity of standardized motor development screening and assessment tools: a systematic review.

    Science.gov (United States)

    Mendonça, Bianca; Sargent, Barbara; Fetters, Linda

    2016-12-01

    To investigate whether standardized motor development screening and assessment tools that are used to evaluate motor abilities of children aged 0 to 2 years are valid in cultures other than those in which the normative sample was established. This was a systematic review in which six databases were searched. Studies were selected based on inclusion/exclusion criteria and appraised for evidence level and quality. Study variables were extracted. Twenty-three studies representing six motor development screening and assessment tools in 16 cultural contexts met the inclusion criteria: Alberta Infant Motor Scale (n=7), Ages and Stages Questionnaire, 3rd edition (n=2), Bayley Scales of Infant and Toddler Development, 3rd edition (n=8), Denver Developmental Screening Test, 2nd edition (n=4), Harris Infant Neuromotor Test (n=1), and Peabody Developmental Motor Scales, 2nd edition (n=1). Thirteen studies found significant differences between the cultural context and normative sample. Two studies established reliability and/or validity of standardized motor development assessments in high-risk infants from different cultural contexts. Five studies established new population norms. Eight studies described the cross-cultural adaptation of a standardized motor development assessment. Standardized motor development assessments have limited validity in cultures other than that in which the normative sample was established. Their use can result in under- or over-referral for services. © 2016 Mac Keith Press.

  16. Anomalous Abelian symmetry in the standard model

    International Nuclear Information System (INIS)

    Ramond, P.

    1995-01-01

    The observed hierarchy of quark and lepton masses can be parametrized by nonrenormalizable operators with dimensions determined by an anomalous Abelian family symmetry, a gauge extension to the minimal supersymmetric standard model. Such an Abelian symmetry is generic to compactified superstring theories, with its anomalies compensated by the Green-Schwarz mechanism. If we assume these two symmetries to be the same, we find the electroweak mixing angle to be sin²θ_W = 3/8 at the string scale, just by setting the ratio of the product of down quark to charged lepton masses equal to one at the string scale. This assumes no GUT structure. The generality of the result suggests a superstring origin for the standard model. We generalize our analysis to massive neutrinos, and mixings in the lepton sector.
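
    The quoted value follows from the GUT-normalized hypercharge assignment; a sketch of the relation (not spelled out in the record itself, and using the conventional couplings g, g'):

    ```latex
    % Weak mixing angle in terms of the SU(2) and U(1) couplings g, g':
    \sin^2\theta_W = \frac{g'^2}{g^2 + g'^2}
    % With the string/GUT normalization g_1^2 = \tfrac{5}{3} g'^2 and g_1 = g at the high scale:
    \sin^2\theta_W = \frac{\tfrac{3}{5} g^2}{g^2 + \tfrac{3}{5} g^2} = \frac{3}{8}
    ```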

  17. Standardized Patients Provide a Reliable Assessment of Athletic Training Students' Clinical Skills

    Science.gov (United States)

    Armstrong, Kirk J.; Jarriel, Amanda J.

    2016-01-01

    Context: Providing students reliable objective feedback regarding their clinical performance is of great value for ongoing clinical skill assessment. Since a standardized patient (SP) is trained to consistently portray the case, students can be assessed and receive immediate feedback within the same clinical encounter; however, no research, to our…

  18. Almost-commutative geometries beyond the standard model: III. Vector doublets

    International Nuclear Information System (INIS)

    Squellari, Romain; Stephan, Christoph A

    2007-01-01

    We will present a new extension of the standard model of particle physics in its almost-commutative formulation. This extension has as its basis the algebra of the standard model with four summands (Iochum et al 2004 J. Math. Phys. 45 5003 (Preprint hep-th/0312276), Jureit J-H and Stephan C 2005 J. Math. Phys. 46 043512 (Preprint hep-th/0501134), Schuecker T 2005 Krajewski diagrams and spin lifts Preprint hep-th/0501181, Jureit et al 2005 J. Math. Phys. 46 072303 (Preprint hep-th/0503190), Jureit J-H and Stephan C 2006 On a classification of irreducible almost commutative geometries: IV (Preprint hep-th/0610040)), and enlarges only the particle content by an arbitrary number of generations of left-right symmetric doublets which couple vectorially to the U(1)_Y × SU(2)_w subgroup of the standard model. As in the model presented in Stephan (2007 Almost-commutative geometries beyond the standard model: II. New Colours Preprint hep-th/0706.0595), which introduced particles with a new colour, grand unification is no longer required by the spectral action. The new model may also possess a candidate for dark matter in the hundred TeV mass range with neutrino-like cross section.

  19. The MCRA model for probabilistic single-compound and cumulative risk assessment of pesticides.

    Science.gov (United States)

    van der Voet, Hilko; de Boer, Waldo J; Kruisselbrink, Johannes W; Goedhart, Paul W; van der Heijden, Gerie W A M; Kennedy, Marc C; Boon, Polly E; van Klaveren, Jacob D

    2015-05-01

    Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. On the other hand, cumulative health effects of similar pesticides are often not taken into account. This paper describes models and a web-based software system developed in the European research project ACROPOLIS. The models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at mcra.rivm.nl. We describe the MCRA implementation of the methods as advised in the 2012 EFSA Guidance on probabilistic modelling, as well as more refined methods developed in the ACROPOLIS project. The emphasis is on cumulative assessments. Two approaches, sample-based and compound-based, are contrasted. It is shown that additional data on agricultural use of pesticides may give more realistic risk assessments. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons against the previous release of MCRA and against the standard software DEEM-FCID used by the Environmental Protection Agency in the USA. It is shown that the EFSA Guidance pessimistic model may not always give an appropriate modelling of exposure. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
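
    The compound-based cumulative approach described above can be illustrated with a toy Monte Carlo calculation. This is a minimal sketch, not the MCRA implementation: the distributions, relative potency factors, portion sizes and body weights below are invented purely for illustration.

    ```python
    import random

    def simulate_cumulative_exposure(residues_mg_per_kg, rpfs, n_runs=10000, seed=42):
        """Toy acute cumulative exposure assessment (relative-potency-factor approach).

        residues_mg_per_kg: dict compound -> (mean, sd) residue concentration in a food item
        rpfs: dict compound -> relative potency factor vs. a chosen index compound
        Returns the 99.9th-percentile exposure in mg index-compound equivalents / kg body weight.
        """
        rng = random.Random(seed)
        exposures = []
        for _ in range(n_runs):
            consumption_g = max(0.0, rng.gauss(200.0, 80.0))    # daily portion, hypothetical
            body_weight_kg = max(20.0, rng.gauss(70.0, 12.0))   # hypothetical adult population
            total = 0.0
            for compound, (mu, sd) in residues_mg_per_kg.items():
                residue = max(0.0, rng.gauss(mu, sd))
                total += residue * rpfs[compound]               # index-compound equivalents
            exposures.append(total * consumption_g / 1000.0 / body_weight_kg)
        exposures.sort()
        return exposures[int(0.999 * n_runs)]

    p999 = simulate_cumulative_exposure(
        {"compound_a": (0.05, 0.02), "compound_b": (0.10, 0.04)},
        {"compound_a": 1.0, "compound_b": 0.3},
    )
    print(p999)  # high-percentile cumulative exposure, mg eq / kg bw / day
    ```

    The high percentile of the simulated exposure distribution would then be compared against an acute reference dose for the index compound.
    
    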

  20. Beyond the standard model

    International Nuclear Information System (INIS)

    Domokos, G.; Elliott, B.; Kovesi-Domokos, S.; Mrenna, S.

    1992-01-01

    In this paper the authors briefly review the necessity of going beyond the Standard Model. They argue that certain types of composite models of quarks and leptons may resolve some of the difficulties of the SM. Furthermore, the authors argue that, even without a full specification of a composite model, one may predict some observable effects following from the compositeness hypothesis. The effects are most easily seen in reaction channels in which there is little competition from known processes predicted by the SM, typically in neutrino-induced reactions. The authors suggest that above a certain characteristic energy, neutrino cross sections rise well above those predicted within the framework of the SM, and the difference between the characteristic features of lepton- and hadron-induced reactions is blurred. The authors claim that there is some (so far, tenuous) evidence for the phenomenon just alluded to: in certain high energy cosmic ray interactions it appears that photons and/or neutrinos behave in a manner which is inconsistent with the SM. The authors analyze the data and conclude that the anomaly in the observational data arises from an increased neutrino interaction cross section

  1. PATELLOFEMORAL MODEL OF THE KNEE JOINT UNDER NON-STANDARD SQUATTING

    OpenAIRE

    FEKETE, GUSZTÁV; CSIZMADIA, BÉLA MÁLNÁSI; WAHAB, MAGD ABDEL; DE BAETS, PATRICK; VANEGAS-USECHE, LIBARDO V.; BÍRÓ, ISTVÁN

    2014-01-01

    The available analytical models for calculating knee patellofemoral forces are limited to the standard squat motion, in which the center of gravity is fixed horizontally. In this paper, an analytical model is presented to accurately calculate patellofemoral forces by taking into account the change in position of the trunk's center of gravity under deep squat (non-standard squatting). The accuracy of the derived model is validated through comparisons with results of the inverse dynamics technique. ...

  2. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    Science.gov (United States)

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the
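
    The TK-TD refinement in example 2 can be sketched as a one-compartment toxicokinetic model driving a simple stochastic-death toxicodynamic stage, in the spirit of GUTS-type models. All rate constants and the exposure pulse below are hypothetical, not taken from the case study.

    ```python
    import math

    def simulate_tk_survival(c_ext_of_t, k_in=0.5, k_out=0.2, kk=0.05, threshold=1.0,
                             t_end=10.0, dt=0.01):
        """One-compartment TK model with a stochastic-death TD stage (illustrative sketch).

        TK: dC_int/dt = k_in * C_ext(t) - k_out * C_int  (Euler integration)
        TD: hazard accrues at rate kk * max(0, C_int - threshold)
        Returns (final internal concentration, fraction of individuals surviving).
        """
        c_int, cumulative_hazard, t = 0.0, 0.0, 0.0
        while t < t_end:
            c_int += (k_in * c_ext_of_t(t) - k_out * c_int) * dt
            cumulative_hazard += kk * max(0.0, c_int - threshold) * dt
            t += dt
        return c_int, math.exp(-cumulative_hazard)

    # Hypothetical pulsed exposure: 5 concentration units for the first 2 days, then clean water.
    pulse = lambda t: 5.0 if t < 2.0 else 0.0
    c_final, survival = simulate_tk_survival(pulse)
    print(c_final, survival)
    ```

    Running different exposure scenarios through such a model yields the time course of survival used as the refined risk assessment endpoint.
    
    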

  3. Computed-tomography-guided anatomic standardization for quantitative assessment of dopamine transporter SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Yokoyama, Kota [National Center of Neurology and Psychiatry, Department of Radiology, Tokyo (Japan); National Center of Neurology and Psychiatry, Integrative Brain Imaging Center, Tokyo (Japan); Imabayashi, Etsuko; Matsuda, Hiroshi [National Center of Neurology and Psychiatry, Integrative Brain Imaging Center, Tokyo (Japan); Sumida, Kaoru; Sone, Daichi; Kimura, Yukio; Sato, Noriko [National Center of Neurology and Psychiatry, Department of Radiology, Tokyo (Japan); Mukai, Youhei; Murata, Miho [National Center of Neurology and Psychiatry, Department of Neurology, Tokyo (Japan)

    2017-03-15

    For the quantitative assessment of dopamine transporter (DAT) using [¹²³I]FP-CIT single-photon emission computed tomography (SPECT) (DaTscan), anatomic standardization is preferable for achieving objective and user-independent quantification of striatal binding using a volume-of-interest (VOI) template. However, low accumulation of DAT in Parkinson's disease (PD) would lead to a deformation error when using a DaTscan-specific template without any structural information. To avoid this deformation error, we applied computed tomography (CT) data obtained using SPECT/CT equipment to anatomic standardization. We retrospectively analyzed DaTscan images of 130 patients with parkinsonian syndromes (PS), including 80 PD and 50 non-PD patients. First, we segmented gray matter from CT images using statistical parametric mapping 12 (SPM12). These gray-matter images were then anatomically standardized using the diffeomorphic anatomical registration using exponentiated Lie algebra (DARTEL) algorithm. Next, DaTscan images were warped with the same parameters used in the CT anatomic standardization. The target striatal VOIs for decreased DAT in PD were generated from the SPM12 group comparison of 20 DaTscan images from each group. We applied these VOIs to DaTscan images of the remaining patients in both groups and calculated the specific binding ratios (SBRs) using nonspecific counts in a reference area. In terms of the differential diagnosis of PD and non-PD groups using SBR, we compared the present method with two other methods, DaTQUANT and DaTView, which have already been released as software programs for the quantitative assessment of DaTscan images. The SPM12 group comparison showed a significant DAT decrease in PD patients in the bilateral whole striatum. Of the three methods assessed, the present CT-guided method showed the greatest power for discriminating PD and non-PD groups, as it completely separated the two groups. CT-guided anatomic standardization using
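
    The specific binding ratio mentioned in the record is conventionally computed from striatal and reference-region count densities; a minimal sketch, with hypothetical numbers:

    ```python
    def specific_binding_ratio(striatal_counts_per_voxel, reference_counts_per_voxel):
        """SBR as commonly defined for DaTscan quantification:
        SBR = (C_striatum - C_reference) / C_reference.
        The count densities passed in here are illustrative, not study data."""
        if reference_counts_per_voxel <= 0:
            raise ValueError("reference region counts must be positive")
        return (striatal_counts_per_voxel - reference_counts_per_voxel) / reference_counts_per_voxel

    print(specific_binding_ratio(30.0, 10.0))  # 2.0
    ```

    A reduced SBR in the striatal VOI is what discriminates the PD from the non-PD group in this kind of analysis.
    
    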

  4. Risk assessment of manual material handling activities (case study: PT BRS Standard Industry)

    Science.gov (United States)

    Deviani; Triyanti, V.

    2017-12-01

    The process of moving material manually has the potential for injury to workers. The risk of injury will increase if we do not pay attention to working conditions. The purpose of this study is to assess and analyze the injury risk level in manual material handling activity, as well as to improve the condition. The observed manual material handling activities are pole lifting and goods loading. These activities were analyzed using the Job Strain Index (JSI) method, the Rapid Entire Body Assessment (REBA), and Chaffin’s 2D Planar Static Model. The results show that most workers who perform almost all activities have a high risk level, with JSI and REBA scores exceeding 9 points. For some activities, the estimated compression forces in the lumbar area also exceed the standard limit of 3400 N. Concerning this condition, several suggestions for improvement were made: improving the composition of packing, improving body posture, and making guideline posters.
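
    A simple screening function can combine the thresholds cited in the record (JSI and REBA scores above 9 points, and the 3400 N lumbar compression limit). This is an illustrative sketch, not the authors' analysis pipeline:

    ```python
    def assess_mmh_risk(jsi_score, reba_score, lumbar_compression_n):
        """Flag a manual material handling task using the thresholds cited in the record:
        JSI/REBA scores above 9 indicate high risk, and lumbar compression above
        the 3400 N action limit adds further concern. Inputs are hypothetical scores."""
        flags = []
        if jsi_score > 9:
            flags.append("JSI: high strain")
        if reba_score > 9:
            flags.append("REBA: high risk, investigate and implement change")
        if lumbar_compression_n > 3400:
            flags.append("lumbar compression exceeds 3400 N action limit")
        return ("high", flags) if flags else ("acceptable", flags)

    level, reasons = assess_mmh_risk(jsi_score=12.0, reba_score=10, lumbar_compression_n=3600.0)
    print(level, reasons)
    ```
    
    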

  5. Innovation Process Planning Model in the Bpmn Standard

    Directory of Open Access Journals (Sweden)

    Jurczyk-Bunkowska Magdalena

    2013-12-01

    The aim of the article is to show the relations in the innovation process planning model. The relations argued here provide a stable and reliable way to achieve increased competitiveness through a professionally directed development of the company. While initiating the realisation of the process, the manager needs to specify the intended effect, which has to be achieved through a system of indirect goals. The original model proposed here shows the pattern of dependence between the plans for the fragments of the innovation process which together make up the achievement of its final goal. These relations are expressed using the standard Business Process Model and Notation (BPMN). This enabled the specification of interrelations between the decision levels at which subsequent fragments of the innovation process are planned, giving the possibility of better coordination of the process and reducing the time needed to achieve its effect. The model has been compiled on the basis of practices followed in Polish companies. It is not, however, a reflection of these practices, but rather an idealised standard of proceedings which aims at improving the effectiveness of innovation management at the operational level. The model could serve as the basis for systems supporting decision making, knowledge management or communication in innovation processes.

  6. Yukawa couplings in Superstring derived Standard-like models

    International Nuclear Information System (INIS)

    Faraggi, A.E.

    1991-01-01

    I discuss Yukawa couplings in Standard-like models which are derived from superstrings in the free fermionic formulation. I introduce new notation for the construction of these models. I show how the choice of boundary conditions selects a tree-level Yukawa coupling either for the +2/3 charged quark or for the -1/3 charged quark, and I prove this selection rule. I make the conjecture that in this class of Standard-like models a possible connection may exist between the requirements of F and D flatness at the string level and the heaviness of the top quark relative to the lighter quarks and leptons. I discuss how the choice of boundary conditions determines the non-vanishing mass terms at quartic order. I discuss the implications for the mass of the top quark. (author)

  7. A conceptual analysis of standard setting in large-scale assessments

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1994-01-01

    Elements of arbitrariness in the standard setting process are explored, and an alternative to the use of cut scores is presented. The first part of the paper analyzes the use of cut scores in large-scale assessments, discussing three different functions: (1) cut scores define the qualifications used

  8. Quasi standard model physics

    International Nuclear Information System (INIS)

    Peccei, R.D.

    1986-01-01

    Possible small extensions of the standard model are considered, which are motivated by the strong CP problem and by the baryon asymmetry of the Universe. Phenomenological arguments are given which suggest that imposing a PQ symmetry to solve the strong CP problem is only tenable if the scale of the PQ breakdown is much above M_W. Furthermore, an attempt is made to connect the scale of the PQ breakdown to that of the breakdown of lepton number. It is argued that in these theories the same intermediate scale may be responsible for the baryon number of the Universe, provided the Kuzmin-Rubakov-Shaposhnikov (B+L)-erasing mechanism is operative. (orig.)

  9. Can An Amended Standard Model Account For Cold Dark Matter?

    International Nuclear Information System (INIS)

    Goldhaber, Maurice

    2004-01-01

    It is generally believed that one has to invoke theories beyond the Standard Model to account for cold dark matter particles. However, there may be undiscovered universal interactions that, if added to the Standard Model, would lead to new members of the three generations of elementary fermions that might be candidates for cold dark matter particles

  10. A generic standard for assessing and managing activities with significant risk to health and safety

    International Nuclear Information System (INIS)

    Wilde, T.S.; Sandquist, G.M.

    2005-01-01

    Some operations and activities in industry, business, and government can present an unacceptable risk to health and safety if not performed according to established safety practices and documented procedures. The nuclear industry has extensive experience in and commitment to assessing and controlling such risks. This paper provides a generic standard based upon DOE Standard DOE-STD-3007-93, Nov 1993, Change Notice No. 1, Sep 1998. This generic standard can be used to assess practices and procedures employed by any industrial or government entity to ensure that an acceptable level of safety and control prevails for such operations. When any activity or operation is determined to involve significant risk to the health and safety of workers or the public, the organization should adopt and establish an appropriate standard and methodology to ensure that adequate health and safety prevail. This paper uses DOE experience and standards to address activities with recognized potential for impact upon health and safety. Existing and future assessments of health and safety issues can be compared and evaluated against this generic standard to ensure that proper planning, analysis, review, and approval have been made. (authors)

  11. Temporal assessment of copper speciation, bioavailability and toxicity in UK freshwaters using chemical equilibrium and biotic ligand models: Implications for compliance with copper environmental quality standards.

    Science.gov (United States)

    Lathouri, Maria; Korre, Anna

    2015-12-15

    Although significant progress has been made in understanding how environmental factors modify the speciation, bioavailability and toxicity of metals such as copper in aquatic environments, the current methods used to establish water quality standards do not necessarily consider the different geological and geochemical characteristics of a given site and the factors that affect copper fate, bioavailability potential and toxicity. In addition, the temporal variation in the concentration and bioavailable metal fraction is also important in freshwater systems. The work presented in this paper illustrates the temporal and seasonal variability of a range of water quality parameters, and of Cu speciation, bioavailability and toxicity, at four freshwater sites in the UK. The rivers Coquet, Cree, Lower Clyde and Eden (Kent) were selected to cover a broad range of different geochemical environments and site characteristics. The monitoring data used covered a period of around six years at almost monthly intervals. Chemical equilibrium modelling was used to study temporal variations in Cu speciation and was combined with acute toxicity modelling to assess Cu bioavailability for two aquatic species, Daphnia magna and Daphnia pulex. The estimated copper bioavailability, toxicity levels and the corresponding ecosystem risks were analysed in relation to key water quality parameters (alkalinity, pH and DOC). Although copper concentrations did not vary much during the sampling period or between the seasons at the different sites, copper bioavailability varied markedly. In addition, through the chronic Cu biotic ligand model (BLM) based on the voluntary risk assessment approach, the potential environmental risk in terms of chronic toxicity was assessed. A much higher likelihood of toxicity effects was found during the cold period at all sites. It is suggested that besides the metal (copper) concentration in the surface water environment, the variability and seasonality of other important water quality

  12. Primordial alchemy: a test of the standard model

    International Nuclear Information System (INIS)

    Steigman, G.

    1987-01-01

    Big Bang Nucleosynthesis provides the only probe of the early evolution of the Universe constrained by observational data. The standard, hot, big bang model predicts the synthesis of the light elements (D, ³He, ⁴He, ⁷Li) in astrophysically interesting abundances during the first few minutes in the evolution of the Universe. A quantitative comparison of the predicted abundances with those observed astronomically confirms the consistency of the standard model and yields valuable constraints on the parameters of cosmology and elementary particle physics. The current status of the comparison between theory and observation will be reviewed and the opportunities for future advances outlined

  13. Higgs detectability in the extended supersymmetric standard model

    International Nuclear Information System (INIS)

    Kamoshita, Jun-ichi

    1995-01-01

    Higgs detectability at a future linear collider is discussed in the minimal supersymmetric standard model (MSSM) and in a supersymmetric standard model with a gauge singlet Higgs field (NMSSM). First, in the MSSM, at least one of the neutral scalar Higgs bosons is shown to be detectable, irrespective of the parameters of the model, at a future e+e- linear collider at √s = 300-500 GeV. Next, the Higgs sector of the NMSSM is considered; since the lightest Higgs boson can be singlet-dominated and therefore decouple from the Z⁰ boson, it is important to consider the production of heavier Higgses. It is shown that also in this case at least one of the neutral scalar Higgs bosons will be detectable at a future linear collider. We extend the analysis and show that the same is true even if three singlets are included. Thus the detectability of the Higgs bosons of these models is guaranteed. (author)

  14. Is the Standard Model about to crater?

    CERN Multimedia

    Lane, Kenneth

    2015-01-01

    The Standard Model is coming under more and more pressure from experiments. New results from the analysis of LHC's Run 1 data show effects that, if confirmed, would be the signature of new interactions at the TeV scale.

  15. Assessment of the Japanese Energy Efficiency Standards Program

    Directory of Open Access Journals (Sweden)

    Jun Arakawa

    2015-03-01

    The Japanese energy efficiency standards program for appliances is a unique program which sets and revises mandatory standards based on the most energy-efficient products on the market. This study assessed the cost-effectiveness of the standard settings for air conditioners, a major residential appliance and a typical example in the program. Based on analyses of empirical data, the net costs and effects from 1999 to 2040 were estimated. When applying a discount rate of 3%, the cost of abating CO2 emissions realized through the considered standards was estimated to be -13700 JPY/t-CO2. The sensitivity analysis, however, showed that the cost turns positive at a discount rate of 26% or higher. The authors also revealed that the standards’ “excellent” cost-effectiveness largely depends on that of the 1st standard setting, and the CO2 abatement cost through the 2nd standard was estimated to be as high as 26800 JPY/t-CO2. The results imply that the government is required to be careful about the possible economic burden imposed when considering the introduction of new, additional standards.
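
    The sign flip of the abatement cost with the discount rate can be reproduced with a small net-present-value calculation. The figures below are hypothetical and chosen only to show the mechanism, not the study's actual data:

    ```python
    def abatement_cost_jpy_per_t(extra_cost_jpy, annual_saving_jpy, annual_co2_t, years, rate):
        """Discounted abatement cost: (upfront extra cost - NPV of energy savings)
        divided by the NPV of CO2 abated over the appliance lifetime."""
        discount_factors = [(1 + rate) ** -t for t in range(1, years + 1)]
        npv_savings = annual_saving_jpy * sum(discount_factors)
        npv_co2 = annual_co2_t * sum(discount_factors)
        return (extra_cost_jpy - npv_savings) / npv_co2

    # Hypothetical appliance: 30000 JPY extra cost, 4000 JPY/yr savings, 0.1 t CO2/yr, 15 years.
    low = abatement_cost_jpy_per_t(30000, 4000, 0.1, 15, 0.03)
    high = abatement_cost_jpy_per_t(30000, 4000, 0.1, 15, 0.30)
    print(low, high)  # negative at a 3% discount rate, positive at 30%
    ```

    At a low discount rate the discounted energy savings outweigh the upfront cost, so the abatement cost is negative; a high enough rate shrinks the savings term and flips the sign, which is the effect the sensitivity analysis reports.
    
    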

  16. Non-perturbative effective interactions in the standard model

    CERN Document Server

    Arbuzov, Boris A

    2014-01-01

    This monograph is devoted to nonperturbative dynamics in the Standard Model (SM), the basic theory of all fundamental interactions in nature except gravity. The Standard Model is divided into two parts: Quantum Chromodynamics (QCD) and the Electroweak Theory (EWT) are well-defined renormalizable theories in which perturbation theory is valid. However, for an adequate description of the real physics, nonperturbative effects are inevitable. This book describes how these nonperturbative effects may be obtained in the framework of spontaneous generation of effective interactions. A well-known example of such an effective interaction is provided by the famous Nambu-Jona-Lasinio interaction. A spontaneous generation of this interaction in the framework of QCD is also described and applied as a method for other effective interactions in QCD and EWT. The method is based on N.N. Bogoliubov's conception of compensation equations. As a result we then describe the principal features of the Standard...

  17. Beyond the Standard Model Higgs searches at the LHC

    CERN Document Server

    Meridiani, P

    2015-01-01

    Run I at the LHC marks the birth of "Higgs physics", a path which will be followed to its full extent in the future runs of the LHC. Indeed, there are two complementary paths to new physics in the Higgs sector: precision measurements of the Higgs properties (couplings, mass, spin and parity), where new physics can manifest itself as deviations from the Standard Model, and direct searches for processes not foreseen in the Standard Model (Higgs decays not foreseen in the Standard Model, or additional scalars which would indicate an extended Higgs sector). The current status of these studies at the LHC is presented, focussing in particular on the direct searches for rare or invisible Higgs decays or for an extended Higgs sector. The results are based on the analysis of proton-proton collisions at 7 and 8 TeV center-of-mass energy at the LHC by the ATLAS and CMS collaborations.

  18. Impersonating the Standard Model Higgs boson: alignment without decoupling

    International Nuclear Information System (INIS)

    Carena, Marcela; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E.M.

    2014-01-01

    In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to “alignment without decoupling”, therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. Moreover, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the m_A-tan β parameter space.

  19. Standard Model Higgs Searches at the Tevatron

    Energy Technology Data Exchange (ETDEWEB)

    Knoepfel, Kyle J.

    2012-06-01

    We present results from the search for a standard model Higgs boson using up to 10 fb⁻¹ of proton-antiproton collision data produced by the Fermilab Tevatron at a center-of-mass energy of 1.96 TeV. The data were recorded by the CDF and D0 detectors between March 2001 and September 2011. A broad excess is observed for 105 < m_H < 145 GeV/c² with a global significance of 2.2 standard deviations relative to the background-only hypothesis.

  20. Non-Standard Assessment Practices in the Evaluation of Communication in Australian Aboriginal Children

    Science.gov (United States)

    Gould, Judith

    2008-01-01

    Australian Aboriginal children typically receive communication assessment services from Standard Australian English (SAE) speaking non-Aboriginal speech-language pathologists (SLPs). Educational assessments, including intelligence testing, are also primarily conducted by non-Aboriginal educational professionals. While the current paper will show…

  1. Scale gauge symmetry and the standard model

    International Nuclear Information System (INIS)

    Sola, J.

    1990-01-01

    This paper speculates on a version of the standard model of the electroweak and strong interactions coupled to gravity and equipped with a spontaneously broken, anomalous, conformal gauge symmetry. The scalar sector is virtually absent in the minimal model, but in the general case it shows up in the form of a nonlinear harmonic-map Lagrangian. A Euclidean approach to the cosmological constant problem is also addressed in this framework.

  2. Standardized Handwriting to Assess Bradykinesia, Micrographia and Tremor in Parkinson's Disease

    NARCIS (Netherlands)

    Smits, Esther J.; Tolonen, Antti J.; Cluitmans, Luc; van Gils, Mark; Conway, Bernard A.; Zietsma, Rutger C.; Leenders, Klaus L.; Maurits, Natasha M.

    2014-01-01

    Objective: To assess whether standardized handwriting can provide quantitative measures to distinguish patients diagnosed with Parkinson's disease from age- and gender-matched healthy control participants. Design: Exploratory study. Pen tip trajectories were recorded during circle, spiral and line

  3. Searches for rare and non-Standard-Model decays of the Higgs boson

    CERN Document Server

    Sun, Xiaohu; The ATLAS collaboration

    2018-01-01

    Theories beyond the Standard Model predict Higgs boson decays at a much enhanced rate compared to the Standard Model, e.g. decays to Z+photon or to a meson and a photon, or decays that do not exist in the Standard Model, such as decays into two light bosons (a). This talk presents recent results based on 36 fb⁻¹ of pp collision data collected at 13 TeV.

  4. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  5. Measurement system for wind turbine acoustic noise assessment based on IEC standard and Qin′s model

    Institute of Scientific and Technical Information of China (English)

    Sun Lei; Qin Shuren; Bo Lin; Xu Liping; Stephan Joeckel

    2008-01-01

    A novel measurement system specially used in noise emission assessment and verification of wind turbine generator systems is presented that complies with the specifications given in IEC 61400-11 to ensure process consistency and accuracy. Theoretical elements of the calculation formula used for the sound power level of a wind turbine are discussed for the first time, and the detailed calculation procedure for tonality and audibility, integrating narrowband analysis and psychoacoustics, is described. With a microphone and two PXI cards inserted into a PC, the system is designed on Qin's model using the VMIDS development system. Benefiting from the virtual instrument architecture, it is the first time that the whole assessment process has been integrated into an organic whole, which gives it full advantages in efficiency, price, and facility. Extensive experiments show that its assessment results accord with the ones given by MEASNET members.
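The abstract does not reproduce the sound power formula itself. As a rough sketch, the apparent sound power level in IEC 61400-11 is commonly written as L_WA = L_Aeq - 6 dB + 10·lg(4πR₁²/S₀), where the 6 dB term compensates for the pressure doubling on the ground measurement board; this reconstruction is our assumption, not taken from the paper:

```python
import math

S0 = 1.0  # reference area, m^2

def apparent_sound_power_level(l_aeq_db, slant_distance_m):
    """Apparent sound power level L_WA of a wind turbine, in dB(A).

    l_aeq_db: A-weighted equivalent sound pressure level measured on a
    ground board at the reference position; 6 dB is subtracted to remove
    the pressure doubling caused by the board.
    slant_distance_m: distance R1 from the rotor centre to the microphone.
    """
    return l_aeq_db - 6.0 + 10.0 * math.log10(
        4.0 * math.pi * slant_distance_m ** 2 / S0)
```

For example, a 55 dB(A) board measurement at R1 = 120 m maps to an apparent sound power level of roughly 101.6 dB(A).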

  6. Risk assessment using probabilistic standards

    International Nuclear Information System (INIS)

    Avila, R.

    2004-01-01

    A core element of risk is uncertainty, represented by plural outcomes and their likelihood. No risk exists if the future outcome is uniquely known and hence guaranteed. The probability that we will die some day is equal to 1, so there would be no fatal risk if a sufficiently long time frame is assumed. Equally, rain risk does not exist if there were 100% assurance of rain tomorrow, although there would be other risks induced by the rain. In a formal sense, risk exists if, and only if, more than one outcome is expected at a future time interval. In any practical risk assessment we have to deal with uncertainties associated with the possible outcomes. One way of dealing with the uncertainties is to be conservative in the assessments. For example, we may compare the maximal exposure to a radionuclide with a conservatively chosen reference value. In this case, if the exposure is below the reference value then it is possible to assure that the risk is low. Since single values are usually compared, this approach is commonly called 'deterministic'. Its main advantage lies in its simplicity and in that it requires minimum information. However, problems arise when the reference values are actually exceeded or might be exceeded, as in the case of potential exposures, and when the costs of meeting the reference values are high. In those cases, the lack of knowledge of the degree of conservatism involved impairs a rational weighing of the risks against other interests. In this presentation we outline an approach for dealing with uncertainties that in our opinion is more consistent. We call it a 'fully probabilistic risk assessment'. The essence of this approach consists in measuring the risk in terms of probabilities, where the latter are obtained from a comparison of two probability distributions: one reflecting the uncertainties in the outcomes and one reflecting the uncertainties in the reference value (standard) used for defining adverse outcomes.
Our first aim
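The comparison of the two distributions can be sketched in a few lines of Monte Carlo; the lognormal shapes and parameters below are illustrative assumptions, not values from the presentation:

```python
import random

random.seed(1)

def probabilistic_risk(exposure_draws, standard_draws):
    """Fraction of paired samples in which exposure exceeds the standard.

    Both the assessed exposure and the reference value (standard) are
    represented by samples from their uncertainty distributions; the risk
    is then the probability that a random exposure exceeds a random
    standard, rather than a single conservative comparison.
    """
    exceed = sum(e > s for e, s in zip(exposure_draws, standard_draws))
    return exceed / len(exposure_draws)

# Hypothetical lognormal uncertainties (illustrative only).
exposure = [random.lognormvariate(0.0, 0.5) for _ in range(100_000)]
standard = [random.lognormvariate(1.0, 0.3) for _ in range(100_000)]
risk = probabilistic_risk(exposure, standard)
```

With these assumed distributions the risk comes out at a few percent, whereas a deterministic comparison of the maximal exposure against the lowest plausible standard would simply report "exceeded".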

  7. Quantum gravity and Standard-Model-like fermions

    International Nuclear Information System (INIS)

    Eichhorn, Astrid; Lippoldt, Stefan

    2017-01-01

    We discover that chiral symmetry does not act as an infrared attractor of the renormalization group flow under the impact of quantum gravity fluctuations. Thus, observationally viable quantum gravity models must respect chiral symmetry. In our truncation, asymptotically safe gravity does, as a chiral fixed point exists. A second non-chiral fixed point with massive fermions provides a template for models with dark matter. This fixed point disappears for more than 10 fermions, suggesting that an asymptotically safe ultraviolet completion for the standard model plus gravity enforces chiral symmetry.

  8. Study on Standard Fatigue Vehicle Load Model

    Science.gov (United States)

    Huang, H. Y.; Zhang, J. P.; Li, Y. H.

    2018-02-01

    Based on measured truck data from three artery expressways in Guangdong Province, a statistical analysis of truck weight was conducted according to axle number. A standard fatigue vehicle model applicable to industrial areas in the middle and late stages was obtained, using the equivalent-damage principle, Miner's linear accumulation law, the water discharge method and damage ratio theory. Compared with the fatigue vehicle model specified by the current bridge design code, the proposed model has better applicability. It is of reference value for the fatigue design of bridges in China.

  9. [COMPUTER TECHNOLOGY FOR ACCOUNTING OF CONFOUNDERS IN THE RISK ASSESSMENT IN COMPARATIVE STUDIES ON THE BASE OF THE METHOD OF STANDARDIZATION].

    Science.gov (United States)

    Shalaumova, Yu V; Varaksin, A N; Panov, V G

    2016-01-01

    An analysis was performed of how concomitant variables (confounders), which introduce a systematic error into the assessment of the impact of risk factors on the outcome variable, are accounted for. The analysis showed that standardization is an effective method for reducing the bias of the risk assessment. The work suggests an algorithm implementing the method of standardization based on stratification, which minimizes the difference between the distributions of confounders in the groups defined by the risk factor. To automate the standardization procedure, software was developed, available on the website of the Institute of Industrial Ecology, UB RAS. With the help of the developed software, numerical modeling was used to determine the conditions of applicability of the stratification-based method of standardization for the case of a normal distribution of the response and the confounder and a linear relationship between them. Comparison of results obtained with the standardization against statistical methods (logistic regression and analysis of covariance) in solving a problem of human ecology has shown that close results can be obtained when the applicability conditions of the statistical methods are met exactly. Standardization is less sensitive to violations of its conditions of applicability.
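The core of stratification-based standardization can be sketched as direct standardization over confounder strata; this is a generic illustration of the technique, not the institute's software, and the stratum names and weights are hypothetical:

```python
from collections import defaultdict

def standardized_rate(records, strata_weights):
    """Directly standardized outcome rate for one exposure group.

    records: iterable of (stratum, outcome) pairs, where stratum is the
    confounder level (e.g. an age band) and outcome is 0/1.
    strata_weights: {stratum: weight} from a common reference population,
    so that groups are compared on the same confounder mix.
    """
    events = defaultdict(int)
    totals = defaultdict(int)
    for stratum, outcome in records:
        events[stratum] += outcome
        totals[stratum] += 1
    w_total = sum(strata_weights.values())
    # Weight each stratum-specific rate by the reference population.
    return sum(w * events[s] / totals[s]
               for s, w in strata_weights.items() if totals[s]) / w_total

# Hypothetical group: rate 0.2 in the "young" stratum, 0.4 in "old".
group = ([("young", 1)] * 2 + [("young", 0)] * 8
         + [("old", 1)] * 4 + [("old", 0)] * 6)
rate = standardized_rate(group, {"young": 3, "old": 1})
```

Applying the same reference weights to a comparison group removes the part of the rate difference that is due only to the groups' different confounder mixes.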

  10. Development of Learning Models Based on Problem Solving and Meaningful Learning Standards by Expert Validity for Animal Development Course

    Science.gov (United States)

    Lufri, L.; Fitri, R.; Yogica, R.

    2018-04-01

    The purpose of this study is to produce a learning model based on problem solving and meaningful learning standards, validated by expert assessment, for the course of Animal Development. This is development research that produces a product in the form of a learning model, consisting of two sub-products: the syntax of the learning model and student worksheets. All of these products are standardized through expert validation. The research data are the levels of validity of all sub-products, obtained using a questionnaire filled in by validators from various fields of expertise (field of study, learning strategy, Bahasa). Data were analysed using descriptive statistics. The results show that the problem-solving and meaningful-learning model has been produced. Sub-products declared appropriate by the experts include the syntax of the learning model and the student worksheet.

  11. Asymptotically Safe Standard Model via Vectorlike Fermions

    Science.gov (United States)

    Mann, R. B.; Meffe, J. R.; Sannino, F.; Steele, T. G.; Wang, Z. W.; Zhang, C.

    2017-12-01

    We construct asymptotically safe extensions of the standard model by adding gauged vectorlike fermions. Using large number-of-flavor techniques we argue that all gauge couplings, including the hypercharge and, under certain conditions, the Higgs coupling, can achieve an interacting ultraviolet fixed point.

  12. Towards a non-perturbative study of the strongly coupled standard model

    International Nuclear Information System (INIS)

    Dagotto, E.; Kogut, J.

    1988-01-01

    The strongly coupled standard model of Abbott and Farhi can be a good alternative to the standard model if it has a phase where chiral symmetry is not broken, the SU(2) sector confines and the scalar field is in the symmetric regime. To look for such a phase we did a numerical analysis in the context of lattice gauge theory. To simplify the model we studied a U(1) gauge theory with Higgs fields and four species of dynamical fermions. In this toy model we did not find a phase with the correct properties required by the strongly coupled standard model. We also speculate about a possible solution to this problem using a new phase of the SU(2) gauge theory with a large number of flavors. (orig.)

  13. Beyond standard model calculations with Sherpa

    Energy Technology Data Exchange (ETDEWEB)

    Hoeche, Stefan [SLAC National Accelerator Laboratory, Menlo Park, CA (United States); Kuttimalai, Silvan [Durham University, Institute for Particle Physics Phenomenology, Durham (United Kingdom); Schumann, Steffen [Universitaet Goettingen, II. Physikalisches Institut, Goettingen (Germany); Siegert, Frank [Institut fuer Kern- und Teilchenphysik, TU Dresden, Dresden (Germany)

    2015-03-01

    We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in Beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level. (orig.)

  14. Standard error propagation in R-matrix model fitting for light elements

    International Nuclear Information System (INIS)

    Chen Zhenpeng; Zhang Rui; Sun Yeying; Liu Tingjin

    2003-01-01

    The error propagation features of R-matrix model fitting for the ⁷Li, ¹¹B and ¹⁷O systems were researched systematically. Some laws of error propagation were revealed, an empirical formula, P_j = U_j^c / U_j^d = K_j · S̄ · √m / √N, describing the standard error propagation was established, and the most likely error ranges for the standard cross sections of ⁶Li(n,t), ¹⁰B(n,α₀) and ¹⁰B(n,α₁) were estimated. The problem that the standard errors of the light-nuclei standard cross sections may be too small results mainly from the R-matrix model fitting, which is not perfect; yet R-matrix model fitting is the most reliable evaluation method for such data. The error propagation features of R-matrix model fitting for the compound nucleus systems of ⁷Li, ¹¹B and ¹⁷O have been studied systematically, some laws of error propagation are revealed, and these findings are important in solving the problem mentioned above. Furthermore, these conclusions are suitable for similar model fitting in other scientific fields. (author)

  15. Standard model fermions and K(E10)

    Directory of Open Access Journals (Sweden)

    Axel Kleinschmidt

    2015-07-01

    Full Text Available In recent work [1] it was shown how to rectify Gell-Mann's proposal for identifying the 48 quarks and leptons of the Standard Model with the 48 spin-1/2 fermions of maximal SO(8) gauged supergravity remaining after the removal of eight Goldstinos, by deforming the residual U(1) symmetry at the SU(3) × U(1) stationary point of N=8 supergravity, so as to also achieve agreement of the electric charge assignments. In this Letter we show that the required deformation, while not in SU(8), does belong to K(E10), the ‘maximal compact’ subgroup of E10 which is a possible candidate symmetry underlying M theory. The incorporation of infinite-dimensional Kac-Moody symmetries of hyperbolic type, apparently unavoidable for the present scheme to work, opens up completely new perspectives on embedding Standard Model physics into a Planck-scale theory of quantum gravity.

  16. Standards and measurements for assessing bone health-workshop report co-sponsored by the International Society for Clinical Densitometry (ISCD) and the National Institute of Standards and Technology (NIST).

    Science.gov (United States)

    Bennett, Herbert S; Dienstfrey, Andrew; Hudson, Lawrence T; Oreskovic, Tammy; Fuerst, Thomas; Shepherd, John

    2006-01-01

    This article reports and discusses the results of the recent ISCD-NIST Workshop on Standards and Measurements for Assessing Bone Health. The purpose of the workshop was to assess the status of efforts to standardize and compare results from dual-energy X-ray absorptiometry (DXA) scans, and then to identify and prioritize ongoing measurement and standards needs.

  17. Search for the standard model Higgs boson in $l\

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dikai [Pierre and Marie Curie Univ., Paris (France)

    2013-01-01

    Humans have always attempted to understand the mysteries of Nature, and more recently physicists have established theories to describe the observed phenomena. The most recent theory is a gauge quantum field theory framework, called the Standard Model (SM), which proposes a model of elementary matter particles and interaction particles, the fundamental force carriers, in the most unified way. The Standard Model contains the internal symmetries of the unitary product group SU(3)_C × SU(2)_L × U(1)_Y and describes the electromagnetic, weak and strong interactions; the model also describes how quarks interact with each other through all three of these interactions, how leptons interact with each other through the electromagnetic and weak forces, and how force carriers mediate the fundamental interactions.

  18. Automatic creation of Markov models for reliability assessment of safety instrumented systems

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2008-01-01

    After the release of new international functional safety standards such as IEC 61508, people care more about the safety and availability of safety instrumented systems. Markov analysis is a powerful and flexible technique for assessing the reliability measures of safety instrumented systems, but creating Markov models manually is error-prone and time-consuming. This paper presents a new technique to automatically create Markov models for the reliability assessment of safety instrumented systems. Many safety-related factors, such as failure modes, self-diagnostics, restorations, common cause and voting, are included in the Markov models. A framework is generated first based on voting, failure modes and self-diagnostics. Then, repairs and common-cause failures are incorporated into the framework to build a complete Markov model. Eventual simplification of Markov models can be done by state merging. Examples given in this paper show how explosively the size of a Markov model increases as the system becomes only a little more complicated, as well as the advantages of automatic creation of Markov models.
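The paper's automatic model builder is not reproduced here, but the kind of object it produces can be sketched: a hypothetical continuous-time Markov model of a 1oo2-voted channel pair with assumed failure rate lam and repair rate mu, whose steady-state solution gives the system unavailability:

```python
def steady_state(Q):
    """Steady-state distribution pi of a CTMC: solve pi @ Q = 0, sum(pi) = 1."""
    n = len(Q)
    # Transpose Q to get the balance equations, then replace the last
    # (redundant) equation with the normalization sum(pi) = 1.
    A = [[Q[j][i] for j in range(n)] for i in range(n)]
    A[-1] = [1.0] * n
    b = [0.0] * (n - 1) + [1.0]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

lam, mu = 1e-3, 1e-1  # assumed failure and repair rates (per hour)
# States: 0 = both channels up, 1 = one failed, 2 = both failed (system down).
Q = [[-2 * lam, 2 * lam, 0.0],
     [mu, -(mu + lam), lam],
     [0.0, mu, -mu]]
pi = steady_state(Q)
unavailability = pi[2]
```

Adding self-diagnostics, common-cause failures and more voting channels multiplies the number of states, which is exactly the explosion that motivates generating such models automatically.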

  19. Lectures on perturbative QCD, jets and the standard model: collider phenomenology

    International Nuclear Information System (INIS)

    Ellis, S.D.

    1988-01-01

    Applications of the Standard Model to the description of physics at hadron colliders are discussed. Particular attention is paid to the use of jets to characterize this physics. The issue of identifying physics beyond the Standard Model is also discussed. 59 refs., 6 figs., 4 tabs

  20. A Mapmark method of standard setting as implemented for the National Assessment Governing Board.

    Science.gov (United States)

    Schulz, E Matthew; Mitzel, Howard C

    2011-01-01

    This article describes a Mapmark standard setting procedure, developed under contract with the National Assessment Governing Board (NAGB). The procedure enhances the bookmark method with spatially representative item maps, holistic feedback, and an emphasis on independent judgment. A rationale for these enhancements, and for the bookmark method, is presented, followed by a detailed description of the materials and procedures used in a meeting to set standards for the 2005 National Assessment of Educational Progress (NAEP) in Grade 12 mathematics. The use of difficulty-ordered content domains to provide holistic feedback is a particularly novel feature of the method. Process evaluation results comparing Mapmark to Angoff-based methods previously used for NAEP standard setting are also presented.

  1. Using the Many-Faceted Rasch Model to Evaluate Standard Setting Judgments: An Illustration with the Advanced Placement Environmental Science Exam

    Science.gov (United States)

    Kaliski, Pamela K.; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna L.; Plake, Barbara S.; Reshetar, Rosemary A.

    2013-01-01

    The many-faceted Rasch (MFR) model has been used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR model for examining the quality of ratings obtained from a standard…

  2. A standardized model for predicting flap failure using indocyanine green dye

    Science.gov (United States)

    Zimmermann, Terence M.; Moore, Lindsay S.; Warram, Jason M.; Greene, Benjamin J.; Nakhmani, Arie; Korb, Melissa L.; Rosenthal, Eben L.

    2016-03-01

    Techniques that provide a non-invasive method for evaluation of intraoperative skin flap perfusion are currently available but underutilized. We hypothesize that intraoperative vascular imaging can be used to reliably assess skin flap perfusion and elucidate areas of future necrosis by means of a standardized critical perfusion threshold. Five animal groups (negative controls, n=4; positive controls, n=5; chemotherapy group, n=5; radiation group, n=5; chemoradiation group, n=5) underwent pre-flap treatments two weeks prior to undergoing random pattern dorsal fasciocutaneous flaps with a length to width ratio of 2:1 (3 × 1.5 cm). Flap perfusion was assessed via laser-assisted indocyanine green dye angiography and compared to standard clinical assessment for predictive accuracy of flap necrosis. For estimating flap failure, clinical prediction achieved a sensitivity of 79.3% and a specificity of 90.5%. When average flap perfusion was more than three standard deviations below the average flap perfusion for the negative control group at the time of the flap procedure (144.3 ± 17.05 absolute perfusion units), laser-assisted indocyanine green dye angiography achieved a sensitivity of 81.1% and a specificity of 97.3%. When absolute perfusion units were seven standard deviations below the average flap perfusion for the negative control group, specificity of necrosis prediction was 100%. Quantitative absolute perfusion units can improve specificity for intraoperative prediction of viable tissue. Using this strategy, a positive predictive threshold of flap failure can be standardized for clinical use.
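The threshold rule in the study reduces to a comparison against the control-group statistics. This sketch (with made-up perfusion values, not the study's software) shows how sensitivity and specificity follow from a k-standard-deviation cutoff:

```python
def predict_necrosis(perfusion, control_mean, control_sd, k=3.0):
    """Flag a flap region as at risk when its perfusion falls more than
    k standard deviations below the healthy-control mean."""
    return perfusion < control_mean - k * control_sd

def sensitivity_specificity(values, labels, control_mean, control_sd, k=3.0):
    """labels: True where necrosis actually occurred."""
    tp = fn = tn = fp = 0
    for v, necrosed in zip(values, labels):
        flagged = predict_necrosis(v, control_mean, control_sd, k)
        if necrosed and flagged:
            tp += 1
        elif necrosed:
            fn += 1
        elif flagged:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical perfusion readings (absolute perfusion units) and outcomes,
# using the reported control statistics 144.3 +/- 17.05.
sens, spec = sensitivity_specificity(
    [80, 95, 100, 120, 85, 150],
    [True, True, False, False, True, False],
    control_mean=144.3, control_sd=17.05)
```

Raising k trades sensitivity for specificity, which is why the seven-standard-deviation cutoff in the study reaches 100% specificity.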

  3. Standard for Models and Simulations

    Science.gov (United States)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (MS), while ensuring acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which MS may be developed, accepted, and used in support of NASA activities. As the MS disciplines employed and application areas involved are broad, the common aspects of MS across all NASA activities are addressed. The discipline-specific details of a given MS should be obtained from the relevant recommended practices. The primary purpose is to reduce the risks associated with MS-influenced decisions by ensuring the complete communication of the credibility of MS results.

  4. ATLAS discovery potential of the Standard Model Higgs boson

    CERN Document Server

    Weiser, C; The ATLAS collaboration

    2009-01-01

    The Standard Model of elementary particles is remarkably successful in describing experimental data. The Higgs mechanism as the origin of electroweak symmetry breaking and mass generation, however, has not yet been confirmed experimentally. The search for the Higgs boson is thus one of the most important tasks of the ATLAS experiment at the Large Hadron Collider (LHC). This talk will present an overview of the potential of the ATLAS detector for the discovery of the Standard Model Higgs boson. Different production processes and decay channels (to cover a wide mass range) will be discussed.

  5. ATLAS Discovery Potential of the Standard Model Higgs Boson

    CERN Document Server

    Weiser, C; The ATLAS collaboration

    2010-01-01

    The Standard Model of elementary particles is remarkably successful in describing experimental data. The Higgs mechanism as the origin of electroweak symmetry breaking and mass generation, however, has not yet been confirmed experimentally. The search for the Higgs boson is thus one of the most important tasks of the ATLAS experiment at the Large Hadron Collider (LHC). This talk will present an overview of the potential of the ATLAS detector for the discovery of the Standard Model Higgs boson. Different production processes and decay channels (to cover a wide mass range) will be discussed.

  6. Majorana neutrinos in a warped 5D standard model

    International Nuclear Information System (INIS)

    Huber, S.J.; Shafi, Q.

    2002-05-01

    We consider neutrino oscillations and neutrinoless double beta decay in a five-dimensional standard model with warped geometry. Although the see-saw mechanism in its simplest form cannot be implemented because of the warped geometry, the bulk standard model neutrinos can acquire the desired (Majorana) masses from dimension-five interactions. We discuss how large mixings can arise, why the large mixing angle MSW solution for solar neutrinos is favored, and provide estimates for the mixing angle U_e3. Implications for neutrinoless double beta decay are also discussed. (orig.)

  7. A Comprehensive Evaluation of Standardized Assessment Tools in the Diagnosis of Fibromyalgia and in the Assessment of Fibromyalgia Severity

    Directory of Open Access Journals (Sweden)

    Chad S. Boomershine

    2012-01-01

    Full Text Available Standard assessments for fibromyalgia (FM) diagnosis and core FM symptom domains are needed for biomarker development and treatment trials. Diagnostic and symptom assessments are reviewed and recommendations are made for standards. Recommendations for existing assessments include the American College of Rheumatology FM classification criteria using the Manual Tender Point Survey for diagnosis, the Brief Pain Inventory average pain visual analogue scale for pain intensity, the function subscale of the Revised Fibromyalgia Impact Questionnaire (FIQR) for physical function, the Patient Global Impression of Change and FIQR for overall/global improvement, the Hospital Anxiety and Depression Scale depression subscale for depression, the Multiple Ability Self-Report Questionnaire for cognitive dysfunction, the Fatigue Severity Scale for fatigue, the FIQR for multidimensional function/health-related quality of life, the Jenkins Sleep Scale for sleep disturbance, and the Fibromyalgia Intensity Score for tenderness. Forthcoming assessments, including the FIQR for diagnosis and the NIH PROMIS and FIBRO Change scales, are discussed.

  8. Evaluation model applied to TRANSPETRO's Marine Terminals Standardization Program

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de; Mueller, Gabriela [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Garcia, Luciano Maldonado [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes an innovative evaluation model applied to TRANSPETRO's 'Marine Terminals Standardization Program', based on current approaches to program evaluation and organizational learning. Since the program was launched in 2004, the need for an evaluation model able to track its implementation progress, to measure the degree of standards compliance, and to gauge its potential economic, social and environmental impacts has become evident. Within a vision of safe and environmentally responsible operations of marine terminals, this evaluation model was jointly designed by TRANSPETRO and PUC-Rio to promote continuous improvement and learning in operational practices and in the standardization process itself. TRANSPETRO believes that standardization supports its services and management innovation capability by creating objective and internationally recognized parameters, targets and metrology for its business activities. The conceptual model and application guidelines for this important tool are presented in this paper, as well as the next steps towards its implementation. (author)

  9. DOE [Department of Energy]-Nuclear Energy Standards Program annual assessment, FY 1990

    International Nuclear Information System (INIS)

    Williams, D.L. Jr.

    1990-11-01

    To meet the objectives of the programs funded by the Department of Energy (DOE)-Nuclear Energy (NE) Technology Support Programs, the Performance Assurance Project Office (PAPO) administers a nuclear standards program and related activities and fosters the development and application of standards. This standards program is carried out in accordance with the principles in DOE Order 1300.2, Department of Energy Standards Program, December 18, 1980, and DOE memorandum, Implementation of DOE Orders on Quality Assurance, Standards, and Unusual Occurrence Reporting for Nuclear Energy Programs, March 3, 1982, and with guidance from the DOE-NE Technology Support Programs. The purposes of this effort, as set forth in three subtasks, are to (1) manage the NE Standards Program, (2) manage the development and maintenance of NE standards, and (3) operate an NE Standards Information Program. This report assesses PAPO activities in terms of these objectives. 1 tab. (JF)

  10. Model Manajemen Laba Akrual dan Riil Berbasis Implementasi International Financial Reporting Standards

    Directory of Open Access Journals (Sweden)

    Nurmala Ahmar

    2016-03-01

    Full Text Available The aim of this study was to investigate the impact of International Financial Reporting Standards (IFRS) implementation on accrual earnings management and real earnings management. Adoption of accounting standards affects how items are assessed, measured, and presented. The sample comprises manufacturing companies listed on the Indonesia Stock Exchange. Accrual earnings management was measured with five approaches, and real earnings management with three. The results showed differences in real earnings management under the discretionary-cost and production-cost approaches. Three of the five accrual earnings management methods (Modified Jones, Piecewise Linear, and Kothari) showed differences between the periods before and after IFRS adoption, while the Stubben model did not. The results of this study are expected to contribute to the development of policies related to IFRS adoption, particularly concerning accrual and real earnings management.
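
    The Modified Jones model mentioned above estimates "normal" (non-discretionary) accruals from lagged assets, the change in revenues adjusted for receivables, and gross property, plant and equipment; discretionary accruals are the residual. A minimal sketch, with illustrative coefficients that are not estimates from the paper:

    ```python
    def normal_accruals(alpha, beta1, beta2, assets_lag, d_rev, d_rec, ppe):
        """Fitted 'normal' accruals (scaled by lagged assets) under the
        Modified Jones model:
          NA = alpha*(1/A) + beta1*((dREV - dREC)/A) + beta2*(PPE/A)
        Coefficients are normally estimated by industry-year regression;
        the values passed in here are illustrative only."""
        return (alpha / assets_lag
                + beta1 * (d_rev - d_rec) / assets_lag
                + beta2 * ppe / assets_lag)

    def discretionary_accruals(total_accruals, assets_lag, na):
        """Discretionary accruals: scaled total accruals minus fitted NA."""
        return total_accruals / assets_lag - na

    # Hypothetical firm-year: lagged assets 100, revenue change 20,
    # receivables change 5, PPE 50, total accruals 2.
    na = normal_accruals(1.0, 0.1, -0.05, 100.0, 20.0, 5.0, 50.0)
    da = discretionary_accruals(2.0, 100.0, na)
    ```

    In a study like the one above, |da| (or signed da) per firm-year is the earnings-management proxy compared before and after IFRS adoption.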

  11. Fitting Simpson's neutrino into the standard model

    International Nuclear Information System (INIS)

    Valle, J.W.F.

    1985-01-01

    I show how to accommodate the 17 keV state recently reported by Simpson as one of the neutrinos of the standard model. Experimental constraints can only be satisfied if the muon and tau neutrinos combine, to a very good approximation, to form a Dirac neutrino of 17 keV, leaving a light ν_e. Neutrino oscillations will provide the most stringent test of the model. The cosmological bounds are also satisfied in a natural way in models with Goldstone bosons. Explicit examples are given in the framework of majoron-type models. Constraints on the lepton symmetry breaking scale which follow from astrophysics, cosmology and laboratory experiments are discussed. (orig.)

  12. Army Model and Simulation Standards Report FY98

    National Research Council Canada - National Science Library

    1997-01-01

    ...) standards efforts as work progresses towards the objective Army M&S environment. This report specifically documents projects approved for funding through the Army Model and Improvement Program (AMIP...

  13. Handbook on Life Cycle Assessment. Operational Guide to the ISO Standards

    Energy Technology Data Exchange (ETDEWEB)

    Guinee, J.B. [Centre of Environmental Studies, Leiden University, Leiden (Netherlands)

    2002-04-01

    In 1992 the Centre of Environmental Science (CML) at Leiden University, The Netherlands, published a Guide on Environmental Life Cycle Assessment (LCA) methodology, setting the standard for a long time. Since then LCA methodology has progressed enormously and the International Organization for Standardization (ISO) has published a series of Standards on LCA. These developments have now been incorporated into a new Handbook on LCA authored by CML in cooperation with a number of other important institutes in the area of LCA. The general aim of this Handbook on LCA is to provide a stepwise 'cookbook' with operational guidelines for conducting an LCA study, justified by a scientific background document and based on the ISO Standards for LCA. The different ISO elements and requirements are made operational to the 'best available practice' for each step. CML is strongly involved in the development of a standard methodology to determine environmental impacts of products, i.e., LCA. This is done within international fora such as the Society for Environmental Toxicology and Chemistry (SETAC), the International Organization for Standardization (ISO), and the United Nations Environmental Programme (UNEP).

  14. Accuracy of virtual models in the assessment of maxillary defects

    International Nuclear Information System (INIS)

    Kamburoglu, Kivanc; Kursun, Sebnem; Kilic, Cenk; Eozen, Tuncer

    2015-01-01

    This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields-of-view (FOVs) and voxel sizes: 1) 60 X 60 mm FOV, 0.125 mm³ (FOV60); 2) 80 X 80 mm FOV, 0.160 mm³ (FOV80); and 3) 100 X 100 mm FOV, 0.250 mm³ (FOV100). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicone impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning the silicone models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance level was set to p=0.05. A comparison of the results obtained by the observers and methods revealed p values smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers, along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements than the CBCT measurements. In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.

  15. Accuracy of virtual models in the assessment of maxillary defects

    Energy Technology Data Exchange (ETDEWEB)

    Kamburoglu, Kivanc [Dept. of Dentomaxillofacial Radiology, Faculty of Dentistry, Ankara University, Ankara (Turkey); Kursun, Sebnem [Division of Dentomaxillofacial Radiology, Ministry of Health, Oral and Dental Health Center, Bolu (Turkey); Kilic, Cenk; Eozen, Tuncer [Gulhane Military Medical Academy, Ankara (Turkey)

    2015-03-15

    This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields-of-view (FOVs) and voxel sizes: 1) 60 X 60 mm FOV, 0.125 mm{sup 3} (FOV{sub 60}); 2) 80 X 80 mm FOV, 0.160 mm{sup 3} (FOV{sub 80}); and 3) 100 X 100 mm FOV, 0.250 mm{sup 3} (FOV{sub 100}). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicone impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning the silicone models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance level was set to p=0.05. A comparison of the results obtained by the observers and methods revealed p values smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers, along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements than the CBCT measurements. In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.
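
    The accuracy comparison described in the two records above comes down to relative error against the gold standard impression volumes. A minimal sketch (helper names are mine, not the study's):

    ```python
    def percent_deviation(measured, gold):
        """Relative volume error (%) of one measurement against the
        gold standard impression volume."""
        return 100.0 * (measured - gold) / gold

    def mean_absolute_deviation(measurements, gold):
        """Mean absolute percent deviation across repeated measurements
        (e.g., several observers or FOV settings) of one defect."""
        devs = [abs(percent_deviation(m, gold)) for m in measurements]
        return sum(devs) / len(devs)
    ```

    A method whose mean absolute deviation is smaller is "closer to the gold standard" in the sense used by the abstract.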

  16. Making Validated Educational Models Central in Preschool Standards.

    Science.gov (United States)

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  17. A standard library for modeling satellite orbits on a microcomputer

    Science.gov (United States)

    Beutel, Kenneth L.

    1988-03-01

    Introductory students of astrodynamics and the space environment are required to have a fundamental understanding of the kinematic behavior of satellite orbits. This thesis develops a standard library that contains the basic formulas for modeling earth-orbiting satellites. This library is used as a basis for implementing a satellite motion simulator that can be used to demonstrate orbital phenomena in the classroom. The equations for orbital elements, coordinate systems, and analytic formulas are surveyed and consolidated into a standard method for modeling earth-orbiting satellites. The standard library is written in the C programming language and is designed to be highly portable across a variety of computer environments. The simulation draws heavily on the standards established by the library to produce a graphics-based orbit simulation program written for the Apple Macintosh computer. The simulation demonstrates the utility of the standard library functions but, because of its extensive use of the Macintosh user interface, is not portable to other operating systems.
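
    A library of this kind typically standardizes routines such as solving Kepler's equation and converting orbital elements to a position. A minimal sketch of what such routines might look like (the thesis library is in C; the function names here are illustrative, not taken from it):

    ```python
    import math

    def eccentric_anomaly(mean_anomaly, e, tol=1e-12):
        """Solve Kepler's equation M = E - e*sin(E) for the eccentric
        anomaly E using Newton's method."""
        E = mean_anomaly if e < 0.8 else math.pi  # common starting guess
        for _ in range(50):
            dE = (E - e * math.sin(E) - mean_anomaly) / (1.0 - e * math.cos(E))
            E -= dE
            if abs(dE) < tol:
                break
        return E

    def perifocal_position(a, e, mean_anomaly):
        """In-plane (x, y) position for semi-major axis a and
        eccentricity e, via the eccentric anomaly."""
        E = eccentric_anomaly(mean_anomaly, e)
        x = a * (math.cos(E) - e)
        y = a * math.sqrt(1.0 - e * e) * math.sin(E)
        return x, y
    ```

    At mean anomaly zero (perigee) the position reduces to (a(1-e), 0), which makes a convenient sanity check for a classroom simulator.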

  18. Beyond The Standard Model Higgs Physics with Photons with the CMS Detector

    CERN Document Server

    Teixeira de Lima, Rafael

    The experimental discovery of the Higgs boson is one of the latest successes of the Standard Model of particle physics. Although all measurements have confirmed that this newly discovered particle is the Higgs boson predicted by the Standard Model, with no deviations to suggest otherwise, the Higgs boson can guide us to new models which modify the electroweak symmetry breaking mechanism or predict new states that couple to the Higgs. Therefore, it's paramount to directly look for modifications of our current model with the help of the recently discovered particle. In this thesis, two analyses involving beyond the Standard Model physics tied to the Higgs sector will be explored. First, looking at exotic Higgs decays, an analysis searching for the final state with photons and missing transverse energy will be presented. Then, the search for Higgs pair production, both resonantly and non-resonantly (a process predicted by the Standard Model, albeit at very low rates), in the final state with two bottom quark je...

  19. Accounting for correlated observations in an age-based state-space stock assessment model

    DEFF Research Database (Denmark)

    Berg, Casper Willestofte; Nielsen, Anders

    2016-01-01

    Fish stock assessment models often rely on size- or age-specific observations that are assumed to be statistically independent of each other. In reality, these observations are not raw observations, but rather they are estimates from a catch-standardization model or similar summary statistics base...... the independence assumption is rejected. Less fluctuating estimates of the fishing mortality are obtained due to a reduced process error. The improved model does not suffer from correlated residuals, unlike the independent model, and the variance of forecasts is decreased....
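
    One common way to relax the independence assumption described above is to give the age-specific observation errors an AR(1) correlation structure, so that adjacent ages are more strongly correlated than distant ones. A minimal sketch of building such an observation covariance matrix (this parameterization is a common choice, not necessarily the exact one in the paper):

    ```python
    def ar1_covariance(sds, rho):
        """Observation covariance with AR(1) correlation across ages:
        Cov[i][j] = sd_i * sd_j * rho**|i - j|.
        Setting rho = 0 recovers the usual independence assumption
        (a diagonal covariance matrix)."""
        n = len(sds)
        return [[sds[i] * sds[j] * rho ** abs(i - j) for j in range(n)]
                for i in range(n)]
    ```

    In a state-space assessment this matrix replaces the diagonal covariance in the observation likelihood; rho is then estimated alongside the other parameters.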

  20. Framework for Assessing the ICT Competency in Teachers up to the Requirements of "Teacher" Occupational Standard

    Science.gov (United States)

    Avdeeva, Svetlana; Zaichkina, Olga; Nikulicheva, Nataliya; Khapaeva, Svetlana

    2016-01-01

    The paper deals with problems of working out a test framework for the assessment of teachers' ICT competency in line with the requirements of "Teacher" occupational standard. The authors have analyzed the known approaches to assessing teachers' ICT competency--ISTE Standards and UNESCO ICT CFT and have suggested their own approach to…

  1. Momasi Model in Need Assessment of Faculty Members of Alborz University

    Directory of Open Access Journals (Sweden)

    S. Esmaelzadeh

    2013-02-01

    Full Text Available Background: The first step in developing human resources to improve the performance of universities is to identify accurate educational needs. Models may draw on a number of theories to help understand a particular problem in a certain setting or context. The Momasi model integrates the existing models in the field of educational needs assessment and offers sufficiently comprehensive data collection. The aim of this study was to apply the Momasi model to the needs assessment of faculty members across seven duty areas. Methods: This cross-sectional study applied the Momasi model to 34 faculty members of Alborz University. Results: The areas of educational need were prioritized as follows: personal development, research, administrative and executive activities, education, health services and health promotion, and specialized activities outside the university. The highest mean and standard deviation belonged to the research area. The first priority in the research area was publications in English; in personal development, familiarity with SPSS software; and in education, nurturing creativity. Conclusion: Based on the assessment results, the research area has the highest priority and frequency in this needs assessment. It is therefore recommended that data gathered in the research area be given first priority in empowering faculty members of Alborz University.

  2. Higgs bosons in the standard model, the MSSM and beyond

    Indian Academy of Sciences (India)

    Abstract. I summarize the basic theory and selected phenomenology for the Higgs boson(s) of the standard model, the minimal supersymmetric model and some extensions thereof, including the next-to-minimal supersymmetric model.

  3. The development and implementation of a decision-making capacity assessment model.

    Science.gov (United States)

    Parmar, Jasneet; Brémault-Phillips, Suzette; Charles, Lesley

    2015-03-01

    Decision-making capacity assessment (DMCA) is an issue of increasing importance for older adults. Current challenges need to be explored, and potential processes and strategies considered in order to address issues of DMCA in a more coordinated manner. An iterative process was used to address issues related to DMCA. This began with recognition of challenges associated with capacity assessments (CAs) by staff at Covenant Health (CH). Review of the literature, as well as discussions with and a survey of staff at three CH sites, resulted in determination of issues related to DMCA. Development of a DMCA Model and demonstration of its feasibility followed. A process was proposed with front-end screening/problem-solving, a well-defined standard assessment, and definition of team member roles. A Capacity Assessment Care Map was formulated based on the process. Documentation was developed consisting of a Capacity Assessment Process Worksheet, Capacity Interview Worksheet, and a brochure. Interactive workshops were delivered to familiarize staff with the DMCA Model. A successful demonstration project led to implementation across all sites in the Capital Health region, and eventual provincial endorsement. Concerns identified in the survey and in the literature regarding CA were addressed through the holistic interdisciplinary approach offered by the DMCA Model.

  4. Introduction to gauge theories and the Standard Model

    CERN Document Server

    de Wit, Bernard

    1995-01-01

    The conceptual basis of gauge theories is introduced to enable the construction of generic models. Spontaneous symmetry breaking is discussed and its relevance for the renormalization of theories with massive vector fields is explained. Subsequently the standard model is described. When time permits we will address more practical questions that arise in the evaluation of quantum corrections.

  5. Computable general equilibrium models for sustainability impact assessment: Status quo and prospects

    International Nuclear Information System (INIS)

    Boehringer, Christoph; Loeschel, Andreas

    2006-01-01

    Sustainability Impact Assessment (SIA) of economic, environmental, and social effects triggered by governmental policies has become a central requirement for policy design. The three dimensions of SIA are inherently intertwined and subject to trade-offs. Quantification of trade-offs for policy decision support requires numerical models in order to assess systematically the interference of complex interacting forces that affect economic performance, environmental quality, and social conditions. This paper investigates the use of computable general equilibrium (CGE) models for measuring the impacts of policy interference on policy-relevant economic, environmental, and social (institutional) indicators. We find that operational CGE models used for energy-economy-environment (E3) analyses have a good coverage of central economic indicators. Environmental indicators such as energy-related emissions with direct links to economic activities are widely covered, whereas indicators with complex natural science background such as water stress or biodiversity loss are hardly represented. Social indicators stand out for very weak coverage, mainly because they are vaguely defined or incommensurable. Our analysis identifies prospects for future modeling in the field of integrated assessment that link standard E3-CGE-models to theme-specific complementary models with environmental and social focus. (author)

  6. Assessment of hospital performance with a case-mix standardized mortality model using an existing administrative database in Japan.

    Science.gov (United States)

    Miyata, Hiroaki; Hashimoto, Hideki; Horiguchi, Hiromasa; Fushimi, Kiyohide; Matsuda, Shinya

    2010-05-19

    Few studies have examined whether risk adjustment is evenly applicable to hospitals with various characteristics and case-mix. In this study, we applied a generic prediction model to nationwide discharge data from hospitals with various characteristics. We used standardized data of 1,878,767 discharged patients provided by 469 hospitals from July 1 to October 31, 2006. We generated and validated a case-mix in-hospital mortality prediction model using 50/50 split-sample validation. We classified hospitals into two groups based on c-index value: hospitals with c-index ≥ 0.8 were classified as the higher c-index group, and hospitals with c-index < 0.8 as the lower c-index group. A significantly higher proportion of hospitals in the lower c-index group were specialized hospitals and hospitals with convalescent wards. The model fits well to a group of hospitals with a wide variety of acute care events, though model fit is less satisfactory for specialized hospitals and those with convalescent wards. Further sophistication of the generic prediction model would be recommended to obtain optimal indices for region-specific conditions.
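
    For a binary outcome such as in-hospital death, the c-index used above to group hospitals is the probability that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen survivor, with ties counted one half. A minimal sketch:

    ```python
    def c_index(risks, outcomes):
        """Concordance statistic (c-index) for binary outcomes:
        the fraction of (event, non-event) pairs in which the event
        case received the higher predicted risk; ties count 0.5.
        For binary outcomes this equals the ROC AUC."""
        pos = [r for r, y in zip(risks, outcomes) if y == 1]
        neg = [r for r, y in zip(risks, outcomes) if y == 0]
        if not pos or not neg:
            raise ValueError("need both outcome classes")
        concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return concordant / (len(pos) * len(neg))
    ```

    Computing this per hospital on held-out (split-sample) predictions reproduces the kind of grouping the study describes: hospitals at or above 0.8 versus below.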

  7. The role of Health Impact Assessment in the setting of air quality standards: An Australian perspective

    Energy Technology Data Exchange (ETDEWEB)

    Spickett, Jeffery, E-mail: J.Spickett@curtin.edu.au [WHO Collaborating Centre for Environmental Health Impact Assessment (Australia); Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia (Australia); Katscherian, Dianne [WHO Collaborating Centre for Environmental Health Impact Assessment (Australia); Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia (Australia); Harris, Patrick [CHETRE — UNSW Research Centre for Primary Health Care and Equity, University of New South Wales (Australia)

    2013-11-15

    The approaches used for setting or reviewing air quality standards vary from country to country. The purpose of this research was to consider the potential to improve decision-making through integration of HIA into the processes to review and set air quality standards used in Australia. To assess the value of HIA in this policy process, its strengths and weaknesses were evaluated aligned with review of international processes for setting air quality standards. Air quality standard setting programmes elsewhere have either used HIA or have amalgamated and incorporated factors normally found within HIA frameworks. They clearly demonstrate the value of a formalised HIA process for setting air quality standards in Australia. The following elements should be taken into consideration when using HIA in standard setting. (a) The adequacy of a mainly technical approach in current standard setting procedures to consider social determinants of health. (b) The importance of risk assessment criteria and information within the HIA process. The assessment of risk should consider equity, the distribution of variations in air quality in different locations and the potential impacts on health. (c) The uncertainties in extrapolating evidence from one population to another or to subpopulations, especially the more vulnerable, due to differing environmental factors and population variables. (d) The significance of communication with all potential stakeholders on issues associated with the management of air quality. In Australia there is also an opportunity for HIA to be used in conjunction with the NEPM to develop local air quality standard measures. The outcomes of this research indicated that the use of HIA for air quality standard setting at the national and local levels would prove advantageous. -- Highlights: • Health Impact Assessment framework has been applied to a policy development process. • HIA process was evaluated for application in air quality standard setting.

  8. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  9. How TK-TD and population models for aquatic macrophytes could support the risk assessment for plant protection products.

    Science.gov (United States)

    Hommen, Udo; Schmitt, Walter; Heine, Simon; Brock, Theo Cm; Duquesne, Sabine; Manson, Phil; Meregalli, Giovanna; Ochoa-Acuña, Hugo; van Vliet, Peter; Arts, Gertie

    2016-01-01

    This case study of the Society of Environmental Toxicology and Chemistry (SETAC) workshop MODELINK demonstrates the potential use of mechanistic effects models for macrophytes to extrapolate from effects of a plant protection product observed in laboratory tests to effects resulting from dynamic exposure on macrophyte populations in edge-of-field water bodies. A standard European Union (EU) risk assessment for an example herbicide based on macrophyte laboratory tests indicated risks for several exposure scenarios. Three of these scenarios are further analyzed using effect models for 2 aquatic macrophytes, the free-floating standard test species Lemna sp., and the sediment-rooted submerged additional standard test species Myriophyllum spicatum. Both models include a toxicokinetic (TK) part, describing uptake and elimination of the toxicant, a toxicodynamic (TD) part, describing the internal concentration-response function for growth inhibition, and a description of biomass growth as a function of environmental factors to allow simulating seasonal dynamics. The TK-TD models are calibrated and tested using laboratory tests, whereas the growth models were assumed to be fit for purpose based on comparisons of predictions with typical growth patterns observed in the field. For the risk assessment, biomass dynamics are predicted for the control situation and for several exposure levels. Based on specific protection goals for macrophytes, preliminary example decision criteria are suggested for evaluating the model outputs. The models refined the risk indicated by lower tier testing for 2 exposure scenarios, while confirming the risk associated for the third. Uncertainties related to the experimental and the modeling approaches and their application in the risk assessment are discussed. Based on this case study and the assumption that the models prove suitable for risk assessment once fully evaluated, we recommend that 1) ecological scenarios be developed that are also
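
    The TK part of the models described above tracks an internal concentration driven by uptake from, and elimination to, the external (exposure) concentration. A minimal forward-Euler sketch of a one-compartment TK model (parameter names illustrative, not the workshop's):

    ```python
    def internal_concentration(c_ext, k_in, k_out, dt=0.01):
        """One-compartment toxicokinetic model,
            dCi/dt = k_in * Ce(t) - k_out * Ci,
        integrated with forward Euler over a time series of external
        concentrations c_ext sampled every dt."""
        ci = 0.0
        out = []
        for ce in c_ext:
            ci += dt * (k_in * ce - k_out * ci)
            out.append(ci)
        return out
    ```

    Under constant exposure the internal concentration approaches the steady state (k_in/k_out) * Ce; a TD module would then map this internal concentration to growth inhibition via a concentration-response function.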

  10. Building up the standard gauge model of high energy physics. 11

    International Nuclear Information System (INIS)

    Rajasekaran, G.

    1989-01-01

    This chapter carefully builds up, step by step, the standard gauge model of particle physics based on the group SU(3)_c x SU(2) x U(1). Spontaneous symmetry breaking via the Nambu-Goldstone mode, and then via the Higgs mode for gauge theories, is presented via examples, first for the Abelian U(1) case and then for the non-Abelian SU(2) case. The physically interesting SU(2) x U(1) model is then taken up. The emergence of massive vector bosons is demonstrated. After this preparation, the 'standard model' of the late 60's prior to the gauge theory revolution, based on the V-A current-current weak interactions, minimal electromagnetism, and an unspecified strong interaction, all in quark-lepton language, is set up. It is then compared to the standard gauge model of SU(3)_c x SU(2) x U(1). The compelling reasons for QCD as the gauge theory of strong interactions are spelt out. An introduction to renormalization group methods as the main calculational tool for QCD, asymptotic freedom, infrared problems, and physically motivated reasons for going beyond the standard model are presented. (author). 6 refs.; 19 figs.; 2 tabs

  11. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Community Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as a part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR model and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains: the methods presented here represent an important reference for achieving interoperability between standard and extended models.

  12. [The application of two occupation health risk assessment models in a wooden furniture manufacturing industry].

    Science.gov (United States)

    Wang, A H; Leng, P B; Bian, G L; Li, X H; Mao, G C; Zhang, M B

    2016-10-20

    Objective: To explore the applicability of two different occupational health risk assessment models in the wooden furniture manufacturing industry. Methods: The American EPA inhalation risk model and the ICMM occupational health risk assessment model were each used to assess occupational health risk in a small wooden furniture enterprise. Results: Protective measures and equipment against occupational disease in the plant were poor. The concentration of wood dust in the air of two workshops exceeded the occupational exposure limit (OEL), with C-TWA of 8.9 mg/m³ and 3.6 mg/m³, respectively. According to the EPA model, workers exposed to benzene in this plant had a high risk (9.7×10⁻⁶ to 34.3×10⁻⁶) of leukemia, and those exposed to formaldehyde had a high risk (11.4×10⁻⁶) of squamous cell carcinoma. The two ICMM tools, the standards-based matrix and the calculated risk rating, gave inconsistent results: workers exposed to wood dust had a very high risk of rhinocarcinoma under the calculated risk rating but a high risk under the standards-based matrix. For workers exposed to noise, the risk of noise-induced deafness was unacceptable under one tool and medium under the other. Conclusion: Both the EPA and ICMM models can appropriately predict and assess occupational health risk in wooden furniture manufacturing; the ICMM model, with its relatively simple operation, easily obtained evaluation parameters, and comprehensive assessment of occupational disease hazard factors, is more suitable for wooden furniture enterprises.
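
    The EPA inhalation risk numbers quoted above come from multiplying a chronic exposure concentration by a chemical-specific inhalation unit risk (IUR). A minimal sketch; the exposure values and the benzene IUR used here are illustrative assumptions, not figures from the paper:

    ```python
    def exposure_concentration(ca, et_hours, ef_days, ed_years, at_years=70):
        """EPA-style chronic exposure concentration (same units as ca):
        EC = CA * ET * EF * ED / AT, with all time terms in hours so
        the ratio is dimensionless."""
        at_hours = at_years * 365 * 24
        return ca * et_hours * ef_days * ed_years / at_hours

    def inhalation_cancer_risk(ec, iur):
        """Lifetime excess cancer risk = inhalation unit risk * EC."""
        return iur * ec

    # Illustrative scenario: 50 ug/m3 benzene, 8 h/day, 250 days/yr,
    # 30 working years, 70-year averaging time; IUR per ug/m3 assumed.
    ec = exposure_concentration(ca=50.0, et_hours=8, ef_days=250, ed_years=30)
    risk = inhalation_cancer_risk(ec, iur=7.8e-6)
    ```

    Risks above roughly 1×10⁻⁶ are commonly treated as exceeding the acceptable level, which is how point estimates like 9.7×10⁻⁶ get classified as "high risk".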

  13. Eulerian Circles (Venn Diagrams) as model for modern economy education on the basis of Russian professional standards

    Science.gov (United States)

    Sharonov, M. A.; Sharonova, O. V.; Sharonova, V. P.

    2018-03-01

    The article attempts to build a model using Eulerian circles (Venn diagrams) to illustrate the methodological impact of the recent Federal Law 283-FZ "On the independent evaluation of qualifications" and the new Federal State Educational Standards of higher education of generation 3++ on the educational process in Russia. In modern economic conditions, the ability to correctly assess the role of professional standards, treated in effect as sets, and the degree of their intersection with the exemplary basic educational program and the Federal State Educational Standards becomes an important factor, on which will depend not only the demand for graduates in the labor market but also the possibility of the proposed program passing professional and public accreditation.
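
    The set-theoretic reading of the diagrams can be made concrete: treating a professional standard, the exemplary educational program, and the federal standard as sets of competencies, the overlaps the article discusses are plain set operations. The competency labels below are invented for illustration:

```python
# Invented competency labels; the three sets play the roles of the three
# "circles": professional standard (PS), exemplary basic educational
# program (EBEP), and Federal State Educational Standard (FSES 3++).
professional_standard = {"databases", "networks", "project_management", "economics"}
educational_program   = {"databases", "networks", "algorithms", "economics"}
federal_standard      = {"databases", "algorithms", "economics", "soft_skills"}

core = professional_standard & educational_program & federal_standard
only_market = professional_standard - educational_program  # not taught in the program
coverage = len(professional_standard & educational_program) / len(professional_standard)
print(sorted(core), sorted(only_market), coverage)
# -> ['databases', 'economics'] ['project_management'] 0.75
```

    The `coverage` ratio is one simple way to quantify how much of a professional standard a program satisfies, which is what accreditation bodies effectively examine.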

  14. Preliminary assessment of PWR Steam Generator modelling in RELAP5/MOD3

    International Nuclear Information System (INIS)

    Preece, R.J.; Putney, J.M.

    1993-07-01

    A preliminary assessment of Steam Generator (SG) modelling in the PWR thermal-hydraulic code RELAP5/MOD3 is presented. The study is based on calculations against a series of steady-state commissioning tests carried out on the Wolf Creek PWR over a range of load conditions. Data from the tests are used to assess the modelling of primary to secondary side heat transfer and, in particular, to examine the effect of reverting to the standard form of the Chen heat transfer correlation in place of the modified form applied in RELAP5/MOD2. Comparisons between the two versions of the code are also used to show how the new interphase drag model in RELAP5/MOD3 affects the calculation of SG liquid inventory and the void fraction profile in the riser.

  15. Prototyping an online wetland ecosystem services model using open model sharing standards

    Science.gov (United States)

    Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.

    2011-01-01

    Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from variant sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with Open Geospatial Consortium (OGC) Web Processing Services (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.

  16. Accidentally safe extensions of the Standard Model

    CERN Document Server

    Di Luzio, Luca; Kamenik, Jernej F.; Nardecchia, Marco

    2015-01-01

    We discuss a class of weak-scale extensions of the Standard Model which is completely invisible to low-energy indirect probes. The typical signature of this scenario is the existence of new charged and/or colored states which are stable on the scale of high-energy particle detectors.

  17. Modeling of storage tank settlement based on the United States standards

    Directory of Open Access Journals (Sweden)

    Gruchenkova Alesya

    2018-01-01

    Up to 60% of storage tanks in operation exhibit uneven settlement of the outer bottom contour, which often leads to accidents. Russian and foreign regulatory documents impose different requirements on the strain limits of metal structures, so there is a growing need to harmonize them. The aim of this study is to theoretically justify and assess the possibility of applying the U.S. standards for specifying the allowable settlement of storage tanks used in Russia. The allowable uneven settlement was calculated for a vertical steel tank (VST-20000) according to API 653, a standard of the American Petroleum Institute, and the calculated allowable settlement levels were compared with those established by Russian standards. The uneven settlement development process of a storage tank was modeled with the finite element method, and stress-strain state parameters of the tank structures were obtained at the critical levels established in API 653. Relationships between the maximum equivalent stresses in the VST metal structures and the vertical settlement component were determined for settlement zones 6 to 72 m in length. When the uneven settlement zone is 6 m long, the limit state is reached at 30 mm of vertical settlement, with stresses in the wall exceeding 330 MPa; when the zone is 36 m long, stresses reach the yield point only at 100 mm of vertical settlement.

  18. Standard Model Effective Potential from Trace Anomalies

    Directory of Open Access Journals (Sweden)

    Renata Jora

    2018-01-01

    By analogy with the low energy QCD effective linear sigma model, we construct a standard model effective potential based entirely on the requirement that the tree level and quantum level trace anomalies must be satisfied. We discuss a particular realization of this potential in connection with the Higgs boson mass and Higgs boson effective couplings to two photons and two gluons. We find that this kind of potential may describe well the known phenomenology of the Higgs boson.

  19. Implementation of Electrical Simulation Model for IEC Standard Type-3A Generator

    DEFF Research Database (Denmark)

    Subramanian, Chandrasekaran; Casadei, Domenico; Tani, Angelo

    2013-01-01

    This paper describes the implementation of an electrical simulation model for the IEC 61400-27-1 standard Type-3A generator. A general overview of the different wind electric generator (WEG) types is given, with the main focus on the Type-3A WEG standard model, namely a model for a variable speed wind turbine…

  20. Standardized mean differences cause funnel plot distortion in publication bias assessments.

    Science.gov (United States)

    Zwetsloot, Peter-Paul; Van Der Naald, Mira; Sena, Emily S; Howells, David W; IntHout, Joanna; De Groot, Joris Ah; Chamuleau, Steven Aj; MacLeod, Malcolm R; Wever, Kimberley E

    2017-09-08

    Meta-analyses are increasingly used for synthesis of evidence from biomedical research, and often include an assessment of publication bias based on visual or analytical detection of asymmetry in funnel plots. We studied the influence of different normalisation approaches, sample size and intervention effects on funnel plot asymmetry, using empirical datasets and illustrative simulations. We found that funnel plots of the Standardized Mean Difference (SMD) plotted against the standard error (SE) are susceptible to distortion, leading to overestimation of the existence and extent of publication bias. Distortion was more severe when the primary studies had a small sample size and when an intervention effect was present. We show that using the Normalised Mean Difference measure as effect size (when possible), or plotting the SMD against a sample size-based precision estimate, are more reliable alternatives. We conclude that funnel plots using the SMD in combination with the SE are unsuitable for publication bias assessments and can lead to false-positive results.
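
    The distortion mechanism is easy to reproduce: the conventional standard error of the SMD contains the estimated effect itself, so effect size and SE are correlated even when every study measures the same true effect and nothing is suppressed. A small pure-Python simulation using Cohen's d with its usual SE formula:

```python
import math, random

random.seed(1)

def smd_and_se(x, y):
    """Cohen's d and its conventional SE; the d**2 term ties the SE to the
    estimated effect, which is what distorts SMD-vs-SE funnel plots."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    d = (mx - my) / sp
    se = math.sqrt((nx + ny) / (nx * ny) + d ** 2 / (2 * (nx + ny)))
    return d, se

# 2000 small studies, identical true effect (1 SD), no publication bias at all
pairs = [smd_and_se([random.gauss(1, 1) for _ in range(10)],
                    [random.gauss(0, 1) for _ in range(10)])
         for _ in range(2000)]

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((p - ma) * (q - mb) for p, q in zip(a, b))
    va = sum((p - ma) ** 2 for p in a)
    vb = sum((q - mb) ** 2 for q in b)
    return cov / math.sqrt(va * vb)

r = pearson([abs(d) for d, _ in pairs], [se for _, se in pairs])
print(r > 0.9)  # strong |SMD|-SE correlation despite zero publication bias
```

    Plotting d against a purely sample-size-based precision estimate (e.g. sqrt of the total n), as the authors recommend, removes this artefactual correlation.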

  1. Implementation of IEC standard models for power system stability studies

    Energy Technology Data Exchange (ETDEWEB)

    Margaris, Ioannis D.; Hansen, Anca D.; Soerensen, Poul [Technical Univ. of Denmark, Roskilde (Denmark). Dept. of Wind Energy; Bech, John; Andresen, Bjoern [Siemens Wind Power A/S, Brande (Denmark)

    2012-07-01

    This paper presents the implementation of the generic wind turbine generator (WTG) electrical simulation models proposed in the IEC 61400-27 standard which is currently in preparation. A general overview of the different WTG types is given while the main focus is on Type 4B WTG standard model, namely a model for a variable speed wind turbine with full scale power converter WTG including a 2-mass mechanical model. The generic models for fixed and variable speed WTGs models are suitable for fundamental frequency positive sequence response simulations during short events in the power system such as voltage dips. The general configuration of the models is presented and discussed; model implementation in the simulation software platform DIgSILENT PowerFactory is presented in order to illustrate the range of applicability of the generic models under discussion. A typical voltage dip is simulated and results from the basic electrical variables of the WTG are presented and discussed. (orig.)

  2. Standard Model Higgs Boson with the L3 Experiment at LEP

    CERN Document Server

    Achard, P.; Aguilar-Benitez, M.; Alcaraz, J.; Alemanni, G.; Allaby, J.; Aloisio, A.; Alviggi, M.G.; Anderhub, H.; Andreev, Valery P.; Anselmo, F.; Arefev, A.; Azemoon, T.; Aziz, T.; Baarmand, M.; Bagnaia, P.; Bajo, A.; Baksay, G.; Baksay, L.; Baldew, S.V.; Banerjee, S.; Banerjee, Sw.; Barczyk, A.; Barillere, R.; Bartalini, P.; Basile, M.; Batalova, N.; Battiston, R.; Bay, A.; Becattini, F.; Becker, U.; Behner, F.; Bellucci, L.; Berbeco, R.; Berdugo, J.; Berges, P.; Bertucci, B.; Betev, B.L.; Biasini, M.; Biglietti, M.; Biland, A.; Blaising, J.J.; Blyth, S.C.; Bobbink, G.J.; Bohm, A.; Boldizsar, L.; Borgia, B.; Bourilkov, D.; Bourquin, M.; Braccini, S.; Branson, J.G.; Brochu, F.; Buijs, A.; Burger, J.D.; Burger, W.J.; Cai, X.D.; Capell, M.; Cara Romeo, G.; Carlino, G.; Cartacci, A.; Casaus, J.; Cavallari, F.; Cavallo, N.; Cecchi, C.; Cerrada, M.; Chamizo, M.; Chang, Y.H.; Chemarin, M.; Chen, A.; Chen, G.; Chen, G.M.; Chen, H.F.; Chen, H.S.; Chiefari, G.; Cifarelli, L.; Cindolo, F.; Clare, I.; Clare, R.; Coignet, G.; Colino, N.; Costantini, S.; De la Cruz, B.; Cucciarelli, S.; Dai, T.S.; Van Dalen, J.A.; De Asmundis, R.; Deglon, P.; Debreczeni, J.; Degre, A.; Deiters, K.; Della Volpe, D.; Delmeire, E.; Denes, P.; DeNotaristefani, F.; De Salvo, A.; Diemoz, M.; Dierckxsens, M.; Van Dierendonck, D.; Dionisi, C.; Dittmar, M.; Doria, A.; Dova, M.T.; Duchesneau, D.; Duinker, P.; Echenard, B.; Eline, A.; El Mamouni, H.; Engler, A.; Eppling, F.J.; Ewers, A.; Extermann, P.; Falagan, M.A.; Falciano, S.; Favara, A.; Fay, J.; Fedin, O.; Felcini, M.; Ferguson, T.; Fesefeldt, H.; Fiandrini, E.; Field, J.H.; Filthaut, F.; Fisher, P.H.; Fisher, W.; Fisk, I.; Forconi, G.; Freudenreich, K.; Furetta, C.; Galaktionov, Iouri; Ganguli, S.N.; Garcia-Abia, Pablo; Gataullin, M.; Gentile, S.; Giagu, S.; Gong, Z.F.; Grenier, Gerald Jean; Grimm, O.; Gruenewald, M.W.; Guida, M.; Van Gulik, R.; Gupta, V.K.; Gurtu, A.; Gutay, L.J.; Haas, D.; Hatzifotiadou, D.; Hebbeker, T.; Herve, Alain; 
Hirschfelder, J.; Hofer, H.; Holzner, G.; Hou, S.R.; Hu, Y.; Jin, B.N.; Jones, Lawrence W.; de Jong, P.; Josa-Mutuberria, I.; Kafer, D.; Kaur, M.; Kienzle-Focacci, M.N.; Kim, J.K.; Kirkby, Jasper; Kittel, W.; Klimentov, A.; Konig, A.C.; Kopal, M.; Koutsenko, V.; Kraber, M.; Kraemer, R.W.; Krenz, W.; Kruger, A.; Kunin, A.; Ladron De Guevara, P.; Laktineh, I.; Landi, G.; Lebeau, M.; Lebedev, A.; Lebrun, P.; Lecomte, P.; Lecoq, P.; Le Coultre, P.; Lee, H.J.; Le Goff, J.M.; Leiste, R.; Levtchenko, P.; Li, C.; Likhoded, S.; Lin, C.H.; Lin, W.T.; Linde, F.L.; Lista, L.; Liu, Z.A.; Lohmann, W.; Longo, E.; Lu, Y.S.; Lubelsmeyer, K.; Luci, C.; Luckey, David; Luminari, L.; Lustermann, W.; Ma, W.G.; Malgeri, L.; Malinin, A.; Mana, C.; Mangeol, D.; Mans, J.; Martin, J.P.; Marzano, F.; Mazumdar, K.; McNeil, R.R.; Mele, S.; Merola, L.; Meschini, M.; Metzger, W.J.; Mihul, A.; Milcent, H.; Mirabelli, G.; Mnich, J.; Mohanty, G.B.; Muanza, G.S.; Muijs, A.J.M.; Musicar, B.; Musy, M.; Nagy, S.; Napolitano, M.; Nessi-Tedaldi, F.; Newman, H.; Niessen, T.; Nisati, A.; Kluge, Hannelies; Ofierzynski, R.; Organtini, G.; Palomares, C.; Pandoulas, D.; Paolucci, P.; Paramatti, R.; Passaleva, G.; Patricelli, S.; Paul, Thomas Cantzon; Pauluzzi, M.; Paus, C.; Pauss, F.; Pedace, M.; Pensotti, S.; Perret-Gallix, D.; Petersen, B.; Piccolo, D.; Pierella, F.; Piroue, P.A.; Pistolesi, E.; Plyaskin, V.; Pohl, M.; Pojidaev, V.; Postema, H.; Pothier, J.; Prokofev, D.O.; Prokofiev, D.; Quartieri, J.; Rahal-Callot, G.; Rahaman, M.A.; Raics, P.; Raja, N.; Ramelli, R.; Rancoita, P.G.; Ranieri, R.; Raspereza, A.; Razis, P.; Ren, D.; Rescigno, M.; Reucroft, S.; Riemann, S.; Riles, Keith; Roe, B.P.; Romero, L.; Rosca, A.; Rosier-Lees, S.; Roth, Stefan; Rosenbleck, C.; Roux, B.; Rubio, J.A.; Ruggiero, G.; Rykaczewski, H.; Sakharov, A.; Saremi, S.; Sarkar, S.; Salicio, J.; Sanchez, E.; Sanders, M.P.; Schafer, C.; Schegelsky, V.; Schmidt-Kaerst, S.; Schmitz, D.; Schopper, H.; Schotanus, D.J.; Schwering, G.; 
Sciacca, C.; Servoli, L.; Shevchenko, S.; Shivarov, N.; Shoutko, V.; Shumilov, E.; Shvorob, A.; Siedenburg, T.; Son, D.; Spillantini, P.; Steuer, M.; Stickland, D.P.; Stoyanov, B.; Straessner, A.; Sudhakar, K.; Sultanov, G.; Sun, L.Z.; Sushkov, S.; Suter, H.; Swain, J.D.; Szillasi, Z.; Tang, X.W.; Tarjan, P.; Tauscher, L.; Taylor, L.; Tellili, B.; Teyssier, D.; Timmermans, Charles; Ting, Samuel C.C.; Ting, S.M.; Tonwar, S.C.; Toth, J.; Tully, C.; Tung, K.L.; Uchida, Y.; Ulbricht, J.; Valente, E.; Van de Walle, R.T.; Veszpremi, V.; Vesztergombi, G.; Vetlitsky, I.; Vicinanza, D.; Viertel, G.; Villa, S.; Vivargent, M.; Vlachos, S.; Vodopianov, I.; Vogel, H.; Vogt, H.; Vorobev, I.; Vorobyov, A.A.; Wadhwa, M.; Wallraff, W.; Wang, M.; Wang, X.L.; Wang, Z.M.; Weber, M.; Wienemann, P.; Wilkens, H.; Wu, S.X.; Wynhoff, S.; Xia, L.; Xu, Z.Z.; Yamamoto, J.; Yang, B.Z.; Yang, C.G.; Yang, H.J.; Yang, M.; Yeh, S.C.; Zalite, A.; Zalite, Yu.; Zhang, Z.P.; Zhao, J.; Zhu, G.Y.; Zhu, R.Y.; Zhuang, H.L.; Zichichi, A.; Zilizi, G.; Zimmermann, B.; Zoller, M.

    2001-01-01

    Final results of the search for the Standard Model Higgs boson are presented for the data collected by the L3 detector at LEP at centre-of-mass energies up to about 209 GeV. These data are compared with the expectations of Standard Model processes for Higgs boson masses up to 120 GeV. A lower limit on the mass of the Standard Model Higgs boson of 112.0 GeV is set at the 95% confidence level. The most significant high mass candidate is an Hνν̄ event. It has a reconstructed Higgs mass of 115 GeV and it was recorded at √s = 206.4 GeV.

  3. Russian Language Development Assessment as a Standardized Technique for Assessing Communicative Function in Children Aged 3–9 Years

    Directory of Open Access Journals (Sweden)

    Prikhoda N.A.,

    2016-10-01

    The article describes the Russian Language Development Assessment, a standardized individual diagnostic tool for children aged 3 to 9 that assesses the following components of a child's communicative function: passive vocabulary, expressive vocabulary, knowledge of semantic constructs with logical, temporal and spatial relations, passive perception and active use of syntactic and morphological features of words in a sentence, active and passive phonological awareness, and active and passive knowledge of syntactic structures and categories. The article provides descriptions of content and diagnostic procedures for all seven subtests included in the assessment (Passive Vocabulary, Active Vocabulary, Linguistic Operators, Sentence Structure, Word Structure, Phonology, Sentence Repetition). Based on data collected in a study involving 86 first-graders at a Moscow school, the article analyzes the internal consistency and construct validity of each subtest. It concludes that, given the lack of standardized tools for assessing language and speech development in Russian, the Russian Language Development Assessment can be of much use both for diagnostic purposes and in supporting children with ASD.

  4. 78 FR 45447 - Revisions to Modeling, Data, and Analysis Reliability Standard

    Science.gov (United States)

    2013-07-29

    ...; Order No. 782] Revisions to Modeling, Data, and Analysis Reliability Standard AGENCY: Federal Energy... Analysis (MOD) Reliability Standard MOD- 028-2, submitted to the Commission for approval by the North... Organization. The Commission finds that the proposed Reliability Standard represents an improvement over the...

  5. Standardizing data exchange for clinical research protocols and case report forms: An assessment of the suitability of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM).

    Science.gov (United States)

    Huser, Vojtech; Sastry, Chandan; Breymaier, Matthew; Idriss, Asma; Cimino, James J

    2015-10-01

    Efficient communication of a clinical study protocol and case report forms during all stages of a human clinical study is important for many stakeholders. An electronic and structured study representation format that can be used throughout the whole study life-span can improve such communication and potentially lower total study costs. The most relevant standard for representing clinical study data, applicable to unregulated as well as regulated studies, is the Operational Data Model (ODM), in development since 1999 by the Clinical Data Interchange Standards Consortium (CDISC). ODM's initial objective was the exchange of case report form data, but it is increasingly utilized in other contexts. An ODM extension called the Study Design Model, introduced in 2011, provides additional protocol representation elements. Using a case study approach, we evaluated ODM's ability to capture all necessary protocol elements during a complete clinical study lifecycle in the Intramural Research Program of the National Institutes of Health. ODM offers the advantage of a single format for institutions that deal with hundreds or thousands of concurrent clinical studies and maintain a data warehouse for these studies. For each study stage, we present a list of gaps in the ODM standard and identify necessary vendor or institutional extensions that can compensate for such gaps. The current version of ODM (1.3.2) has only partial support for study protocol and study registration data, mainly because these lie outside its original development goal. ODM provides comprehensive support for representation of case report forms (both in the design stage and with patient-level data). Inclusion of the requirements of observational, non-regulated or investigator-initiated studies (outside Food and Drug Administration (FDA) regulation) could further improve future revisions of the standard. Published by Elsevier Inc.
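
    A minimal ODM-style skeleton, built with the Python standard library, illustrates the single-format idea. Element and attribute names follow ODM 1.3 as documented by CDISC, but this fragment is a simplified sketch: it omits the ODM namespace and many required attributes and references:

```python
import xml.etree.ElementTree as ET

# Simplified ODM 1.3-style skeleton: one study, one metadata version,
# one form and one item definition. A real file additionally needs the ODM
# namespace, CreationDateTime, ODMVersion, and ItemGroupRef/ItemRef links.
odm = ET.Element("ODM", FileOID="Study001.v1", FileType="Snapshot")
study = ET.SubElement(odm, "Study", OID="ST.001")
gv = ET.SubElement(study, "GlobalVariables")
ET.SubElement(gv, "StudyName").text = "Example Protocol"
mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="v1")
ET.SubElement(mdv, "FormDef", OID="F.DEMOG", Name="Demographics", Repeating="No")
ET.SubElement(mdv, "ItemDef", OID="I.AGE", Name="Age", DataType="integer")

xml_text = ET.tostring(odm, encoding="unicode")
print(xml_text.startswith("<ODM"))  # -> True
```

    Because the same document carries both form definitions (design stage) and, with ClinicalData elements, patient-level values, one format can in principle serve the whole study lifecycle, which is the property the authors evaluate.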

  6. Precision tests of the standard model at LEP

    International Nuclear Information System (INIS)

    Mele, Barbara; Universita La Sapienza, Rome

    1994-01-01

    Recent LEP results on electroweak precision measurements are reviewed. The line-shape and asymmetry analysis on the Z⁰ peak is described. The consistency of the Standard Model predictions with experimental data and the consequent limits on the top mass are then discussed. Finally, the possibility of extracting information and constraints on new theoretical models from present data is examined. (author). 20 refs., 5 tabs

  7. The development of methodological tools to assess the health sector with the resulting standardized index

    Directory of Open Access Journals (Sweden)

    Hansuvarova Evgenia Adolfovna

    2016-10-01

    The proposed methodology for assessing a resulting standardized health index in various countries of the world makes it possible to identify the countries implementing an effective management strategy in the health sector. The leading positions belong to the countries where state health policy has shown the greatest efficiency. This technique can be used not only to score the resulting standardized health index across the world, but also to assess it in a particular country.

  8. g-2 and α(M_Z²): Status of the Standard Model predictions

    International Nuclear Information System (INIS)

    Teubner, T.; Hagiwara, K.; Liao, R.; Martin, A.D.; Nomura, D.

    2012-01-01

    We review the status of the Standard Model prediction of the anomalous magnetic moment of the muon and the electromagnetic coupling at the scale M_Z. Recent progress in the evaluation of the hadronic contributions has consolidated the prediction of both quantities. For g-2, the discrepancy between the measurement from BNL and the Standard Model prediction stands at a level of more than three standard deviations.
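
    The size of such a discrepancy is simply the difference between measurement and prediction divided by their combined uncertainty. The numbers below, in units of 10⁻¹¹, are illustrative values of roughly the magnitude reported for BNL and for SM evaluations of that era, not the authors' exact figures:

```python
import math

def discrepancy_sigma(a_exp, s_exp, a_sm, s_sm):
    """Tension between a measurement and a prediction, in combined sigmas."""
    return (a_exp - a_sm) / math.sqrt(s_exp ** 2 + s_sm ** 2)

# Illustrative inputs in units of 1e-11 (approximate, for the arithmetic only):
n_sigma = discrepancy_sigma(116592089, 63, 116591828, 49)
print(round(n_sigma, 1))  # -> 3.3, i.e. "more than three standard deviations"
```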

  9. Guidelines for standard preclinical experiments in the mouse model of myasthenia gravis induced by acetylcholine receptor immunization.

    Science.gov (United States)

    Tuzun, Erdem; Berrih-Aknin, Sonia; Brenner, Talma; Kusner, Linda L; Le Panse, Rozen; Yang, Huan; Tzartos, Socrates; Christadoss, Premkumar

    2015-08-01

    Myasthenia gravis (MG) is an autoimmune disorder characterized by generalized muscle weakness due to neuromuscular junction (NMJ) dysfunction brought by acetylcholine receptor (AChR) antibodies in most cases. Although steroids and other immunosuppressants are effectively used for treatment of MG, these medications often cause severe side effects and a complete remission cannot be obtained in many cases. For pre-clinical evaluation of more effective and less toxic treatment methods for MG, the experimental autoimmune myasthenia gravis (EAMG) induced by Torpedo AChR immunization has become one of the standard animal models. Although numerous compounds have been recently proposed for MG mostly by using the active immunization EAMG model, only a few have been proven to be effective in MG patients. The variability in the experimental design, immunization methods and outcome measurements of pre-clinical EAMG studies make it difficult to interpret the published reports and assess the potential for application to MG patients. In an effort to standardize the active immunization EAMG model, we propose standard procedures for animal care conditions, sampling and randomization of mice, experimental design and outcome measures. Utilization of these standard procedures might improve the power of pre-clinical EAMG experiments and increase the chances for identifying promising novel treatment methods that can be effectively translated into clinical trials for MG. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Search for the Standard Model Higgs boson produced in the decay ...

    Indian Academy of Sciences (India)

    2012-10-06

    …√s = 7 TeV. No evidence is found for a significant deviation from Standard Model expectations anywhere in the ZZ mass range considered in this analysis. An upper limit at 95% CL is placed on the product of the cross-section and decay branching ratio for the Higgs boson decaying with Standard Model-like ...

  11. Violations of universality in a vectorlike extension of the standard model

    International Nuclear Information System (INIS)

    Montvay, I.

    1996-04-01

    Violations of universality of couplings in a vectorlike extension of the standard model with three heavy mirror fermion families are considered. The recently observed discrepancies between experiments and the standard model in the hadronic branching fractions R_b and R_c of the Z boson are explained by the mixing of fermions with their mirror fermion partners. (orig.)

  12. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human-system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of the plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and, ultimately, errors of commission. Quantitative or prescriptive models that predict an operator's situation assessment provide many benefits, such as HSI design solutions, human performance data, and human reliability estimates. Unfortunately, only a few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. We therefore proposed a new computational situation assessment model of nuclear power plant operators. The proposed model incorporates significant cognitive factors and uses a Bayesian belief network (BBN) as its architecture. Since communication between nuclear power plant operators is believed to affect their situation assessment and its result, situation awareness, we tried to verify that the proposed model represents the effects of communication on situation assessment. The model succeeded in representing the operators' behavior, and this paper shows the details.
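
    The BBN-based situation assessment can be caricatured with a single belief update: the operator's belief over plant states is multiplied by the likelihood of a newly read indication and renormalized. The states, priors, and likelihoods below are invented for illustration and are not taken from the paper:

```python
# Invented plant states and probabilities, for illustration only.
PRIOR = {"normal": 0.90, "loca": 0.05, "sgtr": 0.05}
# P(indication | state) for one HSI reading, "pressurizer pressure low":
LIKELIHOOD = {"normal": 0.02, "loca": 0.90, "sgtr": 0.40}

def update(belief, likelihood):
    """One Bayesian update step: posterior is proportional to prior x likelihood."""
    post = {s: belief[s] * likelihood[s] for s in belief}
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

belief = update(PRIOR, LIKELIHOOD)
print(max(belief, key=belief.get))  # -> loca
```

    Communication between operators fits the same machinery as an extra evidence node: a colleague's report enters as one more likelihood factor in the update, which is how a BBN can represent the communication effects the paper studies.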

  13. Searches for Physics Beyond Standard Model at LHC with ATLAS

    CERN Document Server

    Soni, N; The ATLAS collaboration

    2013-01-01

    This contribution summarises some of the recent results of searches for physics beyond the Standard Model using the pp-collision data collected at the Large Hadron Collider (LHC) with the ATLAS detector at a centre-of-mass energy of sqrt{s} = 8 TeV. The search for supersymmetry (SUSY) is carried out in a large variety of production modes, such as strong production of squarks and gluinos, weak production of sleptons and gauginos, or production of massive long-lived particles through R-parity violation. No excess above the Standard Model background expectation is observed, and exclusion limits are derived on the production of new physics. The results are interpreted as lower limits on sparticle masses in SUSY breaking scenarios. Searches for new exotic phenomena such as dark matter, large extra dimensions and black holes are also performed at ATLAS. As in the case of the SUSY searches, no new exotic phenomenon is observed, and results are presented as upper limits on event yields from non-Standard-Model processes in a model i...

  14. Reconsidering the risk assessment concept: Standardizing the impact description as a building block for vulnerability assessment

    Directory of Open Access Journals (Sweden)

    K. Hollenstein

    2005-01-01

    Risk assessments for natural hazards are becoming more widely used and accepted. Under an extended definition of risk, it becomes obvious that effective procedures for vulnerability assessment are vital to the success of the risk concept. However, there are large gaps in knowledge about vulnerability. To alleviate the situation, a conceptual extension of the scope of existing and new models is suggested. The basis of the suggested concept is a standardization of the output of hazard assessments. This is achieved by defining states of the target objects that depend on the impact and at the same time affect the object's performance characteristics. The possible state variables can be related to a limited set of impact descriptors, termed the generic impact description interface. The concept suggests that both hazard and vulnerability assessment models be developed according to the specification of this interface, thus facilitating modularized risk assessments. Potential problems with applying the concept include acceptance issues and the limited accuracy of transforming the outputs of existing models. Potential applications and simple examples for adapting existing models are briefly discussed.

  15. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin

    2016-04-06

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  16. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin; Shearer, Peter; Ampuero, Jean‐Paul; Lay, Thorne

    2016-01-01

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  17. ATLAS Standard Model Measurements Using Jet Grooming and Substructure

    CERN Document Server

    Ucchielli, Giulia; The ATLAS collaboration

    2017-01-01

    Boosted topologies allow Standard Model processes to be explored in kinematic regimes never tested before. In such challenging LHC environments, standard reconstruction techniques quickly reach their limits. Targeting hadronic final states means properly reconstructing the energy and multiplicity of the jets in the event. In order to identify the decay products of boosted objects, i.e. W bosons, $t\bar{t}$ pairs or Higgs bosons produced in association with $t\bar{t}$ pairs, the ATLAS experiment currently exploits several algorithms using jet grooming and jet substructure. This contribution mainly covers the following ATLAS measurements: differential $t\bar{t}$ production cross-section and jet mass using the soft-drop procedure. Standard Model measurements offer the perfect field for testing the performance of new jet tagging techniques, which will become even more important in the search for new physics in highly boosted topologies.

  18. DEVELOPING STANDARDS FOR ASSESSING ENVIRONMENTAL CHEMICAL, PHYSICAL, AND BIOLOGICAL STRESSORS THROUGH ASTM COMMITTEE E47: A PAST FOUNDATION OF PROVEN STANDARDS, A FUTURE OF GREAT POTENTIAL AND OPPORTUNITY

    Science.gov (United States)

    Development of standards associated with assessing the bioavailability of contaminants in sediment will be used as a case study for how standards have been developed through Committee E47. In 1987, Committee E47 established Subcommittee E47.03 on Sediment Assessment and Toxicity....

  19. Establishment of quality assessment standard for mammographic equipment: evaluation of phantom and clinical images

    International Nuclear Information System (INIS)

    Lee, Sung Hoon; Choe, Yeon Hyeon; Chung, Soo Young

    2005-01-01

    The purpose of this study was to establish a quality standard for mammographic equipment in Korea and to eventually improve mammographic quality in clinics and hospitals throughout Korea by educating technicians and clinic personnel. For the phantom test and on-site assessment, we visited 37 sites and examined 43 sets of mammographic equipment. Items examined included the phantom test, radiation dose measurement, and developer assessment. The phantom images were assessed visually and by optical density measurements. For the clinical image assessment, clinical images from 371 sites were examined following the new Korean standard for clinical image evaluation. The items examined included labeling, positioning, contrast, exposure, artifacts, and collimation, among others. The quality standard for mammographic equipment was satisfied by all equipment on site visits. The average mean glandular dose was 114.9 mRad. All phantom image test scores were over 10 points (average, 10.8 points). However, optical density measurements were below 1.2 in 9 sets of equipment (20.9%). Clinical image evaluation revealed appropriate image quality in 83.5%, while images from non-radiologist clinics were adequate in 74.6% (91/122), the lowest score of any group. Images were satisfactory in 59.0% (219/371) based on evaluation by specialists following the new Korean standard for clinical image evaluation. Satisfactory images had a mean score of 81.7 (1 S.D. = 8.9) and unsatisfactory images had a mean score of 61.9 (1 S.D. = 11). The correlation coefficient between the two observers was 0.93 (p < 0.01) in 49 consecutive cases. The results of the phantom tests suggest that optical density measurements should be performed as part of a new quality standard for mammographic equipment. The new clinical evaluation criteria used in this study can be implemented, with some modifications, for future mammography quality control by the Korean government.

  20. Public School Finance Assessment Project Aligned with ELCC Standards

    Science.gov (United States)

    Risen, D. Michael

    2008-01-01

    This is a detailed description of an assessment that can be used in a graduate level of study in the area of public school finance. This has been approved by NCATE as meeting all of the stipulated ELCC standards for which it is designed (1.1, 1.2, 1.3, 1.4, 1.5, 2.1, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 4.1, 4.2, 4.3, 5.1, 5.2, 5.3.). This course of…

  1. Business performance assessment and the EFQM Excellence Model 2010 (case study)

    Directory of Open Access Journals (Sweden)

    Mária Antošová

    2015-01-01

    During the ongoing European integration processes, the application of the latest management methods has become a requisite for business success in international markets. In this paper, the authors expand the organizational knowledge base through the application of a new management method. Using a selected mining company as a case study, they assess the organization's performance, its effectiveness, and ways of increasing its economic efficiency. In order to detect weaknesses in the company reviewed and to offer suggestions for improvement that may move the company towards prosperity, the authors carried out the performance assessment using the EFQM Excellence Model 2010. The aim of this paper is to introduce a business performance assessment model, in terms of European standards, that will allow the latest management methods to be used in both Slovak and other European companies.

  2. Integration of nursing assessment concepts into the medical entities dictionary using the LOINC semantic structure as a terminology model.

    OpenAIRE

    Cieslowski, B. J.; Wajngurt, D.; Cimino, J. J.; Bakken, S.

    2001-01-01

    Recent investigations have tested the applicability of various terminology models for representing nursing concepts, including those related to nursing diagnoses, nursing interventions, and standardized nursing assessments, as a prerequisite for building a reference terminology that supports the nursing domain. We used the semantic structure of Clinical LOINC (Logical Observations, Identifiers, Names, and Codes) as a reference terminology model to support the integration of standardized ass...

  3. B physics beyond the Standard Model

    International Nuclear Information System (INIS)

    Hewett, J.A.L.

    1997-12-01

    The ability of present and future experiments to test the Standard Model in the B meson sector is described. The authors examine the loop effects of new interactions in flavor-changing neutral current B decays and in Z → b anti-b, concentrating on supersymmetry and the left-right symmetric model as specific examples of new physics scenarios. The procedure for performing a global fit to the Wilson coefficients which describe b → s transitions is outlined, and the results of such a fit from Monte Carlo generated data are compared to the predictions of the two sample new physics scenarios. A fit to the Z b anti-b couplings from present data is also given.

  4. Does a massive neutrino imply to go beyond the standard model?

    International Nuclear Information System (INIS)

    Le Diberder, F.; Cohen-Tannoudji, G.; Davier, M.

    2002-01-01

    This article gathers the 15 contributions to this seminar. The purpose of the seminar was to define to what extent the standard model is challenged by massive neutrinos. A non-zero mass for neutrinos, even a few eV, would solve the problem of the missing mass of the universe and would remove the need for supersymmetry and its neutralinos. A massless neutrino theoretically implies a symmetry and an interaction that are not described by the standard model. In some respects, a non-zero mass appears natural within the framework of the standard model, and for some scientists the smallness of this value could be a hint of the need for new physics.

  5. Development of an analytical model to assess fuel property effects on combustor performance

    Science.gov (United States)

    Sutton, R. D.; Troth, D. L.; Miles, G. A.; Riddlebaugh, S. M.

    1987-01-01

    A generalized first-order computer model has been developed to analytically evaluate the potential effects of alternative fuels on gas turbine combustors. The model assesses the size, configuration, combustion reliability, and durability of the combustors required to meet performance and emission standards while operating on a broad range of fuels. Predictions predicated on combustor flow-field determinations by the model indicate that fuel chemistry, as defined by hydrogen content, exerts a significant influence on flame radiation, liner wall temperature, and smoke emission.

  6. From the Margins to the Spotlight: Diverse Deaf and Hard of Hearing Student Populations and Standardized Assessment Accessibility.

    Science.gov (United States)

    Cawthon, Stephanie

    2015-01-01

    Designing assessments and tests is one of the more challenging aspects of creating an accessible learning environment for students who are deaf or hard of hearing (DHH), particularly for deaf students with a disability (DWD). Standardized assessments are a key mechanism by which the educational system in the United States measures student progress, teacher effectiveness, and the impact of school reform. The diversity of student characteristics within DHH and DWD populations is only now becoming visible in the research literature relating to standardized assessments and their use in large-scale accountability reforms. The purpose of this article is to explore the theoretical frameworks surrounding assessment policy and practice, current research related to standardized assessment and students who are DHH and DWD, and potential implications for practice within both the assessment and instruction contexts.

  7. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making under uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of the potential future risk to human receptors from the disposal of low-level radioactive waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for the disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating these through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model that supports their decision making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is then known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but it is entirely fictitious; it does not represent any particular site and is meant to be a generic example.
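    The Monte Carlo workflow described above, in which input distributions are propagated through the model to output distributions and then compared against a performance objective, can be illustrated with a toy example. The sketch below is not the GoldSim model from the abstract: the parameter names, distributions, the simplistic release/dose relation, and the numerical values are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # number of Monte Carlo realizations

# Hypothetical input distributions (illustrative values only):
# sorption coefficient Kd (L/kg), infiltration rate (m/yr), inventory (Bq)
kd = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)
infiltration = rng.triangular(0.05, 0.1, 0.2, size=n)
inventory = rng.normal(1e9, 1e8, size=n)

# Toy release/dose model: each realization is one possible parameter combination
release_rate = inventory * infiltration / (1.0 + kd)  # Bq/yr (illustrative)
dose = release_rate * 1e-9                            # mSv/yr via a fixed dose factor

# Statistical summaries of the output distribution, compared to an objective
objective = 0.25  # mSv/yr, a stand-in regulatory performance objective
print(f"mean dose: {dose.mean():.3f} mSv/yr")
print(f"95th percentile: {np.percentile(dose, 95):.3f} mSv/yr")
print(f"P(dose > objective): {(dose > objective).mean():.3f}")
```

A deterministic analysis would report a single dose number; the probabilistic run above instead yields a distribution, so the probability of exceeding the objective is quantified rather than hidden.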

  8. Comparison of Standard Wind Turbine Models with Vendor Models for Power System Stability Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Honrubia-Escribano, A.; Gomez Lazaro, E.; Jimenez-Buendia, F.; Muljadi, Eduard

    2016-11-01

    The International Electrotechnical Commission Standard 61400-27-1 was published in February 2015. This standard deals with the development of generic terms and parameters to specify the electrical characteristics of wind turbines. Generic models of very complex technological systems, such as wind turbines, are thus defined based on the four common configurations available in the market. Due to its recent publication, the comparison of the response of generic models with specific vendor models plays a key role in ensuring the widespread use of this standard. This paper compares the response of a specific Gamesa dynamic wind turbine model to the corresponding generic IEC Type III wind turbine model response when the wind turbine is subjected to a three-phase voltage dip. This Type III model represents the doubly-fed induction generator wind turbine, which is not only one of the most commonly sold and installed technologies in the current market but also a complex variable-speed operation implementation. In fact, active and reactive power transients are observed due to the voltage reduction. Special attention is given to the reactive power injection provided by the wind turbine models because it is a requirement of current grid codes. Further, the boundaries of the generic models associated with transient events that cannot be represented exactly are included in the paper.

  9. Solar Luminosity on the Main Sequence, Standard Model and Variations

    Science.gov (United States)

    Ayukov, S. V.; Baturin, V. A.; Gorshkov, A. B.; Oreshina, A. V.

    2017-05-01

    Our Sun became a Main Sequence star 4.6 Gyr ago according to the Standard Solar Model. At that time, solar luminosity was 30% lower than its current value. This conclusion is based on the assumption that the Sun is fueled by thermonuclear reactions. If the Earth's albedo and infrared emissivity have been unchanged throughout Earth's history, the oceans would have had to be frozen 2.3 Gyr ago. This contradicts geological data: there was liquid water on Earth 3.6-3.8 Gyr ago. This problem is known as the Faint Young Sun Paradox. We analyze the luminosity change in standard solar evolution theory. The increase of mean molecular weight in the central part of the Sun, due to the conversion of hydrogen to helium, leads to a gradual increase of luminosity with time on the Main Sequence. We also consider several exotic models: a fully mixed Sun; a drastic change of the pp reaction rate; and a Sun consisting of hydrogen and helium only. Solar neutrino observations, however, exclude most non-standard solar models.

  10. Towards identifying dyslexia in Standard Indonesian: the development of a reading assessment battery.

    Science.gov (United States)

    Jap, Bernard A J; Borleffs, Elisabeth; Maassen, Ben A M

    2017-01-01

    Standard Indonesian, with its transparent orthography, is spoken by over 160 million inhabitants and is the primary language of instruction in education and government in Indonesia. An assessment battery of reading and reading-related skills was developed as a starting point for the diagnosis of dyslexia in beginner learners. Founded on the International Dyslexia Association's definition of dyslexia, the test battery comprises nine empirically motivated reading and reading-related tasks assessing word reading, pseudoword reading, arithmetic, rapid automatized naming, phoneme deletion, forward and backward digit span, verbal fluency, orthographic choice (spelling), and writing. The test was validated by computing the relationships between the outcomes on the reading-skills and reading-related measures by means of correlation and factor analyses. External variables, i.e., school grades and teacher ratings of the reading and learning abilities of individual students, were also utilized to provide evidence of its construct validity. Four variables were found to be significantly related to reading-skill measures: phonological awareness, rapid naming, spelling, and digit span. The current study on reading development in Standard Indonesian confirms findings from other languages with transparent orthographies and suggests a test battery, including preliminary norm scores, for screening and assessment of elementary school children learning to read Standard Indonesian.

  11. Identifying best existing practice for characterization modeling in life cycle impact assessment

    DEFF Research Database (Denmark)

    Hauschild, Michael Zwicky; Goedkoop, Mark; Guinée, Jeroen

    2013-01-01

    Purpose: Life cycle impact assessment (LCIA) is a field of active development. The last decade has seen prolific publication of new impact assessment methods covering many different impact categories and providing characterization factors that often deviate from each other for the same substance… and impact. The LCA standard ISO 14044 is rather general and unspecific in its requirements and offers little help to the LCA practitioner who needs to make a choice. With the aim of identifying the best among existing characterization models and providing recommendations to the LCA practitioner, a study… was performed for the Joint Research Centre of the European Commission (JRC). Methods: Existing LCIA methods were collected and their individual characterization models identified at both midpoint and endpoint levels and supplemented with other environmental models of potential use for LCIA. No new developments…

  12. Supersymmetry and String Theory: Beyond the Standard Model

    International Nuclear Information System (INIS)

    Rocek, Martin

    2007-01-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically. (book review)

  13. A meta model-based methodology for an energy savings uncertainty assessment of building retrofitting

    Directory of Open Access Journals (Sweden)

    Caucheteux Antoine

    2016-01-01

    To reduce greenhouse gas emissions, energy retrofitting of the building stock presents significant potential for energy savings. In the design stage, energy savings are usually assessed through Building Energy Simulation (BES). The main difficulty is to first assess the energy efficiency of the existing building, in other words, to calibrate the model. As calibration is an underdetermined problem, there are many solutions for representing the building in simulation tools. In this paper, a method is proposed to assess not only energy savings but also their uncertainty. Meta models, built using experimental designs, are used to identify many acceptable calibrations: the sets of parameters that provide the most accurate representation of the building are retained to calculate energy savings. The method was applied to an existing office building modeled with the TRNSYS BES. The meta model, using 13 parameters, is built with no more than 105 simulations. The evaluation of the meta model on thousands of new simulations gives a normalized mean bias error between the meta model and BES of <4%. Energy savings are assessed based on six energy savings concepts, which indicate savings of 2-45% with a standard deviation ranging between 1.3% and 2.5%.
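    The workflow in this abstract — run a designed set of simulations, fit a meta model, then check it against new simulations via a normalized mean bias error (NMBE) — can be sketched in miniature. The "simulator" below is a cheap stand-in analytic function, not TRNSYS or the paper's 13-parameter model; all parameter names and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "building energy simulator": an analytic function of two hypothetical
# calibration parameters (e.g. an infiltration rate and a wall U-value factor).
def simulator(x1, x2):
    return 120.0 + 35.0 * x1 + 20.0 * x2 + 8.0 * x1 * x2  # annual kWh/m2, made up

# Experimental design: a small grid of "simulation runs"
g1, g2 = np.meshgrid(np.linspace(0, 1, 6), np.linspace(0, 1, 6))
X = np.column_stack([np.ones(g1.size), g1.ravel(), g2.ravel(), (g1 * g2).ravel()])
y = simulator(g1.ravel(), g2.ravel())

# Fit a polynomial meta model to the design points by least squares
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Evaluate the meta model on fresh random parameter sets and compute the NMBE
# against the "simulator" (the paper's criterion, applied to this toy case)
x_new = rng.uniform(0, 1, size=(1000, 2))
X_new = np.column_stack([np.ones(1000), x_new[:, 0], x_new[:, 1], x_new[:, 0] * x_new[:, 1]])
y_meta = X_new @ coef
y_true = simulator(x_new[:, 0], x_new[:, 1])
nmbe = (y_meta - y_true).mean() / y_true.mean()
print(f"NMBE: {100 * nmbe:.4f}%")
```

Here the toy simulator lies exactly in the polynomial basis, so the NMBE is essentially zero; with a real BES the meta model is only approximate, which is why a threshold such as the paper's <4% is reported.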

  14. An ICF-Based Model for Implementing and Standardizing Multidisciplinary Obesity Rehabilitation Programs within the Healthcare System

    Directory of Open Access Journals (Sweden)

    Amelia Brunani

    2015-05-01

    Introduction/Objective: In this study, we aimed to design an ICF-based individual rehabilitation project for obese patients with comorbidities (IRPOb), integrated into the Rehab-CYCLE, to standardize rehabilitative programs. This might help the different health professionals involved in the continuum of care of obese patients to standardize rehabilitation interventions. Methods: After training on the ICF and based on the relevant studies, ICF categories were identified in a formal consensus process by our multidisciplinary team. Thereafter, we defined an individual rehabilitation project based on a structured multidisciplinary approach to obesity. Results: The proposed IRPOb model identified the specific intervention areas (nutritional, physiotherapy, psychology, nursing), the short-term goals, the intervention modalities, the professionals involved, and the assessment of the outcomes. Information was shared with the patient, who signed informed consent. Conclusions: The model proposed provides the following advantages: (1) it standardizes rehabilitative procedures; (2) it facilitates the flow of congruent and updated information from the hospital to outpatient facilities, relatives, and caregivers; (3) it addresses organizational issues; (4) it might serve as a benchmark for professionals who have limited specific expertise in the rehabilitation of comorbid obese patients.

  15. R parity in standard-like superstring models

    International Nuclear Information System (INIS)

    Halyo, Edi.

    1994-01-01

    We investigate the R symmetries of standard-like superstring models. At the level of the cubic superpotential there are three global U(1) R symmetries. These are broken explicitly by N > 3 terms in the superpotential and spontaneously by the scalar vacuum expectation values necessary to preserve supersymmetry at M_P. A Z_2 discrete symmetry remains, but it is equivalent to fermion number modulo 2. These models possess an effective R parity which arises from the interplay between the gauged U(1)_B-L and U(1)_r_j+3. (author). 14 refs

  16. Astrophysical neutrinos flavored with beyond the Standard Model physics

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Rasmus W.; Ackermann, Markus; Winter, Walter [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Lechner, Lukas [Vienna Univ. of Technology (Austria). Dept. of Physics; Kowalski, Marek [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik

    2017-07-15

    We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond the Standard Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation, such as significant sterile neutrino production at the source, effective operators modifying neutrino propagation at high energies, dark matter interactions in neutrino propagation, or non-standard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will make it possible to efficiently test and discriminate between models. More detailed information can be obtained from additional observables such as the energy dependence of the effect, the fraction of electron antineutrinos at the Glashow resonance, or the number of tau neutrino events.

  17. Astrophysical neutrinos flavored with beyond the Standard Model physics

    International Nuclear Information System (INIS)

    Rasmussen, Rasmus W.; Ackermann, Markus; Winter, Walter; Lechner, Lukas; Kowalski, Marek; Humboldt-Universitaet, Berlin

    2017-07-01

    We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond the Standard Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation, such as significant sterile neutrino production at the source, effective operators modifying neutrino propagation at high energies, dark matter interactions in neutrino propagation, or non-standard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will make it possible to efficiently test and discriminate between models. More detailed information can be obtained from additional observables such as the energy dependence of the effect, the fraction of electron antineutrinos at the Glashow resonance, or the number of tau neutrino events.

  18. Flavour physics beyond the standard model in top and bottom quarks

    International Nuclear Information System (INIS)

    Stamou, Emmanuel

    2013-01-01

    The Large Hadron Collider is currently exploring dynamics at high energies where we expect physics beyond the standard model to emerge as an answer to at least some of the questions the standard model cannot address. We consider the low-energy flavour signatures of a model with a dynamical explanation of quark masses and mixings, construct a model with new strong interactions that accounts for the anomalously large measurement of an asymmetry in top-antitop production at the Tevatron, and compute next-to-leading-order electroweak corrections to the recently observed rare decay B_s → μ+μ-.

  19. Self-assessment: Strategy for higher standards, consistency, and performance

    International Nuclear Information System (INIS)

    Ide, W.E.

    1996-01-01

    In late 1994, Palo Verde operations underwent a transformation from a unitized structure to a single functional unit. It was necessary to build consistency in watchstanding practices and create a shared mission. Because there was a lack of focus on actual plant operations and because personnel were deeply involved with administrative tasks, command and control of evolutions were weak. Improvement was needed. Consistent performance standards have been set for all three operating units. These expectations focus on nuclear, radiological, and industrial safety. Straightforward descriptions of watchstanding and monitoring practices have been provided to all department personnel. The desired professional and leadership qualities for employee conduct have been defined and communicated thoroughly. A healthy and competitive atmosphere developed with the successful implementation of these standards. Overall performance improved. The auxiliary operators demonstrated increased pride and ownership in the performance of their work activities. In addition, their morale improved. Crew teamwork improved, as did the quality of shift briefs. There was a decrease in the noise level and in the administrative functions in the control room. The use of self-assessment helped to anchor and define higher and more consistent standards. Proof of Palo Verde's success came when an Institute of Nuclear Power Operations finding was turned into a strength within one year.

  20. Incorporating Standardized Colleague Simulations in a Clinical Assessment Course and Evaluating the Impact on Interprofessional Communication.

    Science.gov (United States)

    Shrader, Sarah; Dunn, Brianne; Blake, Elizabeth; Phillips, Cynthia

    2015-05-25

    To determine the impact of incorporating standardized colleague simulations on pharmacy students' confidence and interprofessional communication skills. Four simulations using standardized colleagues portraying attending physicians in inpatient and outpatient settings were integrated into a required course. Pharmacy students interacted with the standardized colleagues using the Situation, Background, Assessment, Request/Recommendation (SBAR) communication technique and were evaluated on providing recommendations while on simulated inpatient rounds and in an outpatient clinic. Additionally, changes in student attitudes and confidence toward interprofessional communication were assessed with a survey before and after the standardized colleague simulations. One hundred seventy-one pharmacy students participated in the simulations. Student interprofessional communication skills improved after each simulation. Student confidence with interprofessional communication in both inpatient and outpatient settings significantly improved. Incorporation of simulations using standardized colleagues improves interprofessional communication skills and self-confidence of pharmacy students.

  1. Study on Design and Implementation of JAVA Programming Procedural Assessment Standard

    Science.gov (United States)

    Tingting, Xu; Hua, Ma; Xiujuan, Wang; Jing, Wang

    2015-01-01

    The traditional JAVA course examination is just a list of questions, from which we cannot gauge students' programming skills. Based on the eight abilities in the curriculum objectives, we designed an assessment standard for a JAVA programming course that is based on employment orientation, and applied it in practical teaching to check the teaching…

  2. Neutrons and the new Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Ramsey-Musolf, M.J., E-mail: mjrm@physics.wisc.ed [Department of Physics, University of Wisconsin-Madison, Madison, WI 53706 (United States); Kellogg Radiation Laboratory, California Institute of Technology, Pasadena, CA 91125 (United States)

    2009-12-11

    Fundamental symmetry tests with neutrons can provide unique information about whatever will be the new Standard Model of fundamental interactions. I review two aspects of this possibility: searches for the permanent electric dipole moment of the neutron and its relation to the origin of baryonic matter, and precision studies of neutron decay that can probe new symmetries. I discuss the complementarity of these experiments with other low-energy precision tests and high energy collider searches for new physics.

  3. Standardized binomial models for risk or prevalence ratios and differences.

    Science.gov (United States)

    Richardson, David B; Kinlaw, Alan C; MacLehose, Richard F; Cole, Stephen R

    2015-10-01

    Epidemiologists often analyse binary outcomes in cohort and cross-sectional studies using multivariable logistic regression models, yielding estimates of adjusted odds ratios. It is widely known that the odds ratio closely approximates the risk or prevalence ratio when the outcome is rare, but not when the outcome is common. Consequently, investigators may decide to directly estimate the risk or prevalence ratio using a log-binomial regression model. We describe the use of a marginal structural binomial regression model to estimate standardized risk or prevalence ratios and differences. We illustrate the proposed approach using data from a cohort study of coronary heart disease status in Evans County, Georgia, USA. The approach reduces the problems with model convergence typical of log-binomial regression by shifting all explanatory variables except the exposures of primary interest from the linear predictor of the outcome regression model to a model for the standardization weights. The approach also facilitates evaluation of departures from additivity in the joint effects of two exposures. Epidemiologists should consider reporting standardized risk or prevalence ratios and differences in cohort and cross-sectional studies. These are readily obtained using the SAS, Stata, and R statistical software packages. The proposed approach estimates the exposure effect in the total population.
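    The standardization idea behind such models can be illustrated with a minimal direct-standardization example: stratum-specific risks are averaged over the confounder distribution of the total population before taking ratios or differences. The numbers below are made up, and this sketch shows only the underlying standardization arithmetic, not the paper's marginal structural weighting model itself.

```python
import numpy as np

# Illustrative stratified data (invented): outcome risks by exposure within two
# confounder strata, plus each stratum's share of the total (standard) population.
risk_exposed = np.array([0.30, 0.10])
risk_unexposed = np.array([0.15, 0.05])
stratum_weights = np.array([0.4, 0.6])  # confounder distribution in the total population

# Standardized risks: average the stratum-specific risks over the total population
std_risk_exposed = np.sum(risk_exposed * stratum_weights)      # 0.18
std_risk_unexposed = np.sum(risk_unexposed * stratum_weights)  # 0.09

risk_ratio = std_risk_exposed / std_risk_unexposed
risk_difference = std_risk_exposed - std_risk_unexposed
print(f"standardized risk ratio: {risk_ratio:.2f}")        # → 2.00
print(f"standardized risk difference: {risk_difference:.3f}")  # → 0.090
```

Because both exposure groups are standardized to the same population, the resulting ratio and difference describe the exposure effect in the total population, which is the estimand the abstract's weighting approach targets.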

  4. Standard model status (in search of ''new physics'')

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1993-03-01

    A perspective on successes and shortcomings of the standard model is given. The complementarity between direct high energy probes of new physics and lower energy searches via precision measurements and rare reactions is described. Several illustrative examples are discussed

  5. Introduction to physics beyond the Standard Model

    CERN Document Server

    Giudice, Gian Francesco

    1998-01-01

    These lectures will give an introductory review of the main ideas behind the attempts to extend the standard-model description of elementary particle interactions. After analysing the conceptual motivations that lead us to believe in the existence of an underlying fundamental theory, we will discuss the present status of various theoretical constructs: grand unification, supersymmetry and technicolour.

  6. Results on Standard Model Higgs Boson searches at high mass at the LHC

    International Nuclear Information System (INIS)

    Gao, Yanyan

    2014-01-01

    We present results from searches for the standard model Higgs boson with a mass greater than 200 GeV in pp collisions at √(s)=7 TeV. The data were collected at the LHC with both the ATLAS and CMS detectors, and correspond to an integrated luminosity of 5 fb⁻¹ each. Searches are performed in the two main decay modes, WW and ZZ. No significant excess of events above the standard model background expectations is observed, and upper limits on the Higgs boson production relative to the standard model expectation are derived. A standard model Higgs boson is excluded in the mass range up to 539 GeV or 600 GeV at 95% confidence level by the ATLAS and CMS experiments respectively. (author)

  7. Impact of model uncertainty on soil quality standards for cadmium in rice paddy fields

    International Nuclear Information System (INIS)

    Roemkens, P.F.A.M.; Brus, D.J.; Guo, H.Y.; Chu, C.L.; Chiang, C.M.; Koopmans, G.F.

    2011-01-01

    At present, soil quality standards used for agriculture do not consider the influence of pH and CEC on the uptake of pollutants by crops. A database with 750 selected paired samples of cadmium (Cd) in soil and paddy rice was used to calibrate soil-to-plant transfer models using the soil metal content, pH, and CEC or soil Cd and Zn extracted by 0.01 M CaCl₂ as explanatory variables. The models were validated against a set of 2300 data points not used in the calibration. These models were then used inversely to derive soil quality standards for Japonica and Indica rice cultivars based on the food quality standards for rice. To account for model uncertainty, strict soil quality standards were derived considering a maximum probability that rice exceeds the food quality standard equal to 10 or 5%. Model-derived soil standards based on Aqua Regia ranged from less than 0.3 mg kg⁻¹ for Indica at pH 4.5 to more than 6 mg kg⁻¹ for Japonica-type cultivars in clay soils at pH 7. Based on the CaCl₂ extract, standards ranged from 0.03 mg kg⁻¹ Cd for Indica cultivars to 0.1 mg kg⁻¹ Cd for Japonica cultivars. For both Japonica and Indica-type cultivars, the soil quality standards must be reduced by a factor of 2 to 3 to obtain the strict standards. The strong impact of pH and CEC on soil quality standards implies that it is essential to correct for soil type when deriving national or local standards. Validation on the remaining 2300 samples indicated that both types of models were able to accurately predict (> 92%) whether rice grown on a specific soil will meet the food quality standard used in Taiwan. - Research highlights: → Cadmium uptake by Japonica and Indica rice varieties depends on soil pH and CEC. → Food safety based soil standards range from 0.3 (Indica) to 6 mg kg⁻¹ (Japonica). → Model uncertainty leads to strict soil standards of less than 0.1 mg kg⁻¹ for Indica. → Soil pH and CEC should be considered to obtain meaningful standards for agriculture.
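
    The inverse use of a soil-to-plant transfer model can be sketched with a hypothetical log-linear form; the coefficients below are invented for illustration and are not the paper's calibrated values:

```python
import math

# Hypothetical log-linear soil-to-plant transfer model:
#   log10(Cd_rice) = a + b*log10(Cd_soil) + c*pH + d*log10(CEC)
# Coefficients are illustrative only, not calibrated values.
a, b, c, d = 1.35, 0.8, -0.25, -0.3

def soil_standard(food_limit, pH, cec):
    """Invert the transfer model: the soil Cd content (mg/kg) at
    which predicted rice Cd equals the food quality standard."""
    log_soil = (math.log10(food_limit) - a - c * pH - d * math.log10(cec)) / b
    return 10 ** log_soil

# Higher pH and CEC reduce plant availability of Cd, so the same
# food limit translates into a more permissive soil standard.
low = soil_standard(0.4, pH=4.5, cec=5.0)    # acid, sandy soil
high = soil_standard(0.4, pH=7.0, cec=25.0)  # neutral clay soil
print(f"soil standard at pH 4.5, CEC 5:  {low:.2f} mg/kg")
print(f"soil standard at pH 7.0, CEC 25: {high:.2f} mg/kg")
```

    The inversion reproduces the qualitative pattern in the abstract: the derived soil standard at high pH and CEC is roughly an order of magnitude above the one for acid, sandy soils.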

  8. Standard model parameters and the search for new physics

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1988-04-01

    In these lectures, my aim is to present an up-to-date status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows: I discuss the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also briefly commented on. In addition, because these lectures are intended for students and thus somewhat pedagogical, I have included an appendix on dimensional regularization and a simple computational example that employs that technique. Next, I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, supersymmetry, extra Z′ bosons, and compositeness are also discussed. I discuss weak neutral current phenomenology and the extraction of sin²θ_W from experiment. The results presented there are based on a recently completed global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, and implications for grand unified theories (GUTS). The potential for further experimental progress is also commented on. I depart from the narrowest version of the standard model and discuss effects of neutrino masses and mixings. I have chosen to concentrate on oscillations, the Mikheyev-Smirnov-Wolfenstein (MSW) effect, and electromagnetic properties of neutrinos. On the latter topic, I will describe some recent work on resonant spin-flavor precession. Finally, I conclude with a prospectus on hopes for the future. 76 refs

  9. Accounting for the NCEA: Has the Transition to Standards-based Assessment Achieved its Objectives?

    Directory of Open Access Journals (Sweden)

    Stephen Agnew

    2010-12-01

    Full Text Available This paper identifies trends in secondary school accounting participation and achievement during the first five years of the full implementation of the National Certificate of Educational Achievement (NCEA) in New Zealand schools. NCEA marks a shift from a norm-referenced assessment regime to standards-based assessment. Literature suggests that standards-based assessment increases the academic performance of minority ethnic groups (such as Maori and Pacific Island students) and low socio-economic status (SES) students. The author pays particular attention to these groups and his analysis reveals some interesting results: in accounting, the NCEA has not met expectations for these students. From 2004 to 2008, the number of low SES accounting students has dropped, as has the number of accounting standards entered and the rates of achievement. Likewise, there has been no significant improvement in the academic performance of Maori students taking accounting standards, while Pacific Island students have experienced a significant decrease in achievement. The author also discusses how studying high school accounting impacts on tertiary-level study and offers some future implications of this research.

  10. Radiation protection standards: A practical exercise in risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Clarke, Roger H [National Radiological Protection Board (United Kingdom)

    1992-07-01

    Within 12 months of the discovery of x-rays in 1895, it was reported that large doses of radiation were harmful to living human tissues. The first radiation protection standards were set to avoid the early effects of acute irradiation. By the 1950s, evidence was mounting for late somatic effects - mainly a small excess of cancers - in irradiated populations. In the late 1980s, sufficient human epidemiological data had been accumulated to allow a comprehensive assessment of carcinogenic radiation risks following the delivery of moderately high doses. Workers and the public are exposed to lower doses and dose-rates than the groups for whom good data are available, so risks have had to be estimated for protection purposes. However, in the 1990s, some confirmation of these risk factors has been derived from occupationally exposed populations. If an estimate is made of the risk per unit dose, then in order to set dose limits, an unacceptable level of risk must be established for both workers and the public. There has been and continues to be a debate about the definitions of 'acceptable' and 'tolerable' and the attributing of numerical values to these definitions. This paper discusses the issues involved in the quantification of these terms and their application to setting dose limits on risk grounds. Conclusions are drawn about the present protection standards and the application of the methods to other fields of risk assessment. (author)
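
    The logic of turning a risk estimate into a dose limit reduces to simple arithmetic, sketched below with round illustrative numbers (not the values adopted by any protection body):

```python
# Illustrative arithmetic linking a risk factor to a dose limit.
# Both inputs are assumptions chosen for the sketch, not
# recommended protection values.
risk_per_sievert = 5e-2       # assumed fatal-cancer risk per Sv of dose
tolerable_annual_risk = 1e-3  # assumed upper bound judged tolerable for workers

# Annual dose at which the estimated risk reaches the tolerable level.
dose_limit_sv = tolerable_annual_risk / risk_per_sievert
print(f"implied annual dose limit: {dose_limit_sv * 1000:.0f} mSv")
```

    With these assumptions the implied limit is 20 mSv per year; the debate the abstract describes is precisely over which numerical values of 'tolerable' risk should enter such a calculation.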

  11. Radiation protection standards: A practical exercise in risk assessment

    International Nuclear Information System (INIS)

    Clarke, Roger H.

    1992-01-01

    Within 12 months of the discovery of x-rays in 1895, it was reported that large doses of radiation were harmful to living human tissues. The first radiation protection standards were set to avoid the early effects of acute irradiation. By the 1950s, evidence was mounting for late somatic effects - mainly a small excess of cancers - in irradiated populations. In the late 1980s, sufficient human epidemiological data had been accumulated to allow a comprehensive assessment of carcinogenic radiation risks following the delivery of moderately high doses. Workers and the public are exposed to lower doses and dose-rates than the groups for whom good data are available, so risks have had to be estimated for protection purposes. However, in the 1990s, some confirmation of these risk factors has been derived from occupationally exposed populations. If an estimate is made of the risk per unit dose, then in order to set dose limits, an unacceptable level of risk must be established for both workers and the public. There has been and continues to be a debate about the definitions of 'acceptable' and 'tolerable' and the attributing of numerical values to these definitions. This paper discusses the issues involved in the quantification of these terms and their application to setting dose limits on risk grounds. Conclusions are drawn about the present protection standards and the application of the methods to other fields of risk assessment. (author)

  12. Quantifying relative importance: Computing standardized effects in models with binary outcomes

    Science.gov (United States)

    Grace, James B.; Johnson, Darren; Lefcheck, Jonathan S.; Byrnes, Jarrett E.K.

    2018-01-01

    Scientists commonly ask questions about the relative importances of processes, and then turn to statistical models for answers. Standardized coefficients are typically used in such situations, with the goal being to compare effects on a common scale. Traditional approaches to obtaining standardized coefficients were developed with idealized Gaussian variables in mind. When responses are binary, complications arise that impact standardization methods. In this paper, we review, evaluate, and propose new methods for standardizing coefficients from models that contain binary outcomes. We first consider the interpretability of unstandardized coefficients and then examine two main approaches to standardization. One approach, which we refer to as the Latent-Theoretical or LT method, assumes that underlying binary observations there exists a latent, continuous propensity linearly related to the coefficients. A second approach, which we refer to as the Observed-Empirical or OE method, assumes responses are purely discrete and estimates error variance empirically via reference to a classical R2 estimator. We also evaluate the standard formula for calculating standardized coefficients based on standard deviations. Criticisms of this practice have been persistent, leading us to propose an alternative formula that is based on user-defined “relevant ranges”. Finally, we implement all of the above in an open-source package for the statistical software R.
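
    The Latent-Theoretical (LT) standardization described above can be sketched with a made-up fitted coefficient and toy data; the OE variant would replace the theoretical logistic error variance π²/3 with an empirically estimated one:

```python
import math

# Sketch of LT standardization for a logit model: assume a latent
# continuous propensity y* = Xb + e with logistic error variance
# pi^2/3. The coefficients and data below are invented.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]  # toy predictor values
beta0, beta1 = -2.1, 0.6            # hypothetical fitted logit coefficients

def sd(v):
    """Sample standard deviation (n - 1 denominator)."""
    m = sum(v) / len(v)
    return math.sqrt(sum((u - m) ** 2 for u in v) / (len(v) - 1))

linpred = [beta0 + beta1 * u for u in x]
# SD of the latent propensity: explained variance plus pi^2/3.
sd_latent = math.sqrt(sd(linpred) ** 2 + math.pi ** 2 / 3)

# Standardized coefficient: effect of one SD of x, in SDs of y*.
beta_std = beta1 * sd(x) / sd_latent
print(f"LT-standardized coefficient: {beta_std:.3f}")
```

    The same ratio could instead use a user-defined "relevant range" of x in place of sd(x), as the alternative formula proposed in the paper suggests.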

  13. Review of the standard model

    International Nuclear Information System (INIS)

    Treille, D.

    1992-01-01

    The goal of this review is not to make one more celebration of the accuracy of LEP results, but rather to put them in a broader perspective. This set of measurements is compared with what it could and should be in the future if the various options available at LEP are exploited properly, showing that much is left to be done. Then various classes of non-LEP results are discussed which are already remarkable and still prone to improvements, which bring complementary information on the Standard Model, by probing it in widely different domains of applicability. (author) 46 refs.; 29 figs.; 12 tabs

  14. Research and Demonstration of 'Double-chain' Eco-agricultural Model Standardization and Industrialization

    Directory of Open Access Journals (Sweden)

    ZHANG Jia-hong

    2015-04-01

    Full Text Available According to the agricultural resource endowment of Jiangsu Province, this paper created several 'double-chain' eco-agricultural models and an integrated supporting system based on 'waterfowl, marine lives, aquatic vegetable and paddy rice', 'special food and economic crops with livestock' and 'special food and economic crops with livestock and marine lives', which were suitable for extension and application in Jiangsu Province. Besides, it set 12 provincial standards and established a preliminary technical standard system for the 'double-chain' eco-agricultural model. In addition, it explored 'the leading agricultural enterprises (agricultural co-operatives or family farms) + demonstration zones + farmer households' as the operating mechanism for industrialization of the eco-agricultural model, which pushed forward rapid development of standardization and industrialization of the 'double-chain' eco-agricultural model.

  15. Precision Electroweak Measurements and Constraints on the Standard Model

    CERN Document Server

    ,

    2010-01-01

    This note presents constraints on Standard Model parameters using published and preliminary precision electroweak results measured at the electron-positron colliders LEP and SLC. The results are compared with precise electroweak measurements from other experiments, notably CDF and DØ at the Tevatron. Constraints on the input parameters of the Standard Model are derived from the combined set of results obtained in high-$Q^2$ interactions, and used to predict results in low-$Q^2$ experiments, such as atomic parity violation, Møller scattering, and neutrino-nucleon scattering. The main changes with respect to the experimental results presented in 2009 are new combinations of results on the width of the W boson and the mass of the top quark.

  16. On light dilaton extensions of the Standard Model

    International Nuclear Information System (INIS)

    Megías, Eugenio; Pujolàs, Oriol; Quirós, Mariano

    2016-01-01

    We discuss the presence of a light dilaton in Conformal Field Theories deformed by a single scalar operator, in the holographic realization consisting of confining Renormalization Group flows. Then, we apply this formalism to study the extension of the Standard Model with a light dilaton in a 5D warped model. We study the spectrum of scalar and vector perturbations, compare the model predictions with Electroweak Precision Tests and find the corresponding bounds for the lightest modes. Finally, we analyze the possibility that the Higgs resonance found at the LHC is a dilaton

  17. Proceedings of standard model at the energy of present and future accelerators

    International Nuclear Information System (INIS)

    Csikor, F.; Pocsik, G.; Toth, E.

    1992-01-01

    This book contains the proceedings of the Workshop on The Standard Model at the Energy of the Present and Future Accelerators, 27 June - 1 July 1989, Budapest. The Standard Model of strong and electro-weak interactions providing essential insights into the fundamental structure of matter and being the basic building block of further generalizations has a rich content. The Workshop was devoted to discussing topical problems of testing the Standard Model in high energy reactions such as jet physics and fragmentation, new applications and tests of perturbative QCD, CP-violation, B-meson physics and developments in weak decays, some of the future experimental plans and related topics

  18. Understanding Standards and Assessment Policy in Science Education: Relating and Exploring Variations in Policy Implementation by Districts and Teachers in Wisconsin

    Science.gov (United States)

    Anderson, Kevin John Boyett

    Current literature shows that many science teachers view policies of standards-based and test-based accountability as conflicting with research-based instruction in science education. With societal goals of improving scientific literacy and using science to spur economic growth, improving science education policy becomes especially important. To understand perceived influences of science education policy, this study looked at three questions: 1) How do teachers perceive state science standards and assessment and their influence on curriculum and instruction? 2) How do these policy perspectives vary by district and teacher level demographic and contextual differences? 3) How do district leaders' interpretations of and efforts within these policy realms relate to teachers' perceptions of the policies? To answer these questions, this study used a stratified sample of 53 districts across Wisconsin, with 343 middle school science teachers responding to an online survey; science instructional leaders from each district were also interviewed. Survey results were analyzed using multiple regression modeling, with models generally predicting 8-14% of variance in teacher perceptions. Open-ended survey and interview responses were analyzed using a constant comparative approach. Results suggested that many teachers saw state testing as limiting use of hands-on pedagogy, while standards were seen more positively. Teachers generally held similar views of the degree of influence of standards and testing regardless of their experience, background in science, credentials, or grade level taught. District SES, size and past WKCE scores had some limited correlations to teachers' views of policy, but teachers' perceptions of district policies and leadership consistently had the largest correlation to their views. District leadership views of these state policies correlated with teachers' views. Implications and future research directions are provided. 
Keywords: science education, policy

  19. Constraining new physics with collider measurements of Standard Model signatures

    Energy Technology Data Exchange (ETDEWEB)

    Butterworth, Jonathan M. [Department of Physics and Astronomy, University College London,Gower St., London, WC1E 6BT (United Kingdom); Grellscheid, David [IPPP, Department of Physics, Durham University,Durham, DH1 3LE (United Kingdom); Krämer, Michael; Sarrazin, Björn [Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University,Sommerfeldstr. 16, 52056 Aachen (Germany); Yallup, David [Department of Physics and Astronomy, University College London,Gower St., London, WC1E 6BT (United Kingdom)

    2017-03-14

    A new method providing general consistency constraints for Beyond-the-Standard-Model (BSM) theories, using measurements at particle colliders, is presented. The method, ‘Constraints On New Theories Using Rivet’, CONTUR, exploits the fact that particle-level differential measurements made in fiducial regions of phase-space have a high degree of model-independence. These measurements can therefore be compared to BSM physics implemented in Monte Carlo generators in a very generic way, allowing a wider array of final states to be considered than is typically the case. The CONTUR approach should be seen as complementary to the discovery potential of direct searches, being designed to eliminate inconsistent BSM proposals in a context where many (but perhaps not all) measurements are consistent with the Standard Model. We demonstrate, using a competitive simplified dark matter model, the power of this approach. The CONTUR method is highly scalable to other models and future measurements.
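
    The core consistency test can be caricatured in a few lines: since the measured fiducial bins already agree with the Standard Model, any BSM model whose additional signal would visibly distort them is disfavoured. All numbers below are invented, and the real method uses a more careful statistical treatment:

```python
# Toy sketch of a CONTUR-style consistency test. Per-bin measurement
# uncertainties and the extra events a BSM model would inject are
# made-up numbers for illustration.
measured_unc = [4.0, 2.5, 1.5, 0.8]  # per-bin uncertainty (events)
bsm_signal   = [1.0, 2.0, 3.0, 2.5]  # extra events the BSM model predicts

# Chi-square of (data + signal) against data: when the data already
# match the SM, each bin contributes (signal/uncertainty)^2.
chi2 = sum((s / u) ** 2 for s, u in zip(bsm_signal, measured_unc))
excluded = chi2 > 9.49  # 95% CL critical value for 4 degrees of freedom
print(f"chi2 = {chi2:.2f}, excluded at 95% CL: {excluded}")
```

    Here the fourth, precisely measured bin dominates the constraint, illustrating why precise fiducial measurements can exclude BSM parameter points without a dedicated search.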

  20. Draft CSA standard on environmental risk assessments at class I nuclear facilities and uranium mines and mills

    International Nuclear Information System (INIS)

    Hart, D.; Garisto, N.; Parker, R.; Kovacs, R.; Thompson, B.

    2012-01-01

    The Canadian Standards Association (CSA) is preparing a draft Standard on environmental risk assessments (ERAs) at Class I nuclear facilities and uranium mines and mills (CSA N288.6). It is being prepared by a technical subcommittee of the CSA N288 Technical Committee, including experts from across the nuclear industry, government and regulatory authorities, and environmental service providers, among others. It addresses the design, implementation, and management of environmental risk assessment programs, and is intended to standardize practice across the industry. This paper outlines the scope of the draft Standard and highlights key features. It is under development and subject to change. (author)

  1. An introduction to the standard model of particle physics for the non-specialist

    CERN Document Server

    Marsh, Gerald E

    2018-01-01

    This book takes the reader from some elementary ideas about groups to the essence of the Standard Model of particle physics along a relatively straight and intuitive path. Groups alone are first used to arrive at a classical analog of the Dirac equation. Using elementary quantum mechanics, this analog can be turned into the actual Dirac equation, which governs the motion of the quarks and leptons of the Standard Model. After an introduction to the gauge principle, the groups introduced in the beginning of the book are used to give an introduction to the Standard Model. The idea is to give an Olympian view of this evolution, one that is often missing when absorbing the detailed subject matter of the Standard Model as presented in an historical approach to the subject.

  2. Lorentz-violating theories in the standard model extension

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira Junior, Manoel Messias [Universidade Federal do Maranhao (UFMA), Sao Luis, MA (Brazil)

    2012-07-01

    Full text: Lorentz-violating theories have been an issue of permanent interest in the latest years. Many of these investigations are developed under the theoretical framework of the Standard Model Extension (SME), a broad extension of the minimal Standard Model embracing Lorentz-violating (LV) terms, generated as vacuum expectation values of tensor quantities, in all sectors of interaction. In this talk, we comment on some general properties of the SME, concerning mainly the gauge and fermion sectors, focusing in new phenomena induced by Lorentz violation. The LV terms are usually separated in accordance with the behavior under discrete symmetries, being classified as CPT-odd or CPT-even, parity-even or parity-odd. We follow this classification scheme discussing some features and new properties of the CPT-even and CPT-odd parts of the gauge and fermion sectors. We finalize presenting some upper bounds imposed on the corresponding LV coefficients. (author)

  3. Our sun. I. The standard model: Successes and failures

    International Nuclear Information System (INIS)

    Sackmann, I.J.; Boothroyd, A.I.; Fowler, W.A.

    1990-01-01

    The results of computing a number of standard solar models are reported. A presolar helium content of Y = 0.278 is obtained, and a Cl-37 capture rate of 7.7 SNUs, consistently several times the observed rate of 2.1 SNUs, is determined. Thus, the solar neutrino problem remains. The solar Z value is determined primarily by the observed Z/X ratio and is affected very little by differences in solar models. Even large changes in the low-temperature molecular opacities have no effect on Y, nor even on conditions at the base of the convective envelope. Large molecular opacities do cause a large increase in the mixing-length parameter alpha but do not cause the convective envelope to reach deeper. The temperature remains too low for lithium burning, and there is no surface lithium depletion; thus, the lithium problem of the standard solar model remains. 103 refs

  4. Model developments in TERRA_URB, the upcoming standard urban parametrization of the atmospheric numerical model COSMO(-CLM)

    Science.gov (United States)

    Wouters, Hendrik; Blahak, Ulrich; Helmert, Jürgen; Raschendorfer, Matthias; Demuzere, Matthias; Fay, Barbara; Trusilova, Kristina; Mironov, Dmitrii; Reinert, Daniel; Lüthi, Daniel; Machulskaya, Ekaterina

    2015-04-01

    In order to address urban climate at regional scales, a new efficient urban land-surface parametrization, TERRA_URB, has been developed and coupled to the atmospheric numerical model COSMO-CLM. Hereby, several new advancements for urban land-surface models are introduced which are crucial for capturing the urban surface-energy balance and its seasonal dependency in the mid-latitudes. This includes a new PDF-based water-storage parametrization for impervious land, the representation of radiative absorption and emission by greenhouse gases in the infra-red spectrum in the urban canopy layer, and the inclusion of heat emission from human activity. TERRA_URB has been applied in offline urban-climate studies during European observation campaigns at Basel (BUBBLE), Toulouse (CAPITOUL), and Singapore, and is currently applied in online studies for urban areas in Belgium, Germany, Switzerland, Helsinki, Singapore, and Melbourne. Because of its computational efficiency, high accuracy and conceptual simplicity, TERRA_URB has been selected to become the standard urban parametrization of the atmospheric numerical model COSMO(-CLM). This allows for better weather forecasts for temperature and precipitation in cities with COSMO, and an improved assessment of urban outdoor hazards in the context of global climate change and urban expansion with COSMO-CLM. We propose additional extensions to TERRA_URB towards a more robust representation of cities over the world, including their structural design. In a first step, COSMO's standard EXTernal PARameter (EXTPAR) tool is updated to represent cities in the land cover over the entire globe. Hereby, global datasets in the standard EXTPAR tool are used to retrieve the 'Paved' or 'sealed' surface Fraction (PF) referring to the presence of buildings and streets. Furthermore, new global data sets are incorporated in EXTPAR for describing the Anthropogenic Heat Flux (AHF) due to human activity, and optionally the

  5. Tracer methodology: an appropriate tool for assessing compliance with accreditation standards?

    Science.gov (United States)

    Bouchard, Chantal; Jean, Olivier

    2017-10-01

    Tracer methodology has been used by Accreditation Canada since 2008 to collect evidence on the quality and safety of care and services, and to assess compliance with accreditation standards. Given the importance of this methodology in the accreditation program, the objective of this study is to assess the quality of the methodology and identify its strengths and weaknesses. A mixed quantitative and qualitative approach was adopted to evaluate consistency, appropriateness, effectiveness and stakeholder synergy in applying the methodology. An online questionnaire was sent to 468 Accreditation Canada surveyors. According to surveyors' perceptions, tracer methodology is an effective tool for collecting useful, credible and reliable information to assess compliance with Qmentum program standards and priority processes. The results show good coherence between methodology components (appropriateness of the priority processes evaluated, activities to evaluate a tracer, etc.). The main weaknesses are the time constraints faced by surveyors and management's lack of cooperation during the evaluation of tracers. The inadequate amount of time allowed for the methodology to be applied properly raises questions about the quality of the information obtained. This study paves the way for a future, more in-depth exploration of the identified weaknesses to help the accreditation organization make more targeted improvements to the methodology. Copyright © 2016 John Wiley & Sons, Ltd.

  6. Assessing for Learning: Some Dimensions Underlying New Approaches to Educational Assessment.

    Science.gov (United States)

    Biggs, John

    1995-01-01

    Different models of performance assessment arise from interactions of three dimensions of assessment: the measurement versus the standards model of testing, quantitative and qualitative assumptions concerning the nature of learning, and whether learning and testing are situated or decontextualized. Addresses difficulties in implementing…

  7. Human experimental pain models: A review of standardized methods in drug development

    Directory of Open Access Journals (Sweden)

    K. Sunil kumar Reddy

    2012-01-01

    Full Text Available Human experimental pain models are essential in understanding pain mechanisms and appear to be ideally suited to testing analgesic compounds. The challenge that confronts both the clinician and the scientist is to match specific treatments to different pain-generating mechanisms and hence reach a pain treatment tailored to each individual patient. Experimental pain models offer the possibility to explore the pain system under controlled settings. Standardized stimuli of different modalities (i.e., mechanical, thermal, electrical, or chemical) can be applied to the skin, muscles, and viscera for a differentiated and comprehensive assessment of various pain pathways and mechanisms. Using multimodal, multistructure testing, the nociception arising from different body structures can be explored and the modulation of specific biomarkers by new and existing analgesic drugs can be profiled. The value of human experimental pain models is to link animal and clinical pain studies, providing new possibilities for designing successful clinical trials. Spontaneous pain is the main complaint of neuropathic patients, but currently there is no human model available that mimics chronic pain. Therefore, current human pain models cannot replace patient studies for assessing the efficacy of analgesic compounds, although they are helpful for proof-of-concept studies and dose finding.

  8. Subjective Video Quality Assessment in H.264/AVC Video Coding Standard

    Directory of Open Access Journals (Sweden)

    Z. Miličević

    2012-11-01

    Full Text Available This paper seeks to provide an approach for subjective video quality assessment in the H.264/AVC standard. For this purpose, a special software program for the subjective assessment of the quality of all tested video sequences was developed, in accordance with Recommendation ITU-T P.910, since it is suitable for the testing of multimedia applications. The obtained results show that in the proposed selective intra prediction and optimized inter prediction algorithm there is a small difference in picture quality (signal-to-noise ratio) between decoded original and modified video sequences.
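
    The objective counterpart of such picture-quality comparisons is typically the peak signal-to-noise ratio (PSNR); a minimal sketch, assuming 8-bit luma samples and tiny toy frames, is:

```python
import math

# Minimal PSNR computation between an original and a decoded frame,
# assuming 8-bit samples (peak value 255). The frames here are toy
# sample lists; real use would iterate over every luma sample of a
# video sequence.
original = [52, 55, 61, 66, 70, 61, 64, 73]
decoded  = [54, 55, 60, 67, 69, 62, 63, 72]

mse = sum((o - d) ** 2 for o, d in zip(original, decoded)) / len(original)
psnr = float("inf") if mse == 0 else 10 * math.log10(255 ** 2 / mse)
print(f"MSE: {mse:.2f}  PSNR: {psnr:.2f} dB")
```

    Subjective scores such as those gathered under ITU-T P.910 are then correlated against objective measures like this one when validating a coding change.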

  9. Health risk assessment standards of cyanobacteria bloom occurrence in bathing sites

    Directory of Open Access Journals (Sweden)

    Agnieszka Stankiewicz

    2011-03-01

    Full Text Available A threat to human health appears during massive cyanobacteria blooms in potable water used for human consumption or in basins used for recreational purposes. General health risk assessment standards and preventive measures to be taken by the sanitation service were presented in the scope of: – evaluation of cyanobacteria bloom occurrence in bathing sites / water bodies, – procedures in case of cyanobacteria bloom, including health risk assessment and the decision-making process to protect users’ health at bathing sites, – preventive measures to be taken in case of cyanobacteria bloom occurrence in bathing sites and basins where bathing sites are located.

  10. Standard Model CP-violation and baryon asymmetry

    CERN Document Server

    Gavela, M.B.; Orloff, J.; Pene, O.

    1994-01-01

    Simply based on CP arguments, we argue against a Standard Model explanation of the baryon asymmetry of the universe in the presence of a first order phase transition. A CP-asymmetry is found in the reflection coefficients of quarks hitting the phase boundary created during the electroweak transition. The problem is analyzed both in an academic zero temperature case and in the realistic finite temperature one. The building blocks are similar in both cases: Kobayashi-Maskawa CP-violation, CP-even phases in the reflection coefficients of quarks, and physical transitions due to fermion self-energies. In both cases an effect is present at order $\\alpha_W^2$ in rate. A standard GIM behaviour is found as intuitively expected. In the finite temperature case, a crucial role is played by the damping rate of quasi-particles in a hot plasma, which is a relevant scale together with $M_W$ and the temperature. The effect is many orders of magnitude below what observation requires, and indicates that non standard physics is ...

  11. Informal Assessment of Competences in the Context of Science Standards in Austria

    Science.gov (United States)

    Schiffl, Iris

    2016-01-01

    Science standards have been a topic in educational research in Austria for about ten years now. Starting in 2005, competency structure models have been developed for junior and senior classes of different school types. After evaluating these models, prototypic tasks were created to point out the meaning of the models to teachers. At the moment,…

  12. Adherence of pain assessment to the German national standard for pain management in 12 nursing homes

    OpenAIRE

    Osterbrink, Jürgen; Bauer, Zsuzsa; Mitterlehner, Barbara; Gnass, Irmela; Kutschar, Patrick

    2014-01-01

    BACKGROUND: Pain is very common among nursing home residents. The assessment of pain is a prerequisite for effective multiprofessional pain management. Within the framework of the German health services research project, ‘Action Alliance Pain-Free City Muenster’, the authors investigated pain assessment adherence according to the German national Expert Standard for Pain Management in Nursing, which is a general standard applicable to all chronic/acute pain-affected persons and highly recommen...

  13. HCPB TBM thermo-mechanical design: Assessment with respect to codes and standards and DEMO relevancy

    International Nuclear Information System (INIS)

    Cismondi, F.; Kecskes, S.; Aiello, G.

    2011-01-01

    In the frame of the activities of the European TBM Consortium of Associates, the Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) is developed at the Karlsruhe Institute of Technology (KIT). After performing detailed thermal and fluid dynamic analyses of the preliminary HCPB TBM design, the thermo-mechanical behaviour of the TBM under typical ITER loads has to be assessed. A synthesis of the different design options proposed has been realized by building two different assemblies of the HCPB-TBM; these two assemblies and the analyses performed on them are presented in this paper. Finite element thermo-mechanical analyses of two detailed 1/4-scaled models of the proposed HCPB-TBM assemblies have been performed, with the aim of verifying that the mechanical behaviour accords with the criteria of the design codes and standards. The structural design limits specified in the codes and standards are discussed in relation to the available EUROFER data and possible damage modes. Solutions to improve the weak structural points of the present design are identified, and the DEMO relevancy of the present thermal and structural design parameters is discussed.

  14. Almost-commutative geometries beyond the standard model II: new colours

    International Nuclear Information System (INIS)

    Stephan, Christoph A

    2007-01-01

    We will present an extension of the standard model of particle physics in its almost-commutative formulation. This extension is guided by the minimal approach to almost-commutative geometries employed by Iochum et al (2004 J. Math. Phys. 45 5003 (Preprint hep-th/0312276)), Jureit and Stephan (2005 J. Math. Phys. 46 043512 (Preprint hep-th/0501134)), Schuecker (2005 Preprint hep-th/0501181), Jureit et al (2005 J. Math. Phys. 46 072303 (Preprint hep-th/0503190)) and Jureit and Stephan (2006 Preprint hep-th/0610040), although the model presented here is not minimal itself. The corresponding almost-commutative geometry leads to a Yang-Mills-Higgs model which consists of the standard model and two new fermions of opposite electromagnetic charge which may possess a new colour-like gauge group. As a new phenomenon, grand unification is no longer required by the spectral action

  15. Beyond the Standard Model Higgs boson searches using the ATLAS Experiment

    CERN Document Server

    Tsukerman, Ilya; The ATLAS collaboration

    2014-01-01

    The discovery of a Higgs boson with a mass of about 125 GeV has prompted the question of whether or not this particle is part of a larger and more complex Higgs sector than that envisioned in the Standard Model. In this talk, the current results from the ATLAS experiment on Beyond the Standard Model (BSM) Higgs boson searches are outlined. The results are interpreted in well-motivated BSM Higgs frameworks.

  16. Impact of model uncertainty on soil quality standards for cadmium in rice paddy fields

    Energy Technology Data Exchange (ETDEWEB)

    Roemkens, P.F.A.M., E-mail: paul.romkens@wur.nl [Soil Science Center, Alterra, WageningenUR. P.O. Box 47, 6700AA Wageningen (Netherlands); Brus, D.J. [Soil Science Center, Alterra, WageningenUR. P.O. Box 47, 6700AA Wageningen (Netherlands); Guo, H.Y.; Chu, C.L.; Chiang, C.M. [Taiwan Agricultural Research Institute (TARI), Wufong, Taiwan (China); Koopmans, G.F. [Soil Science Center, Alterra, WageningenUR. P.O. Box 47, 6700AA Wageningen (Netherlands); Department of Soil Quality, Wageningen University, WageningenUR. P.O. Box 47, 6700AA, Wageningen (Netherlands)

    2011-08-01

    At present, soil quality standards used for agriculture do not consider the influence of pH and CEC on the uptake of pollutants by crops. A database with 750 selected paired samples of cadmium (Cd) in soil and paddy rice was used to calibrate soil-to-plant transfer models using the soil metal content, pH, and CEC or soil Cd and Zn extracted by 0.01 M CaCl₂ as explanatory variables. The models were validated against a set of 2300 data points not used in the calibration. These models were then used inversely to derive soil quality standards for Japonica and Indica rice cultivars based on the food quality standards for rice. To account for model uncertainty, strict soil quality standards were derived considering a maximum probability that rice exceeds the food quality standard equal to 10 or 5%. Model-derived soil standards based on Aqua Regia ranged from less than 0.3 mg kg⁻¹ for Indica at pH 4.5 to more than 6 mg kg⁻¹ for Japonica-type cultivars in clay soils at pH 7. Based on the CaCl₂ extract, standards ranged from 0.03 mg kg⁻¹ Cd for Indica cultivars to 0.1 mg kg⁻¹ Cd for Japonica cultivars. For both Japonica and Indica-type cultivars, the soil quality standards must be reduced by a factor of 2 to 3 to obtain the strict standards. The strong impact of pH and CEC on soil quality standards implies that it is essential to correct for soil type when deriving national or local standards. Validation on the remaining 2300 samples indicated that both types of models were able to accurately predict (> 92%) whether rice grown on a specific soil will meet the food quality standard used in Taiwan. - Research highlights: ► Cadmium uptake by Japonica and Indica rice varieties depends on soil pH and CEC. ► Food safety based soil standards range from 0.3 (Indica) to 6 mg kg⁻¹ (Japonica). ► Model uncertainty leads to strict soil standards of less than 0.1 mg kg⁻¹ for Indica. ► Soil pH and CEC should be
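    The inverse use of such a transfer model can be illustrated with a toy log-linear version. The coefficients below are invented for illustration and are not the values calibrated in the study; a CEC term would enter the same way as the pH term:

```python
import math

# Hypothetical coefficients of a log-linear soil-to-plant transfer model:
# log10(Cd_rice) = A + B*log10(Cd_soil) + C*pH   (illustrative values only)
A, B, C = -0.5, 0.7, -0.25

def rice_cd(soil_cd: float, ph: float) -> float:
    """Predicted Cd in rice (mg/kg) for a given soil Cd (mg/kg) and pH."""
    return 10 ** (A + B * math.log10(soil_cd) + C * ph)

def soil_standard(food_standard: float, ph: float) -> float:
    """Invert the model: highest soil Cd whose predicted rice Cd meets the standard."""
    return 10 ** ((math.log10(food_standard) - A - C * ph) / B)

# The derived soil standard rises with pH, mirroring the pattern in the study.
low = soil_standard(0.4, 4.5)   # acidic soil  -> strict soil standard
high = soil_standard(0.4, 7.0)  # neutral soil -> more lenient soil standard
```

    The strict standards of the study additionally shift the inverted value downward by the model's prediction uncertainty, so that only 5–10% of fields at the standard would exceed the food limit.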

  17. Assessment of a Low-Cost Ultrasound Pericardiocentesis Model

    Directory of Open Access Journals (Sweden)

    Marco Campo dell'Orto

    2013-01-01

    Full Text Available Introduction. The use of ultrasound during resuscitation is emphasized in the latest European resuscitation council guidelines of 2013 to identify treatable conditions such as pericardial tamponade. The recommended standard treatment of tamponade in various guidelines is pericardiocentesis. As ultrasound guidance lowers the complication rates and increases the patient’s safety, pericardiocentesis should be performed under ultrasound guidance. Acute care physicians therefore need to train emergency pericardiocentesis. Methods. We describe in detail a pericardiocentesis ultrasound model, using materials at a cost of about 60 euros. During training courses of focused echocardiography (n = 67), participants tested the phantom and completed a 16-item questionnaire, assessing the model using a visual analogue scale (VAS). Results. Eleven of fourteen questions were answered with a mean VAS score higher than 60% and thus regarded as showing the strengths of the model. Unrealistic outer appearance and heart shape were rated as weaknesses of the model. A total mean VAS score of 63% across all questions showed that participants gained confidence for further interventions. Conclusions. Our low-cost pericardiocentesis model, which can be easily constructed, may serve as an effective training tool of ultrasound-guided pericardiocentesis for acute and critical care physicians.

  18. Assessment of a Low-Cost Ultrasound Pericardiocentesis Model

    Science.gov (United States)

    Campo dell'Orto, Marco; Hempel, Dorothea; Starzetz, Agnieszka; Seibel, Armin; Hannemann, Ulf; Walcher, Felix; Breitkreutz, Raoul

    2013-01-01

    Introduction. The use of ultrasound during resuscitation is emphasized in the latest European resuscitation council guidelines of 2013 to identify treatable conditions such as pericardial tamponade. The recommended standard treatment of tamponade in various guidelines is pericardiocentesis. As ultrasound guidance lowers the complication rates and increases the patient's safety, pericardiocentesis should be performed under ultrasound guidance. Acute care physicians therefore need to train emergency pericardiocentesis. Methods. We describe in detail a pericardiocentesis ultrasound model, using materials at a cost of about 60 euros. During training courses of focused echocardiography (n = 67), participants tested the phantom and completed a 16-item questionnaire, assessing the model using a visual analogue scale (VAS). Results. Eleven of fourteen questions were answered with a mean VAS score higher than 60% and thus regarded as showing the strengths of the model. Unrealistic outer appearance and heart shape were rated as weaknesses of the model. A total mean VAS score of 63% across all questions showed that participants gained confidence for further interventions. Conclusions. Our low-cost pericardiocentesis model, which can be easily constructed, may serve as an effective training tool of ultrasound-guided pericardiocentesis for acute and critical care physicians. PMID:24288616

  19. Simulating the SU(2) sector of the standard model with dynamical fermions

    International Nuclear Information System (INIS)

    Lee, I. Hsiu.

    1988-01-01

    The two-generation SU(2) sector of the standard model with zero Yukawa couplings is studied on the lattice. The results from analytic studies and simulations with quenched fermions are reviewed. The methods and results of a Langevin simulation with dynamical fermions are presented. Implications for the strongly coupled standard model are mentioned. 23 refs

  20. Assessment and improvement of condensation models in RELAP5/MOD3.2

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Ki Yong; Park, Hyun Sik; Kim, Sang Jae; No, Hee Chen [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    The condensation models in the standard RELAP5/MOD3.2 code are assessed and improved based on a database constructed from previous experimental data on various condensation phenomena. The default model of laminar film condensation in RELAP5/MOD3.2 does not give reliable predictions, and its alternative model always predicts higher values than the experimental data. Therefore, a new correlation needs to be developed based on the experimental data covering the various operating ranges in the constructed database. The Shah correlation, which is used to calculate the turbulent film condensation heat transfer coefficients in the standard RELAP5/MOD3.2, predicts the experimental data in the database well. The horizontally stratified condensation model of RELAP5/MOD3.2 overpredicts both cocurrent and countercurrent experimental data. The correlation proposed by H.J. Kim predicts the database relatively well compared with that of RELAP5/MOD3.2. The RELAP5/MOD3.2 model should use the liquid velocity for the calculation of the liquid Reynolds number and be modified to consider the effects of the gas velocity and the film thickness. 2 refs., 5 figs., 1 tab. (Author)

  1. Assessment and improvement of condensation models in RELAP5/MOD3.2

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Ki Yong; Park, Hyun Sik; Kim, Sang Jae; No, Hee Chen [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    The condensation models in the standard RELAP5/MOD3.2 code are assessed and improved based on a database constructed from previous experimental data on various condensation phenomena. The default model of laminar film condensation in RELAP5/MOD3.2 does not give reliable predictions, and its alternative model always predicts higher values than the experimental data. Therefore, a new correlation needs to be developed based on the experimental data covering the various operating ranges in the constructed database. The Shah correlation, which is used to calculate the turbulent film condensation heat transfer coefficients in the standard RELAP5/MOD3.2, predicts the experimental data in the database well. The horizontally stratified condensation model of RELAP5/MOD3.2 overpredicts both cocurrent and countercurrent experimental data. The correlation proposed by H.J. Kim predicts the database relatively well compared with that of RELAP5/MOD3.2. The RELAP5/MOD3.2 model should use the liquid velocity for the calculation of the liquid Reynolds number and be modified to consider the effects of the gas velocity and the film thickness. 2 refs., 5 figs., 1 tab. (Author)
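    The Shah correlation referred to in these records has a simple closed form. A sketch of its widely used 1979 version for in-tube film condensation follows; property values are illustrative, and this is a generic statement of the correlation, not the RELAP5 implementation:

```python
def shah_condensation_htc(G, D, x, k_l, mu_l, pr_l, p_reduced):
    """
    Turbulent film condensation heat transfer coefficient (W/m^2 K), Shah (1979).
    G: mass flux (kg/m^2 s), D: tube diameter (m), x: vapour quality (0..1),
    k_l, mu_l, pr_l: liquid conductivity, viscosity, Prandtl number,
    p_reduced: pressure divided by critical pressure.
    """
    re_l = G * D / mu_l  # all-liquid Reynolds number
    h_l = 0.023 * re_l ** 0.8 * pr_l ** 0.4 * k_l / D  # Dittus-Boelter, liquid alone
    return h_l * ((1 - x) ** 0.8
                  + 3.8 * x ** 0.76 * (1 - x) ** 0.04 / p_reduced ** 0.38)

# Example: steam condensing in a 20 mm tube (illustrative numbers).
h = shah_condensation_htc(G=200.0, D=0.02, x=0.5, k_l=0.68,
                          mu_l=2.8e-4, pr_l=1.75, p_reduced=0.05)
```

    The two-phase multiplier reduces to 1 at zero quality, so the coefficient always exceeds the all-liquid Dittus-Boelter value, consistent with the enhancement condensing flows exhibit.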

  2. e⁺e⁻ interactions at very high energy: searching beyond the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Dorfan, J.

    1983-04-01

    These lectures discuss e⁺e⁻ interactions at very high energies, with a particular emphasis on searching beyond the standard model, which we take to be SU(3)_color ⊗ SU(2) ⊗ U(1). The highest e⁺e⁻ collision energy exploited to date is at PETRA, where data have been taken at 38 GeV. We will consider energies above this to be the very high energy frontier. The lectures will begin with a review of the collision energies which will be available in the upgraded machines of today and the machines planned for tomorrow. Without going into great detail, we will define the essential elements of the standard model. We will remind ourselves that some of these essential elements have not yet been verified and that part of the task of searching beyond the standard model will involve experiments aimed at this verification. For if we find the standard model lacking, then clearly we are forced to find an alternative. So we will investigate how the higher energy e⁺e⁻ collisions can be used to search for the top quark and the neutral Higgs scalar, provide true verification of the non-Abelian nature of QCD, etc. Having done this we will look at tests of models involving simple extensions of the standard model. Models considered are those without a top quark, those with charged Higgs scalars, with multiple and/or composite vector bosons, with additional generations, and possible alternative explanations for the PETRA three-jet events which don't require gluon bremsstrahlung. From the simple extensions of the standard model we will move to more radical alternatives, alternatives which have arisen from unhappiness with the gauge hierarchy problem of the standard model. Technicolor, supersymmetry and composite models will be discussed. In the final section we will summarize what the future holds in terms of the search beyond the standard model.

  3. Noncommutative geometry and its application to the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Martinetti, Pierre [Georg-August Universitaet, Goettingen (Germany)

    2009-07-01

    We give an overview of the description of the standard model of particle physics minimally coupled to gravity within the framework of noncommutative geometry. In particular, we study in detail the metric structure of spacetime that emerges from the spectral triple recently proposed by Chamseddine, Connes and Marcolli. Within this framework, points of spacetime acquire an internal structure inherited from the gauge group of the standard model. A distance is defined on this generalized spacetime which is fully encoded by the Yang-Mills gauge fields together with the Higgs field. We focus on some explicit examples, underlining the link between this distance and other distances well known by physicists and mathematicians, such as the Carnot-Caratheodory horizontal distance or the Monge-Kantorovitch transport distance.

  4. Overview of the Standard Model Measurements with the ATLAS Detector

    CERN Document Server

    Liu, Yanwen; The ATLAS collaboration

    2017-01-01

    The ATLAS Collaboration is engaged in precision measurements of fundamental Standard Model parameters, such as the W boson mass, the weak mixing angle and the strong coupling constant. In addition, the production cross-sections of a large variety of final states involving highly energetic jets, photons, as well as single and multiple vector bosons, are measured multi-differentially at several centre-of-mass energies. This allows perturbative QCD calculations to be tested to the highest precision. These measurements also allow tests of models beyond the SM, e.g. those leading to anomalous gauge couplings. In this talk, we give a broad overview of the Standard Model measurement campaign of the ATLAS Collaboration, and selected topics are discussed in more detail.

  5. Distinguishing standard model extensions using monotop chirality at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Allahverdi, Rouzbeh [Department of Physics and Astronomy, University of New Mexico, Albuquerque, NM 87131 (United States); Dalchenko, Mykhailo; Dutta, Bhaskar [Department of Physics and Astronomy, Mitchell Institute for Fundamental Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States); Flórez, Andrés [Departamento de Física, Universidad de los Andes, Bogotá, Carrera 1 18A-10, Bloque IP (Colombia); Gao, Yu [Department of Physics and Astronomy, Mitchell Institute for Fundamental Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States); Kamon, Teruki [Department of Physics and Astronomy, Mitchell Institute for Fundamental Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States); Department of Physics, Kyungpook National University, Daegu 702-701 (Korea, Republic of); Kolev, Nikolay [Department of Physics, University of Regina, SK, S4S 0A2 (Canada); Mueller, Ryan [Department of Physics and Astronomy, Mitchell Institute for Fundamental Physics and Astronomy, Texas A&M University, College Station, TX 77843-4242 (United States); Segura, Manuel [Departamento de Física, Universidad de los Andes, Bogotá, Carrera 1 18A-10, Bloque IP (Colombia)

    2016-12-13

    We present two minimal extensions of the standard model, each giving rise to baryogenesis. They include heavy color-triplet scalars interacting with a light Majorana fermion that can be the dark matter (DM) candidate. The electroweak charges of the new scalars govern their couplings to quarks of different chirality, which leads to different collider signals. These models predict monotop events at the LHC and the energy spectrum of decay products of highly polarized top quarks can be used to establish the chiral nature of the interactions involving the heavy scalars and the DM. Detailed simulation of signal and standard model background events is performed, showing that top quark chirality can be distinguished in hadronic and leptonic decays of the top quarks.

  6. Dimensional reduction of the Standard Model coupled to a new singlet scalar field

    Energy Technology Data Exchange (ETDEWEB)

    Brauner, Tomáš [Faculty of Science and Technology, University of Stavanger, N-4036 Stavanger (Norway); Tenkanen, Tuomas V.I. [Department of Physics and Helsinki Institute of Physics, P.O. Box 64, FI-00014 University of Helsinki (Finland); Tranberg, Anders [Faculty of Science and Technology, University of Stavanger, N-4036 Stavanger (Norway); Vuorinen, Aleksi [Department of Physics and Helsinki Institute of Physics, P.O. Box 64, FI-00014 University of Helsinki (Finland); Weir, David J. [Faculty of Science and Technology, University of Stavanger, N-4036 Stavanger (Norway); Department of Physics and Helsinki Institute of Physics, P.O. Box 64, FI-00014 University of Helsinki (Finland)

    2017-03-01

    We derive an effective dimensionally reduced theory for the Standard Model augmented by a real singlet scalar. We treat the singlet as a superheavy field and integrate it out, leaving an effective theory involving only the Higgs and SU(2)_L × U(1)_Y gauge fields, identical to the one studied previously for the Standard Model. This opens up the possibility of efficiently computing the order and strength of the electroweak phase transition, numerically and nonperturbatively, in this extension of the Standard Model. Understanding the phase diagram is crucial for models of electroweak baryogenesis and for studying the production of gravitational waves at thermal phase transitions.

  7. Study on Modelling Standardization of Double-fed Wind Turbine and Its Application

    Directory of Open Access Journals (Sweden)

    Li Xiang

    2016-01-01

    Full Text Available Based on the standardized modelling of the International Modelling Team, this paper studies the double-fed induction generator (DFIG) wind turbine, aiming at a model that universally and reasonably reflects the key performance relevant to large-scale system analysis. The proposed standardized model offers a high degree of structural modularity, easy functional extension, and universalization of control strategies and signals. Moreover, it is applicable to wind turbines produced by different manufacturers through model parameter adjustment. The complexity of the model can meet the needs both of grid-connected characteristic simulation of a wind turbine and of large-scale power system simulation.

  8. Low-energy photon-neutrino inelastic processes beyond the Standard Model

    CERN Document Server

    Abada, A.; Pittau, R.

    1999-01-01

    We investigate in this work the leading contributions of the MSSM with R-parity violation and of Left-Right models to the low-energy five-leg photon-neutrino processes. We discuss the results and compare them to the Standard Model ones.

  9. Numerical Models of Sewage Dispersion and Statistical Bathing Water Standards

    DEFF Research Database (Denmark)

    Petersen, Ole; Larsen, Torben

    1991-01-01

    As bathing water standards are usually founded on statistical methods, the numerical models used in outfall design should reflect this. A statistical approach, in which stochastic variations in source strength and bacterial disappearance are incorporated into a numerical dilution model, is presented. ...
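    The kind of statistical approach described here, stochastic source strength and bacterial disappearance feeding a dilution model that is then compared against a percentile-based standard, can be sketched as a Monte Carlo calculation. All distributions, dilution factors and limit values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo realisations

# Stochastic source strength (bacteria per 100 ml at the outfall) and
# stochastic T90 (hours for 90% bacterial disappearance): illustrative lognormals.
source = rng.lognormal(mean=np.log(1e6), sigma=0.5, size=n)
t90 = rng.lognormal(mean=np.log(8.0), sigma=0.3, size=n)

dilution = 500.0        # physical dilution factor at the bathing site
travel_time = 6.0       # hours from outfall to bathing site
decay = 10.0 ** (-travel_time / t90)  # first-order bacterial disappearance

concentration = source / dilution * decay

# Percentile-based bathing water standard: e.g. the 95th percentile of the
# concentration at the bathing site must stay below a limit value.
p95 = np.percentile(concentration, 95)
exceedance = np.mean(concentration > 500.0)  # fraction of realisations over limit
```

    The design question then becomes whether `p95` (or the exceedance fraction) meets the statutory percentile, rather than whether a single deterministic dilution estimate does.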

  10. Higgs Boson Properties in the Standard Model and its Supersymmetric Extensions

    CERN Document Server

    Ellis, Jonathan Richard; Zwirner, F; Ellis, John; Ridolfi, Giovanni; Zwirner, Fabio

    2007-01-01

    We review the realization of the Brout-Englert-Higgs mechanism in the electroweak theory and describe the experimental and theoretical constraints on the mass of the single Higgs boson expected in the minimal Standard Model. We also discuss the couplings of this Higgs boson and its possible decay modes as functions of its unknown mass. We then review the structure of the Higgs sector in the minimal supersymmetric extension of the Standard Model (MSSM), noting the importance of loop corrections to the masses of its five physical Higgs bosons. Finally, we discuss some non-minimal models.

  11. Higgs boson properties in the Standard Model and its supersymmetric extensions

    International Nuclear Information System (INIS)

    Ellis, J.; Ridolfi, G.; Zwirner, F.

    2007-01-01

    We review the realization of the Brout-Englert-Higgs mechanism in the electroweak theory and describe the experimental and theoretical constraints on the mass of the single Higgs boson expected in the minimal Standard Model. We also discuss the couplings of this Higgs boson and its possible decay modes as functions of its unknown mass. We then review the structure of the Higgs sector in the minimal supersymmetric extension of the Standard Model (MSSM), noting the importance of loop corrections to the masses of its 5 physical Higgs bosons. Finally, we discuss some non-minimal models. (authors)

  12. SLHAplus: A library for implementing extensions of the standard model

    Science.gov (United States)

    Bélanger, G.; Christensen, Neil D.; Pukhov, A.; Semenov, A.

    2011-03-01

    We provide a library to facilitate the implementation of new models in codes such as matrix element and event generators or codes for computing dark matter observables. The library contains an SLHA reader routine as well as diagonalisation routines. This library is available in CalcHEP and micrOMEGAs. The implementation of models based on this library is supported by LanHEP and FeynRules. Program summary: Program title: SLHAplus_1.3 Catalogue identifier: AEHX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 6283 No. of bytes in distributed program, including test data, etc.: 52 119 Distribution format: tar.gz Programming language: C Computer: IBM PC, MAC Operating system: UNIX (Linux, Darwin, Cygwin) RAM: 2000 MB Classification: 11.1 Nature of problem: Implementation of extensions of the standard model in matrix element and event generators and codes for dark matter observables. Solution method: For generic extensions of the standard model we provide routines for reading files that adopt the standard format of the SUSY Les Houches Accord (SLHA) file. The procedure has been generalized to take into account an arbitrary number of blocks so that the reader can be used in generic models including non-supersymmetric ones. The library also contains routines to diagonalize real and complex mass matrices with either unitary or bi-unitary transformations as well as routines for evaluating the running strong coupling constant, running quark masses and effective quark masses. Running time: 0.001 sec
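    The SLHA format that such a reader handles is plain block-structured text. A minimal sketch of a generic reader follows, written in Python rather than the library's C, and ignoring SLHA corner cases such as `Q=` scale tags and string-valued entries:

```python
def read_slha(text: str) -> dict:
    """Parse SLHA-style text into {block_name: {key_tuple: value}}."""
    blocks, current = {}, None
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # strip trailing comments
        if not line:
            continue
        tokens = line.split()
        if tokens[0].upper() == "BLOCK":
            current = tokens[1].upper()
            blocks[current] = {}
        elif current is not None:
            *keys, value = tokens  # integer indices, then the numeric value
            blocks[current][tuple(int(k) for k in keys)] = float(value)
    return blocks

sample = """
Block MASS   # particle masses
   25   1.25e+02   # h0
   6    1.73e+02   # top
Block SMINPUTS
   3    1.18e-01   # alpha_s(MZ)
"""
data = read_slha(sample)
```

    Treating keys as tuples of integers is what lets one reader serve arbitrary blocks, including matrix-valued ones with two indices, which mirrors the generalization to an arbitrary number of blocks described in the record.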

  13. Assessment of N and P status at the landscape scale using environmental models and measurements

    International Nuclear Information System (INIS)

    Sonneveld, M.P.W.; Vos, J.A. de; Kros, J.; Knotters, M.; Frumau, A.; Bleeker, A.; Vries, W. de

    2012-01-01

    We assessed the compliance of a Dutch landscape, dominated by dairy farming, with environmental quality standards using a combination of model calculations and measurements. The total ammonia emission of 2.4 kton NH₃ yr⁻¹ does not exceed the environmental quality standard (2.6 kton NH₃ yr⁻¹). Nevertheless, the total N deposition (on average 24.4 kg N ha⁻¹ yr⁻¹) is such that critical N loads are exceeded at 53% of the nature areas. The deposited N mainly results from non-agricultural sources and agricultural sources outside the area (72%). The calculated average NO₃⁻ concentration in the upper groundwater does not exceed the 50 mg l⁻¹ threshold. Calculated annual average N-total and P-total concentrations in discharge water are relatively high, but these cannot be directly compared with thresholds for surface water. The results suggest that compliance monitoring at the landscape scale needs to include source indicators and cannot be based on state indicators alone. - Highlights: ► There is scope for environmental monitoring programs at the landscape scale. ► Landscape assessment of state indicators for N and P requires models and measurements. ► Monitoring at the landscape scale needs to consider farm management indicators. - The compliance of an agricultural landscape with quality standards is investigated using a combination of model calculations and measurements.

  14. Collider physics within the standard model a primer

    CERN Document Server

    Altarelli, Guido

    2017-01-01

    With this graduate-level primer, the principles of the standard model of particle physics receive a particularly skillful, personal and enduring exposition by one of the great contributors to the field. In 2013 the late Prof. Altarelli wrote: The discovery of the Higgs boson and the non-observation of new particles or exotic phenomena have made a big step towards completing the experimental confirmation of the standard model of fundamental particle interactions. It is thus a good moment for me to collect, update and improve my graduate lecture notes on quantum chromodynamics and the theory of electroweak interactions, with main focus on collider physics. I hope that these lectures can provide an introduction to the subject for the interested reader, assumed to be already familiar with quantum field theory and some basic facts in elementary particle physics as taught in undergraduate courses. “These lecture notes are a beautiful example of Guido’s unique pedagogical abilities and scientific vision”. From...

  15. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
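    The stochastic procedure recommended here, translating uncertain parameter estimates into a distribution of predicted values and ranking parameters by their contribution to the overall uncertainty, can be sketched generically. The multiplicative dose model and all distributions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000  # Monte Carlo realisations

# Uncertain parameters of a toy ingestion-dose model: dose = intake * transfer * dose_factor.
intake = rng.lognormal(np.log(100.0), 0.2, n)        # Bq/d, narrow uncertainty
transfer = rng.lognormal(np.log(0.01), 0.8, n)       # d/kg, wide uncertainty
dose_factor = rng.lognormal(np.log(1.3e-8), 0.4, n)  # Sv/Bq, moderate uncertainty

dose = intake * transfer * dose_factor  # distribution of predicted doses

# Rank parameters by the correlation of log-parameter with log-output:
# the parameter with the widest (log) distribution dominates the uncertainty.
params = {"intake": intake, "transfer": transfer, "dose_factor": dose_factor}
ranking = sorted(
    params,
    key=lambda name: abs(np.corrcoef(np.log(params[name]), np.log(dose))[0, 1]),
    reverse=True,
)
```

    For a purely multiplicative model the ranking simply follows the log-standard deviations; for nonlinear models, rank correlations on the raw samples serve the same purpose.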

  16. Adherence of Pain Assessment to the German National Standard for Pain Management in 12 Nursing Homes

    Directory of Open Access Journals (Sweden)

    Jürgen Osterbrink

    2014-01-01

    Full Text Available BACKGROUND: Pain is very common among nursing home residents. The assessment of pain is a prerequisite for effective multiprofessional pain management. Within the framework of the German health services research project, ‘Action Alliance Pain-Free City Muenster’, the authors investigated pain assessment adherence according to the German national Expert Standard for Pain Management in Nursing, which is a general standard applicable to all chronic/acute pain-affected persons and highly recommended for practice.

  17. When do latent class models overstate accuracy for diagnostic and other classifiers in the absence of a gold standard?

    Science.gov (United States)

    Spencer, Bruce D

    2012-06-01

    Latent class models are increasingly used to assess the accuracy of medical diagnostic tests and other classifications when no gold standard is available and the true state is unknown. When the latent class is treated as the true class, the latent class models provide measures of components of accuracy including specificity and sensitivity and their complements, type I and type II error rates. The error rates according to the latent class model differ from the true error rates, however, and empirical comparisons with a gold standard suggest the true error rates often are larger. We investigate conditions under which the true type I and type II error rates are larger than those provided by the latent class models. Results from Uebersax (1988, Psychological Bulletin 104, 405-416) are extended to accommodate random effects and covariates affecting the responses. The results are important for interpreting the results of latent class analyses. An error decomposition is presented that incorporates an error component from invalidity of the latent class model. © 2011, The International Biometric Society.
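    The accuracy components this abstract refers to (sensitivity and specificity, with type II and type I error rates as their complements) can be illustrated by simulating an imperfect binary test against a known true state; of course, the paper's point is precisely that this truth is unobservable without a gold standard. A minimal sketch with made-up prevalence and accuracy values:

```python
import random

random.seed(0)

# Simulate a "true" disease state (the latent class that, in practice, is
# unobservable without a gold standard) and an imperfect binary test.
TRUE_SENS, TRUE_SPEC, PREV = 0.85, 0.90, 0.30
N = 100_000

tp = fp = tn = fn = 0
for _ in range(N):
    diseased = random.random() < PREV
    positive = (random.random() < TRUE_SENS) if diseased else (random.random() >= TRUE_SPEC)
    if diseased and positive:
        tp += 1
    elif diseased:
        fn += 1
    elif positive:
        fp += 1
    else:
        tn += 1

sens = tp / (tp + fn)  # complement 1 - sens is the type II error rate
spec = tn / (tn + fp)  # complement 1 - spec is the type I error rate
print(f"sensitivity ≈ {sens:.3f} (type II ≈ {1 - sens:.3f}), "
      f"specificity ≈ {spec:.3f} (type I ≈ {1 - spec:.3f})")
```

    A latent class analysis estimates these quantities without the `diseased` flag; the abstract's result is that such estimates tend to understate the true error rates when the model's assumptions (e.g. conditional independence) fail.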

  18. CP violation for electroweak baryogenesis from mixing of standard model and heavy vector quarks

    International Nuclear Information System (INIS)

    McDonald, J.

    1996-01-01

    It is known that the CP violation in the minimal standard model is insufficient to explain the observed baryon asymmetry of the Universe in the context of electroweak baryogenesis. In this paper we consider the possibility that the additional CP violation required could originate in the mixing of the standard model quarks and heavy vector quark pairs. We consider the baryon asymmetry in the context of the spontaneous baryogenesis scenario. It is shown that, in general, the CP-violating phase entering the mass matrix of the standard model and heavy vector quarks must be space dependent in order to produce a baryon asymmetry, suggesting that the additional CP violation must be spontaneous in nature. This is true for the case of the simplest models which mix the standard model and heavy vector quarks. We derive a charge potential term for the model by diagonalizing the quark mass matrix in the presence of the electroweak bubble wall, which turns out to be quite different from the fermionic hypercharge potentials usually considered in spontaneous baryogenesis models, and obtain the rate of baryon number generation within the wall. We find, for the particular example where the standard model quarks mix with weak-isodoublet heavy vector quarks via the expectation value of a gauge singlet scalar, that we can account for the observed baryon asymmetry with conservative estimates for the uncertain parameters of electroweak baryogenesis, provided that the heavy vector quarks are not heavier than a few hundred GeV and that the coupling of the standard model quarks to the heavy vector quarks and gauge singlet scalars is not much smaller than order of 1, corresponding to a mixing angle of the heavy vector quarks and standard model quarks not much smaller than order of 10⁻¹. copyright 1996 The American Physical Society

  19. LHC 2008 talks "What’s at stake for the Standard Model?"

    CERN Multimedia

    2008-01-01

    All the visible matter in the Universe can be described by the Standard Model. According to this theory, matter consists of atoms, which are made up of electrons orbiting around nuclei, whose fundamental building blocks are known as the quarks. Four fundamental forces govern interactions between the elementary particles: the electromagnetic force, the gravitational force, and the strong and weak nuclear interactions. Experiments have fully borne out the description that the Standard Model gives us of these particles and their interactions. However, some fundamental questions remain unresolved: what is the origin of particle mass? Why do so many different types of particles exist? Is there a unified theory that could explain all interactions? What is the nature of the dark matter postulated by astrophysicists? CERN’s LHC will provide clues to resolving these questions beyond the Standard Model. Thursday, 29 May 2008 at 8.00 p.m....

  20. Domain walls in the extensions of the Standard Model

    Science.gov (United States)

    Krajewski, Tomasz; Lalak, Zygmunt; Lewicki, Marek; Olszewski, Paweł

    2018-05-01

    Our main interest is the evolution of domain walls of the Higgs field in the early Universe. The aim of this paper is to understand how the dynamics of Higgs domain walls could be influenced by yet unknown interactions from beyond the Standard Model. We assume that the Standard Model is valid up to a certain high energy scale Λ and use the framework of effective field theory to describe physics below that scale. Performing numerical simulations with different values of the scale Λ we are able to extend our previous analysis [1]. Our recent numerical simulations show that the evolution of Higgs domain walls is rather insensitive to interactions beyond the Standard Model as long as the masses of new particles are greater than 10¹² GeV. For lower values of Λ the RG-improved effective potential is strongly modified at field strengths crucial to the evolution of domain walls. However, we find that even for low values of Λ, Higgs domain walls decayed shortly after their formation for generic initial conditions. On the other hand, in simulations with specifically chosen initial conditions Higgs domain walls can live longer and enter the scaling regime. We also determine the energy spectrum of gravitational waves produced by decaying domain walls of the Higgs field. For generic initial field configurations the amplitude of the signal is too small to be observed in planned detectors.

  1. Effect of standardized training on the reliability of the Cochrane risk of bias assessment tool: a study protocol.

    Science.gov (United States)

    da Costa, Bruno R; Resta, Nina M; Beckett, Brooke; Israel-Stahre, Nicholas; Diaz, Alison; Johnston, Bradley C; Egger, Matthias; Jüni, Peter; Armijo-Olivo, Susan

    2014-12-13

    The Cochrane risk of bias (RoB) tool has been widely embraced by the systematic review community, but several studies have reported that its reliability is low. We aim to investigate whether training of raters, including objective and standardized instructions on how to assess risk of bias, can improve the reliability of this tool. We describe the methods that will be used in this investigation and present an intensive standardized training package for risk of bias assessment that could be used by contributors to the Cochrane Collaboration and other reviewers. This is a pilot study. We will first perform a systematic literature review to identify randomized clinical trials (RCTs) that will be used for risk of bias assessment. Using the identified RCTs, we will then do a randomized experiment, where raters will be allocated to two different training schemes: minimal training and intensive standardized training. We will calculate the chance-corrected weighted Kappa with 95% confidence intervals to quantify within- and between-group Kappa agreement for each of the domains of the risk of bias tool. To calculate between-group Kappa agreement, we will use risk of bias assessments from pairs of raters after resolution of disagreements. Between-group Kappa agreement will quantify the agreement between the risk of bias assessment of raters in the training groups and the risk of bias assessment of experienced raters. To compare agreement of raters under different training conditions, we will calculate differences between Kappa values with 95% confidence intervals. This study will investigate whether the reliability of the risk of bias tool can be improved by training raters using standardized instructions for risk of bias assessment. One group of inexperienced raters will receive intensive training on risk of bias assessment and the other will receive minimal training. By including a control group with minimal training, we will attempt to mimic what many review authors
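    The chance-corrected weighted kappa the protocol relies on can be sketched in a few lines. This is an illustrative implementation with linear disagreement weights and toy ratings (not the study's data):

```python
def weighted_kappa(rater1, rater2, categories):
    """Chance-corrected kappa with linear disagreement weights (ordinal scale)."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    # Observed joint proportions and the raters' marginal distributions
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1 / n
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # Linear disagreement weights: 0 on the diagonal, 1 at maximal distance
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    p_obs = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    p_exp = sum(w[i][j] * row[i] * col[j] for i in range(k) for j in range(k))
    return 1 - p_obs / p_exp

# Toy ratings: two raters judging risk of bias on an ordinal low/unclear/high scale
cats = ["low", "unclear", "high"]
r1 = ["low", "low", "unclear", "high", "high", "unclear", "low", "high"]
r2 = ["low", "unclear", "unclear", "high", "unclear", "unclear", "low", "high"]
print(round(weighted_kappa(r1, r2, cats), 3))  # → 0.714
```

    The confidence intervals mentioned in the protocol would be obtained on top of this statistic, e.g. by bootstrap or an asymptotic variance formula.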

  2. Charged and neutral minimal supersymmetric standard model Higgs ...

    Indian Academy of Sciences (India)

    pp. 759–763. Charged and neutral minimal supersymmetric standard model Higgs boson decays and measurement of tan β at the compact linear collider. E Coniavitis and A Ferrari, Department of Nuclear and Particle Physics, Uppsala University, 75121 Uppsala, Sweden. E-mail: ferrari@tsl.uu.se.

  3. Non-small cell carcinoma: Comparison of postoperative intra- and extrathoracic recurrence assessment capability of qualitatively and/or quantitatively assessed FDG-PET/CT and standard radiological examinations

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Yumiko, E-mail: onitan@med.kobe-u.ac.jp [Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017, Hyogo (Japan); Ohno, Yoshiharu, E-mail: yosirad@kobe-u.ac.jp [Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017, Hyogo (Japan); Koyama, Hisanobu; Nogami, Munenobu; Takenaka, Daisuke [Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017, Hyogo (Japan); Matsumoto, Keiko [Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017, Hyogo (Japan); Department of Radiology, Yamanashi University, Shimokato, Yamanashi (Japan); Yoshikawa, Takeshi; Matsumoto, Sumiaki [Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017, Hyogo (Japan); Maniwa, Yoshimasa [Division of Thoracic Surgery, Kobe University Graduate School of Medicine, Kobe, Hyogo (Japan); Nishimura, Yoshihiro [Division of Respiratory Medicine, Department of Internal Medicine, Kobe University Graduate School of Medicine, Kobe, Hyogo (Japan); Sugimura, Kazuro [Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017, Hyogo (Japan)

    2011-09-15

    Purpose: The purpose of this study was to compare the capability of integrated FDG-PET/CT for assessment of postoperative intra- and extrathoracic recurrence in non-small cell lung cancer (NSCLC) patients with that of standard radiological examinations. Materials and methods: A total of 121 consecutive pathologically diagnosed NSCLC patients (80 males, 41 females; mean age, 71 years) underwent pathologically and surgically confirmed complete resection, followed by prospective integrated FDG-PET/CT and standard radiological examinations. Final diagnosis of recurrence was based on the results of more than 12 months of follow-up and/or pathological examinations. The probability of recurrence was assessed with either method for each patient by using 5-point visual scoring system, and final diagnosis was made by consensus between two readers. ROC analysis was used to compare the capability of the two methods for assessment of postoperative recurrence on a per-patient basis. The ROC-based positive test was used to determine optimal cut-off value for FDG uptake measurement at a site suspected on the basis of qualitatively assessed PET/CT. Finally, sensitivities, specificities and accuracies of all methods were compared by means of McNemar's test. Results: Areas under the curve of qualitatively assessed PET/CT and standard radiological examinations showed no significant differences (p > 0.05). At an optimal cut-off value of 2.5, specificity and accuracy of quantitatively and qualitatively assessed PET/CT were significantly higher than those of qualitatively assessed PET/CT and standard radiological examinations (p < 0.05). Conclusion: Accuracy of assessment of postoperative intra- and extrathoracic recurrence in NSCLC patients by qualitative and/or quantitative FDG-PET/CT is equivalent to or higher than that by standard radiological examinations.
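    The McNemar comparison used above reduces to a simple statistic on the discordant pairs of a paired binary outcome; a minimal illustrative sketch (the counts are made up, not from the study):

```python
# McNemar's chi-square (1 df, continuity-corrected) on the discordant pairs:
# b = patients classified correctly by method A only, c = by method B only.
def mcnemar_chi2(b: int, c: int) -> float:
    return (abs(b - c) - 1) ** 2 / (b + c)

chi2 = mcnemar_chi2(12, 3)  # hypothetical discordant counts
print(f"chi2 = {chi2:.2f}, significant at 0.05: {chi2 > 3.84}")  # 3.84 = 1-df critical value
```

    With few discordant pairs an exact binomial version of the test is usually preferred over the chi-square approximation.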

  4. Non-small cell carcinoma: Comparison of postoperative intra- and extrathoracic recurrence assessment capability of qualitatively and/or quantitatively assessed FDG-PET/CT and standard radiological examinations

    International Nuclear Information System (INIS)

    Onishi, Yumiko; Ohno, Yoshiharu; Koyama, Hisanobu; Nogami, Munenobu; Takenaka, Daisuke; Matsumoto, Keiko; Yoshikawa, Takeshi; Matsumoto, Sumiaki; Maniwa, Yoshimasa; Nishimura, Yoshihiro; Sugimura, Kazuro

    2011-01-01

    Purpose: The purpose of this study was to compare the capability of integrated FDG-PET/CT for assessment of postoperative intra- and extrathoracic recurrence in non-small cell lung cancer (NSCLC) patients with that of standard radiological examinations. Materials and methods: A total of 121 consecutive pathologically diagnosed NSCLC patients (80 males, 41 females; mean age, 71 years) underwent pathologically and surgically confirmed complete resection, followed by prospective integrated FDG-PET/CT and standard radiological examinations. Final diagnosis of recurrence was based on the results of more than 12 months of follow-up and/or pathological examinations. The probability of recurrence was assessed with either method for each patient by using 5-point visual scoring system, and final diagnosis was made by consensus between two readers. ROC analysis was used to compare the capability of the two methods for assessment of postoperative recurrence on a per-patient basis. The ROC-based positive test was used to determine optimal cut-off value for FDG uptake measurement at a site suspected on the basis of qualitatively assessed PET/CT. Finally, sensitivities, specificities and accuracies of all methods were compared by means of McNemar's test. Results: Areas under the curve of qualitatively assessed PET/CT and standard radiological examinations showed no significant differences (p > 0.05). At an optimal cut-off value of 2.5, specificity and accuracy of quantitatively and qualitatively assessed PET/CT were significantly higher than those of qualitatively assessed PET/CT and standard radiological examinations (p < 0.05). Conclusion: Accuracy of assessment of postoperative intra- and extrathoracic recurrence in NSCLC patients by qualitative and/or quantitative FDG-PET/CT is equivalent to or higher than that by standard radiological examinations.

  5. Guidelines, Criteria, and Rules of Thumb for Evaluating Normed and Standardized Assessment Instruments in Psychology.

    Science.gov (United States)

    Cicchetti, Domenic V.

    1994-01-01

    In the context of developing assessment instruments in psychology, issues of standardization, norming procedures, and test reliability and validity are discussed. Criteria, guidelines, and rules of thumb are provided to help the clinician with instrument selection for a given psychological assessment. (SLD)

  6. Teachers' use of a self-assessment procedure : the role of criteria, standards, feedback and reflection

    NARCIS (Netherlands)

    Diggelen, van M.R.; Brok, den P.J.; Beijaard, D.

    2013-01-01

    This article reports on the way teachers assess their own coaching competencies regarding the development of vocational education students’ reflection skills. The participating teachers used a self-assessment procedure in which they had to judge themselves with the help of criteria and standards,

  7. Balance Assessment Practices and Use of Standardized Balance Measures Among Ontario Physical Therapists

    Science.gov (United States)

    Sibley, Kathryn M.; Straus, Sharon E.; Inness, Elizabeth L.; Salbach, Nancy M.

    2011-01-01

    Background Balance impairment is a significant problem for older adults, as it can influence daily functioning. Treating balance impairment in this population is a major focus of physical therapist practice. Objective The purpose of this study was to document current practices in clinical balance assessment and compare components of balance assessed and measures used across practice areas among physical therapists. Design This was a cross-sectional study. Methods A survey questionnaire was mailed to 1,000 practicing physical therapists in Ontario, Canada. Results Three hundred sixty-nine individuals completed the survey questionnaire. More than 80% of respondents reported that they regularly (more than 60% of the time) assessed postural alignment, static and dynamic stability, functional balance, and underlying motor systems. Underlying sensory systems, cognitive contributions to balance, and reactive control were regularly assessed by 59.6%, 55.0%, and 41.2% of the respondents, respectively. The standardized measures regularly used by the most respondents were the single-leg stance test (79.1%), the Berg Balance Scale (45.0%), and the Timed “Up & Go” Test (27.6%). There was considerable variation in the components of balance assessed and measures used by respondents treating individuals in the orthopedic, neurologic, geriatric, and general rehabilitation populations. Limitations The survey provides quantitative data about what is done to assess balance, but does not explain the factors influencing current practice. Conclusions Many important components of balance and standardized measures are regularly used by physical therapists to assess balance. Further research, however, is needed to understand the factors contributing to the relatively lower rates of assessing reactive control, the component of balance most directly responsible for avoiding a fall. PMID:21868613

  8. Comparison of air-standard rectangular cycles with different specific heat models

    International Nuclear Information System (INIS)

    Wang, Chao; Chen, Lingen; Ge, Yanlin; Sun, Fengrui

    2016-01-01

    Highlights: • Air-standard rectangular cycle models are built and investigated. • Finite-time thermodynamics is applied. • Different dissipation models and variable specific heats models are adopted. • Performance characteristics of different cycle models are compared. - Abstract: In this paper, a performance comparison of air-standard rectangular cycles with constant specific heat (SH), linear variable SH and non-linear variable SH is conducted by using finite-time thermodynamics. The power output and efficiency of each cycle model and the characteristic curves of power output versus compression ratio, efficiency versus compression ratio, as well as power output versus efficiency are obtained by taking heat transfer loss (HTL) and friction loss (FL) into account. The influences of HTL, FL and SH on cycle performance are analyzed by detailed numerical examples.

  9. The EUR assessment process, methodology and highlights of the compliance analysis for the EU-APWR standard design - 15235

    International Nuclear Information System (INIS)

    Facciolo, L.; Welander, D.; Nuutinen, P.

    2015-01-01

    In August 2007 the European Utility Requirements organisation (EUR) received an initial application from Mitsubishi Heavy Industries asking to submit the EU-APWR standard design to the EUR assessment. The EU-APWR is an advanced PWR, 1700 MWe class, 4-loops, 14 ft active core fuel length. The EU-APWR Standard Design documentation has been assessed against the EUR Volume 2 - Generic Nuclear Island requirements - Revision D. The assessment is divided into 20 chapters for a total of over 4000 individual requirements. A Synthesis Report for each chapter was written by the assessment performers. The Synthesis Reports showed that the EU-APWR Standard Design was in compliance with 77% of the EUR requirements. The percentage increases to 85% when taking into account the requirements where the design has been considered in compliance with the objectives. The requirements resulting in a non-compliance assessment correspond to less than 2%. This confirms the overall good level of compliance. From the Utilities' point of view it is possible to state that the differences in standards, codes and regulations applied in Japan and in Europe contribute to a series of discrepancies between the EU-APWR Standard Design and the EUR, regarding, for instance, outage durations, operational capability, layout, personal protection or radiation monitoring. Some disagreements are easy to overcome, others require particular attention

  10. Particle physics and cosmology beyond the Standard Model: inflation, dark matter and flavour

    International Nuclear Information System (INIS)

    Heurtier, L.

    2015-01-01

    This thesis has focused on beyond-the-Standard-Model aspects of particle physics and their implications in cosmology. We have gone through this work along the timeline of the history of the Universe, focusing on three major topics: the inflationary period, the production and detection of the dark matter relic density, and finally the question of flavour-changing constraints on low-energy supersymmetric theories. In the first part of this thesis, after reviewing the theoretical and phenomenological aspects of both the Big Bang theory and the theory of inflation, we study in detail how to describe inflation in a high-energy supersymmetric theory. The second part of this thesis is dedicated to dark matter. We have studied phenomenological aspects of simple models, extending the present Standard Model with simple abelian symmetries, by assuming that the constituent of dark matter is able to exchange information with the visible sector with the help of a mediator particle. We have studied in particular possible interactions of heavy or light dark matter with, respectively, the strong and the electroweak sectors of the Standard Model. Our models are, of course, strongly constrained by experiments. The third part of this work is dedicated to a different aspect of beyond-Standard-Model theories, namely the treatment of the flavour-changing processes of particle physics. The Minimal Supersymmetric Standard Model (MSSM), as one of the possible enlargements of the Standard Model, introduces new flavour-changing processes that are highly constrained by experiment. We present some works in which we consider the possibility of adding so-called Dirac gauginos to the MSSM to weaken flavour changing in the theory, and propose different flavour-pattern theories.

  11. Efficient Lattice-Based Signcryption in Standard Model

    Directory of Open Access Journals (Sweden)

    Jianhua Yan

    2013-01-01

    Full Text Available Signcryption is a cryptographic primitive that can perform digital signature and public-key encryption simultaneously at a significantly reduced cost. This advantage makes it highly useful in many applications. However, most existing signcryption schemes are seriously challenged by the advance of quantum computation. As an interesting stepping stone in the post-quantum cryptographic community, two lattice-based signcryption schemes were proposed recently, but both of them were only proved secure in the random oracle model. Therefore, the main contribution of this paper is to propose a new lattice-based signcryption scheme that can be proved secure in the standard model.

  12. A see-saw scenario of an $A_4$ flavour symmetric standard model

    CERN Document Server

    Dinh, Dinh Nguyen; Văn, Phi Quang; Vân, Nguyen Thi Hông

    2016-01-01

    A see-saw scenario for an $A_4$ flavour symmetric standard model is presented. As before, the see-saw mechanism can be realized in several models of different types depending on different ways of neutrino mass generation corresponding to the introduction of new fields with different symmetry structures. In the present paper, a general description of all these see-saw types is made, with a more detailed investigation of type-I models. As within the original see-saw mechanism, the symmetry structure of the standard model fields decides the number and the symmetry structure of the new fields. In a model considered here, the scalar sector consists of three standard-model-Higgs-like iso-doublets ($SU_L(2)$-doublets) forming an $A_4$ triplet. The latter is a superposition of three mass eigenstates, one of which could be identified with the recently discovered Higgs boson. A possible relation to the still-deliberated 750 GeV diphoton resonance at the 13 TeV LHC collisions is also discussed. In the lepton sector, the ...

  13. The Impact of Early Exposure of Eighth Grade Math Standards on End of Grade Assessments

    Science.gov (United States)

    Robertson, Tonjai E.

    2016-01-01

    The purpose of this study was to examine the Cumberland County Schools district-wide issue surrounding the disproportional performance of eighth grade Math I students' proficiency scores on standardized end-of-grade and end-of-course assessments. The study focused on the impact of the school district incorporating eighth grade math standards in…

  14. Neutron electric dipole moment and extension of the standard model

    International Nuclear Information System (INIS)

    Oshimo, Noriyuki

    2001-01-01

    A nonvanishing value for the electric dipole moment (EDM) of the neutron is a prominent signature for CP violation. The EDM induced by the Kobayashi-Maskawa mechanism of the standard model (SM) has a small magnitude and its detection will be very difficult. However, since baryon asymmetry of the universe cannot be accounted for by the SM, there should exist some other source of CP violation, which may generate a large magnitude for the EDM. One of the most hopeful candidates for physics beyond the SM is the supersymmetric standard model, which contains such sources of CP violation. This model suggests that the EDM has a magnitude not much smaller than the present experimental bounds. Progress in measuring the EDM provides very interesting information about extension of the SM. (author)

  15. Video Modeling of SBIRT for Alcohol Use Disorders Increases Student Empathy in Standardized Patient Encounters.

    Science.gov (United States)

    Crisafio, Anthony; Anderson, Victoria; Frank, Julia

    2018-04-01

    The purpose of this study was to assess the usefulness of adding video models of brief alcohol assessment and counseling to a standardized patient (SP) curriculum that covers and tests acquisition of this skill. The authors conducted a single-center, retrospective cohort study of third- and fourth-year medical students between 2013 and 2015. All students completed a standardized patient (SP) encounter illustrating the diagnosis of alcohol use disorder, followed by an SP exam on the same topic. Beginning in August 2014, the authors supplemented the existing formative SP exercise on problem drinking with one of two 5-min videos demonstrating screening, brief intervention, and referral for treatment (SBIRT). P values and Z tests were performed to evaluate differences between students who did and did not see the video in knowledge and skills related to alcohol use disorders. One hundred ninety-four students were included in this analysis. Compared to controls, subjects did not differ in their ability to uncover and accurately characterize an alcohol problem during a standardized encounter (mean exam score 41.29 vs 40.93, subject vs control, p = 0.539). However, the SPs' rating of students' expressions of empathy was significantly higher for the group who saw the video (81.63% vs 69.79%). The authors had anticipated that the videos would improve students' recognition and knowledge of alcohol-related conditions. Instead, feedback from the SPs produced the serendipitous finding that the communication skills demonstrated in the videos had a sustained effect in enhancing students' professional behavior.

  16. The implementation of internal assessment mechanisms in the management of the educational program: ESG principles and new educational standards in the Russian Federation

    Directory of Open Access Journals (Sweden)

    Nikanorov Ivan

    2018-01-01

    Full Text Available The purpose of this article is to describe possible approaches to transforming the management of the educational program in a higher education institution in the context of implementing the principles laid down in the Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG), as well as the updated Russian state higher education standards (RSHES 3++). The article analyzes the ESG, examines the main models of managing educational programs, as well as their possible transformations in terms of the formation of mechanisms for internal assessment of the quality of educational activities and the training of students, introduced in the updated federal state educational standards.

  17. Statistical aspects of autoregressive-moving average models in the assessment of radon mitigation

    International Nuclear Information System (INIS)

    Dunn, J.E.; Henschel, D.B.

    1989-01-01

    Radon values, as reflected by hourly scintillation counts, seem dominated by major, pseudo-periodic, random fluctuations. This methodological paper reports a moderate degree of success in modeling these data using relatively simple autoregressive-moving average models to assess the effectiveness of radon mitigation techniques in existing housing. While accounting for the natural correlation of successive observations, familiar summary statistics such as steady state estimates, standard errors, confidence limits, and tests of hypothesis are produced. The Box-Jenkins approach is used throughout. In particular, intervention analysis provides an objective means of assessing the effectiveness of an active mitigation measure, such as a fan off/on cycle. Occasionally, failure to declare a significant intervention has suggested a means of remedial action in the data collection procedure
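    The steady-state estimates and autocorrelation-adjusted standard errors this abstract refers to can be sketched for the simplest AR(1) case; a minimal illustration on simulated hourly counts (parameter values are made up, not from the study):

```python
import math
import random

random.seed(1)

# Simulate hourly radon-like counts as an AR(1) process around a steady-state mean.
phi_true, mu, sigma = 0.8, 50.0, 5.0
n = 2000
x = [mu]
for _ in range(n - 1):
    x.append(mu + phi_true * (x[-1] - mu) + random.gauss(0, sigma))

xbar = sum(x) / n
# Lag-1 autocorrelation as a simple estimate of the AR(1) coefficient
num = sum((x[t] - xbar) * (x[t - 1] - xbar) for t in range(1, n))
den = sum((v - xbar) ** 2 for v in x)
phi_hat = num / den

# Standard error of the steady-state mean, inflated for autocorrelation:
# Var(xbar) ≈ (s² / n) · (1 + φ) / (1 − φ)
s2 = den / (n - 1)
se = math.sqrt(s2 / n * (1 + phi_hat) / (1 - phi_hat))
lo, hi = xbar - 1.96 * se, xbar + 1.96 * se
print(f"phi ≈ {phi_hat:.2f}, steady state ≈ {xbar:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```

    A full Box-Jenkins analysis (model identification, intervention terms for a fan off/on cycle, diagnostics) goes well beyond this sketch, but the inflation factor above shows why ignoring the serial correlation of hourly counts would understate the uncertainty of steady-state estimates.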

  18. A standardized patient model to teach and assess professionalism and communication skills: the effect of personality type on performance.

    Science.gov (United States)

    Lifchez, Scott D; Redett, Richard J

    2014-01-01

    Teaching and assessing professionalism and interpersonal communication skills can be more difficult for surgical residency programs than teaching medical knowledge or patient care, for which many structured educational curricula and assessment tools exist. Residents often learn these skills indirectly, by observing the behavior of their attendings when communicating with patients and colleagues. The purpose of this study was to assess the results of an educational curriculum we created to teach and assess our residents in professionalism and communication. We assessed resident and faculty prior education in delivering bad news to patients. Residents then participated in a standardized patient (SP) encounter to deliver bad news to a patient's family regarding a severe burn injury. Residents received feedback from the encounter and participated in an education curriculum on communication skills and professionalism. As a part of this curriculum, residents underwent assessment of communication style using the Myers-Briggs type inventory. The residents then participated in a second SP encounter discussing a severe pulmonary embolus with a patient's family. Resident performance on the SP evaluation correlated with an increased comfort in delivering bad news. Comfort in delivering bad news did not correlate with the amount of prior education on the topic for either residents or attendings. Most of our residents demonstrated an intuitive thinking style (NT) on the Myers-Briggs type inventory, very different from population norms. The lack of correlation between comfort in delivering bad news and prior education on the subject may indicate the difficulty in imparting communication and professionalism skills to residents effectively. Understanding communication style differences between our residents and the general population can help us teach professionalism and communication skills more effectively. With the next accreditation system, residency programs would need to

  19. Challenges to the standard model of Big Bang nucleosynthesis

    International Nuclear Information System (INIS)

    Steigman, G.

    1993-01-01

Big Bang nucleosynthesis provides a unique probe of the early evolution of the Universe and a crucial test of the consistency of the standard hot Big Bang cosmological model. Although the primordial abundances of ²H, ³He, ⁴He, and ⁷Li inferred from current observational data are in agreement with those predicted by Big Bang nucleosynthesis, recent analysis has severely restricted the consistent range for the nucleon-to-photon ratio: 3.7 ≤ η₁₀ ≤ 4.0. Increased accuracy in the estimate of primordial ⁴He and observations of Be and B in Pop II stars are offering new challenges to the standard model and suggest that no new light particles may be allowed (N_ν^BBN ≤ 3.0, where N_ν is the number of equivalent light neutrinos). 23 refs
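For reference, the scaled nucleon-to-photon ratio η₁₀ quoted in this record follows the conventional definition (standard in the BBN literature, not restated in the abstract itself), and relates approximately to the baryon density parameter:

```latex
\eta \equiv \frac{n_N}{n_\gamma}, \qquad
\eta_{10} \equiv 10^{10}\,\eta, \qquad
\eta_{10} \approx 274\,\Omega_B h^2
```

so the quoted range 3.7 ≤ η₁₀ ≤ 4.0 corresponds to Ω_B h² of roughly 0.013–0.015.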

  20. In vivo validation of cardiac output assessment in non-standard 3D echocardiographic images

    Energy Technology Data Exchange (ETDEWEB)

    Nillesen, M M; Lopata, R G P; Gerrits, I H; Thijssen, J M; De Korte, C L [Clinical Physics Laboratory-833, Department of Pediatrics, Radboud University Nijmegen Medical Centre, Nijmegen (Netherlands); De Boode, W P [Neonatology, Department of Pediatrics, Radboud University Nijmegen Medical Centre, Nijmegen (Netherlands); Huisman, H J [Department of Radiology, Radboud University Nijmegen Medical Centre, Nijmegen (Netherlands); Kapusta, L [Pediatric Cardiology, Department of Pediatrics, Radboud University Nijmegen Medical Centre, Nijmegen (Netherlands)], E-mail: m.m.nillesen@cukz.umcn.nl

    2009-04-07

    Automatic segmentation of the endocardial surface in three-dimensional (3D) echocardiographic images is an important tool to assess left ventricular (LV) geometry and cardiac output (CO). The presence of speckle noise as well as the nonisotropic characteristics of the myocardium impose strong demands on the segmentation algorithm. In the analysis of normal heart geometries of standardized (apical) views, it is advantageous to incorporate a priori knowledge about the shape and appearance of the heart. In contrast, when analyzing abnormal heart geometries, for example in children with congenital malformations, this a priori knowledge about the shape and anatomy of the LV might induce erroneous segmentation results. This study describes a fully automated segmentation method for the analysis of non-standard echocardiographic images, without making strong assumptions on the shape and appearance of the heart. The method was validated in vivo in a piglet model. Real-time 3D echocardiographic image sequences of five piglets were acquired in radiofrequency (rf) format. These ECG-gated full volume images were acquired intra-operatively in a non-standard view. Cardiac blood flow was measured simultaneously by an ultrasound transit time flow probe positioned around the common pulmonary artery. Three-dimensional adaptive filtering using the characteristics of speckle was performed on the demodulated rf data to reduce the influence of speckle noise and to optimize the distinction between blood and myocardium. A gradient-based 3D deformable simplex mesh was then used to segment the endocardial surface. A gradient and a speed force were included as external forces of the model. To balance data fitting and mesh regularity, one fixed set of weighting parameters of internal, gradient and speed forces was used for all data sets. End-diastolic and end-systolic volumes were computed from the segmented endocardial surface. 
The cardiac output derived from this automatic segmentation was
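The final step described above, deriving cardiac output from the segmented end-diastolic and end-systolic volumes, can be sketched as follows. This is an illustrative computation only; the function and variable names are not from the paper, and the paper's actual comparison is against the transit-time flow-probe measurement:

```python
def cardiac_output(edv_ml: float, esv_ml: float, heart_rate_bpm: float) -> float:
    """Cardiac output (L/min) from end-diastolic and end-systolic volumes (mL).

    Stroke volume is the difference between the end-diastolic and
    end-systolic volumes computed from the segmented endocardial surface.
    """
    stroke_volume_ml = edv_ml - esv_ml              # volume ejected per beat
    return stroke_volume_ml * heart_rate_bpm / 1000.0  # mL/min -> L/min

# Example: EDV 120 mL, ESV 50 mL, HR 80 bpm -> SV 70 mL, CO 5.6 L/min
co = cardiac_output(120.0, 50.0, 80.0)
```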