WorldWideScience

Sample records for included standardized measures

  1. 34 CFR 403.202 - What must each State's system of core standards and measures of performance include?

    Science.gov (United States)

    2010-07-01

    ... academic skills; (2) One or more measures of the following: (i) Student competency attainment. (ii) Job or... secondary school or its equivalent. (iv) Placement into additional training or education, military service...) Procedures for using existing resources and methods developed in other programs receiving Federal assistance...

  2. Standards for holdup measurement

    International Nuclear Information System (INIS)

    Zucker, M.S.

    1982-01-01

    Holdup measurement, needed for material balance, depends heavily on standards and on the interpretation of the calibration procedure. More than in other measurements, the calibration procedure using the standard becomes part of the standard itself. Standards practical for field use and calibration techniques have been developed. While accuracy in holdup measurements is comparatively poor, avoidance of bias is a necessary goal.

  3. Standardization of depression measurement

    DEFF Research Database (Denmark)

    Wahl, Inka; Löwe, Bernd; Bjørner, Jakob

    2014-01-01

    OBJECTIVES: To provide a standardized metric for the assessment of depression severity to enable comparability among results of established depression measures. STUDY DESIGN AND SETTING: A common metric for 11 depression questionnaires was developed applying item response theory (IRT) methods. Data of 33,844 adults were used for secondary analysis, including routine assessments of 23,817 in- and outpatients with mental and/or medical conditions (46% with depressive disorders) and a general population sample of 10,027 randomly selected participants from three representative German household surveys. RESULTS: A standardized metric for depression severity was defined by 143 items, and scores were normed to a general population mean of 50 (standard deviation = 10) for easy interpretability. It covers the entire range of depression severity assessed by established instruments. The metric allows...
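The norming described in this record (general-population mean 50, standard deviation 10) is a simple linear rescaling of the IRT severity estimate. A minimal sketch, assuming the latent scores have already been estimated and that `ref_mean` and `ref_sd` (hypothetical names) are the general-population mean and SD of those scores:

```python
def to_t_metric(theta, ref_mean, ref_sd):
    """Rescale a latent depression-severity estimate so that the
    general population has mean 50 and standard deviation 10."""
    return 50.0 + 10.0 * (theta - ref_mean) / ref_sd

# a score at the population mean maps to 50; one SD above maps to 60
print(to_t_metric(0.0, ref_mean=0.0, ref_sd=1.0))  # → 50.0
print(to_t_metric(1.0, ref_mean=0.0, ref_sd=1.0))  # → 60.0
```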

  4. Standard Weights and Measures

    Indian Academy of Sciences (India)

    The mass standard, represented by the prototype kilogram, is the only remaining artifact, but there are promising proposals to replace it in the near future. Ever since humans started living in community settlements, day-to-day activities have required the adoption of a set of standards for weights and measures. For ex...

  5. Standard Weights and Measures

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 6, Issue 8, August 2001, pp. 44-59. Standard Weights and Measures, Vasant Natarajan, General Article. Permanent link: http://www.ias.ac.in/article/fulltext/reso/006/08/0044-0059

  6. Use of a non-linear method for including the mass uncertainty of gravimetric standards and system measurement errors in the fitting of calibration curves for XRFA freeze-dried UNO3 standards

    International Nuclear Information System (INIS)

    Pickles, W.L.; McClure, J.W.; Howell, R.H.

    1978-05-01

    A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to uranium nitrate, freeze-dried, 0.2%-accurate gravimetric standards. The program is based on the unconstrained minimization subroutine VA02A. The program treats the mass values of the gravimetric standards as parameters to be fit along with the normal calibration curve parameters. The fitting procedure weights the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the "Chi-Squared Matrix" or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg freeze-dried UNO3 can have an accuracy of 0.2% in 1000 s.
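The core idea of this record, treating the masses of the standards as fit parameters so that mass and system errors are weighted consistently, can be sketched as a small errors-in-variables fit. This is an illustrative sketch only: it assumes a linear response model y = a + b·M (the original work fit a nonlinear model via subroutine VA02A), a uniform system error sy, and a uniform mass error sm.

```python
def fit_calibration(masses, responses, sm, sy, iterations=50):
    """Errors-in-variables calibration fit: alternate between a
    least-squares update of the curve parameters (a, b) given the
    current 'true' masses M, and the closed-form update of each M_i
    that minimizes (y_i - a - b*M_i)^2/sy^2 + (M_i - m_i)^2/sm^2."""
    M = list(masses)                      # start at the measured masses
    a = b = 0.0
    for _ in range(iterations):
        # (1) least squares of responses on the current M
        n = len(M)
        mx, my = sum(M) / n, sum(responses) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(M, responses))
             / sum((x - mx) ** 2 for x in M))
        a = my - b * mx
        # (2) closed-form optimum of each true mass M_i
        M = [(m / sm**2 + b * (y - a) / sy**2) / (1 / sm**2 + b**2 / sy**2)
             for m, y in zip(masses, responses)]
    chi2 = sum((y - a - b * x) ** 2 / sy**2 + (x - m) ** 2 / sm**2
               for m, x, y in zip(masses, M, responses))
    return a, b, M, chi2

# noise-free check: exact linear data should be recovered with chi2 ~ 0
masses = [0.2, 0.4, 0.6, 0.8, 1.0]           # mg, 0.2% mass error
responses = [0.1 + 2.0 * m for m in masses]  # hypothetical response
a, b, M, chi2 = fit_calibration(masses, responses, sm=0.002, sy=0.01)
```

With noisy data, the fitted masses M drift away from the measured values by amounts consistent with sm, which is exactly the behavior the consistent weighting is meant to capture.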

  7. Coordinate Standard Measurement Development

    Energy Technology Data Exchange (ETDEWEB)

    Hanshaw, R.A.

    2000-02-18

    A Shelton Precision Interferometer Base, which is used for calibration of coordinate standards, was improved through hardware replacement, software geometry error correction, and reduction of vibration effects. Substantial increases in resolution and reliability, as well as a reduction in sampling time, were achieved through hardware replacement; vibration effects were reduced substantially through modification of the machine component damping and software routines; and the majority of the machine's geometry error was corrected through software geometry error correction. Because of these modifications, the uncertainty of coordinate standards calibrated on this device has been reduced dramatically.

  8. Standard Weights and Measures

    Indian Academy of Sciences (India)

    ... and measures from olden days to their modern ... 1 second per day. John Harrison, a carpenter and self-taught clock-maker, refined temperature compensation techniques and added new methods of reducing friction ... the slave pendulum gave the master pendulum the gentle pushes needed to maintain its motion, and also drove ...

  9. Measurements, Standards, and the SI.

    Science.gov (United States)

    Journal of Chemical Education, 1983

    1983-01-01

    Highlights six papers presented at the Seventh Biennial Conference on Chemical Education (Stillwater, Oklahoma 1982). Topics addressed included history, status, and future of SI units, algebra of SI units, periodic table, new standard-state pressure unit, and suggested new names for mole concept ("numerity" and "chemical amount"). (JN)

  10. 2016 Updated American Society of Clinical Oncology/Oncology Nursing Society Chemotherapy Administration Safety Standards, Including Standards for Pediatric Oncology.

    Science.gov (United States)

    Neuss, Michael N; Gilmore, Terry R; Belderson, Kristin M; Billett, Amy L; Conti-Kalchik, Tara; Harvey, Brittany E; Hendricks, Carolyn; LeFebvre, Kristine B; Mangu, Pamela B; McNiff, Kristen; Olsen, MiKaela; Schulmeister, Lisa; Von Gehr, Ann; Polovich, Martha

    2016-12-01

    Purpose To update the ASCO/Oncology Nursing Society (ONS) Chemotherapy Administration Safety Standards and to highlight standards for pediatric oncology. Methods The ASCO/ONS Chemotherapy Administration Safety Standards were first published in 2009 and updated in 2011 to include inpatient settings. A subsequent 2013 revision expanded the standards to include the safe administration and management of oral chemotherapy. A joint ASCO/ONS workshop with stakeholder participation, including that of the Association of Pediatric Hematology Oncology Nurses and American Society of Pediatric Hematology/Oncology, was held on May 12, 2015, to review the 2013 standards. An extensive literature search was subsequently conducted, and public comments on the revised draft standards were solicited. Results The updated 2016 standards presented here include clarification and expansion of existing standards to include pediatric oncology and to introduce new standards: most notably, two-person verification of chemotherapy preparation processes, administration of vinca alkaloids via minibags in facilities in which intrathecal medications are administered, and labeling of medications dispensed from the health care setting to be taken by the patient at home. The standards were reordered and renumbered to align with the sequential processes of chemotherapy prescription, preparation, and administration. Several standards were separated into their respective components for clarity and to facilitate measurement of adherence to a standard. Conclusion As oncology practice has changed, so have chemotherapy administration safety standards. Advances in technology, cancer treatment, and education and training have prompted the need for periodic review and revision of the standards. Additional information is available at http://www.asco.org/chemo-standards .

  12. A tool for standardized collector performance calculations including PVT

    DEFF Research Database (Denmark)

    Perers, Bengt; Kovacs, Peter; Olsson, Marcus

    2012-01-01

    A tool for standardized calculation of solar collector performance has been developed in cooperation between SP Technical Research Institute of Sweden, DTU Denmark and SERC Dalarna University. The tool is designed to calculate the annual performance of solar collectors at representative locations ... can be tested and modeled as a thermal collector, when the PV electric part is active with an MPP tracker in operation. The thermal collector parameters from this operation mode are used for the PVT calculations.

  13. Standard-Setting Methods as Measurement Processes

    Science.gov (United States)

    Nichols, Paul; Twing, Jon; Mueller, Canda D.; O'Malley, Kimberly

    2010-01-01

    Some writers in the measurement literature have been skeptical of the meaningfulness of achievement standards and described the standard-setting process as blatantly arbitrary. We argue that standard setting is more appropriately conceived of as a measurement process similar to student assessment. The construct being measured is the panelists'…

  14. Standardized Testing of Phasor Measurement Units

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Kenneth E.; Faris, Anthony J.; Hauer, John F.

    2006-05-31

    This paper describes a set of tests used to determine Phasor Measurement Unit (PMU) measurement characteristics under steady-state and dynamic conditions. The methodology is repeatable, comparable among test facilities, and can be performed at any facility with commonly available relay and standard test equipment. The methodology is based on test signals that are mathematically generated from a signal model and played into the PMU with precise GPS synchronization. Timing flags included with the test signal correlate the test signals with the PMU output. This allows accurate comparison of the phasor model with the value estimated by the PMU for detailed performance analysis. The timing flags also facilitate programmed plot and report generation.
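The comparison at the heart of this methodology can be sketched in a few lines: a reference phasor is defined mathematically, a sampled test waveform is synthesized from it, and a one-cycle DFT recovers the phasor for comparison with the PMU's estimate. The 60 Hz nominal frequency, 1440 Hz sampling rate, and RMS phasor convention below are illustrative assumptions, not details from the paper.

```python
import cmath
import math

def phasor_estimate(samples, fs, f0):
    """One-cycle DFT phasor estimate at nominal frequency f0 (Hz)
    from evenly spaced samples taken at rate fs; returns an RMS
    phasor (magnitude = peak amplitude / sqrt(2))."""
    n = len(samples)
    acc = sum(x * cmath.exp(-2j * math.pi * f0 * k / fs)
              for k, x in enumerate(samples))
    return (math.sqrt(2) / n) * acc

# synthesize one cycle of a 60 Hz test signal from a known model:
# amplitude 1.0 (peak), phase +30 degrees, 24 samples per cycle
fs, f0, phase = 1440.0, 60.0, math.radians(30.0)
sig = [math.cos(2 * math.pi * f0 * k / fs + phase) for k in range(24)]
est = phasor_estimate(sig, fs, f0)
# the estimate should match the model: |est| = 1/sqrt(2), angle = 30 deg
```

Against a real PMU, the same model value would instead be compared with the device's reported phasor at the GPS-tagged instant.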

  15. Standard Model measurements with the ATLAS detector

    Directory of Open Access Journals (Sweden)

    Hassani Samira

    2015-01-01

    Various Standard Model measurements have been performed in proton-proton collisions at a centre-of-mass energy of √s = 7 and 8 TeV using the ATLAS detector at the Large Hadron Collider. A review of a selection of the latest results of electroweak measurements, W/Z production in association with jets, jet physics and soft QCD is given. Measurements are in general found to be well described by the Standard Model predictions.

  16. Russian standards for dimensional measurements for nanotechnologies

    Science.gov (United States)

    Gavrilenko, V. P.; Filippov, M. N.; Novikov, Yu. A.; Rakov, A. V.; Todua, P. A.

    2009-05-01

    In order to provide the uniformity of measurements at the nanoscale, seven national standards have been developed in the Russian Federation. Of these seven standards, three standards specify the procedures of fabrication and certification of linear measures with the linewidth lying in the nanometer range. The other four standards specify the procedures of verification and calibration of customer's atomic force microscopes and scanning electron microscopes, intended to perform measurements of linear dimensions of relief nanostructures. For an atomic force microscope, the following four parameters can be deduced: scale factor for the video signal, effective radius of the cantilever tip, scale factor for the vertical axis of the microscope, relative deflection of the microscope's Z-scanner from the orthogonality to the plane of a sample surface. For a scanning electron microscope, the following two parameters can be deduced: scale factor for the video signal and the effective diameter of the electron beam. The standards came into force in 2008.

  17. Including Alternative Resources in State Renewable Portfolio Standards: Current Design and Implementation Experience

    Energy Technology Data Exchange (ETDEWEB)

    Heeter, J.; Bird, L.

    2012-11-01

    Currently, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). An RPS sets a minimum threshold for how much renewable energy must be generated in a given year. Each state policy is unique, varying in percentage targets, timetables, and eligible resources. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation.

  18. Including alternative resources in state renewable portfolio standards: Current design and implementation experience

    International Nuclear Information System (INIS)

    Heeter, Jenny; Bird, Lori

    2013-01-01

    As of October 2012, 29 states, the District of Columbia, and Puerto Rico have instituted a renewable portfolio standard (RPS). Each state policy is unique, varying in percentage targets, timetables, and eligible resources. Increasingly, new RPS policies have included alternative resources. Alternative resources have included energy efficiency, thermal resources, and, to a lesser extent, non-renewables. This paper examines state experience with implementing renewable portfolio standards that include energy efficiency, thermal resources, and non-renewable energy and explores compliance experience, costs, and how states evaluate, measure, and verify energy efficiency and convert thermal energy. It aims to gain insights from the experience of states for possible federal clean energy policy as well as to share experience and lessons for state RPS implementation. - Highlights: • Increasingly, new RPS policies have included alternative resources. • Nearly all states provide a separate tier or cap on the quantity of eligible alternative resources. • Where allowed, non-renewables and energy efficiency are being heavily utilized.

  19. Benefits of including methane measurements in selection strategies.

    Science.gov (United States)

    Robinson, D L; Oddy, V H

    2016-09-01

    Estimates of genetic/phenotypic covariances and economic values for slaughter weight, growth, feed intake and efficiency, and three potential methane traits were compiled to explore the effect of incorporating methane measurements in breeding objectives for cattle and meat sheep. The cost of methane emissions was assumed to be zero (scenario A), A$476/t (based on A$14/t CO2 equivalent and methane's 100-yr global warming potential [GWP] of 34; scenario B), or A$2,580/t (A$30/t CO2 equivalent combined with methane's 20-yr GWP of 86; scenario C). Methane traits were methane yield (MY; methane production divided by feed intake based on measurements over 1 d in respiration chambers) or short-term measurements of methane production adjusted for live weight (MPadjWt) in grazing animals, e.g., 40-60 min measurements in portable accumulation chambers (PAC) on 1 or 3 occasions, or measurements for 1 wk using a GreenFeed Emissions Monitor (GEM) on 1 or 3 occasions. Feed costs included the cost of maintaining the breeding herd and growth from weaning to slaughter. Sheep were assumed to be grown and finished on pasture (A$50/t DM). Feed costs for cattle included 365 d on pasture for the breeding herd and averages of 200 d postweaning grow-out on pasture and 100 d feedlot finishing. The greatest benefit of including methane in the breeding objective for both sheep and cattle was as a proxy for feed intake. For cattle, 3 GEM measurements were estimated to increase profit from 1 round of selection in scenario A (no payment for methane) by A$6.24/animal (from A$20.69 to A$26.93) because of reduced feed costs relative to gains in slaughter weight and by A$7.16 and A$12.09/animal, respectively, for scenarios B and C, which have payments for reduced methane emissions. For sheep, the improvements were more modest. Returns from 1 round of selection (no methane measurements) were A$5.06 (scenario A), A$4.85 (scenario B), and A$3.89 (scenario C) compared to A$5.26 (scenario A), A$5
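The two per-tonne methane prices in this abstract are simply the assumed CO2-equivalent price scaled by methane's global warming potential, which is easy to verify:

```python
def methane_price(co2e_price_per_t, gwp):
    """A$ per tonne of methane = (A$ per tonne CO2e) x methane GWP."""
    return co2e_price_per_t * gwp

print(methane_price(14, 34))  # scenario B: A$14/t CO2e, 100-yr GWP of 34 → 476
print(methane_price(30, 86))  # scenario C: A$30/t CO2e, 20-yr GWP of 86 → 2580
```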

  20. Migration path for structured documentation systems including standardized medical device data.

    Science.gov (United States)

    Kock, Ann-Kristin; Ingenerf, Josef; Halkaliev, Stoyan; Handels, Heinz

    2012-01-01

    A standardized end-to-end solution has been implemented with the aim of supporting the semantic integration of clinical content in institution-spanning applications. The approach outlined is a proof-of-concept design. It has shown that the chosen standards are suitable to integrate device data into forms, to document the results consistently, and finally to enable semantic interoperability. In detail, the implementation includes a standardized device interface and a standardized representation of data entry forms, and enables the communication of structured data via HL7 CDA. Because the proposed method applies a combination of standards, semantic interoperability and the possibility of contextual interpretation at each stage can be ensured.

  1. The Agency's Safety Standards and Measures

    International Nuclear Information System (INIS)

    1976-04-01

    The Agency's Health and Safety Measures were first approved by the Board of Governors on 31 March 1960 in implementation of Articles III.A.6 and XII of the Statute of the Agency. On the basis of the experience gained from applying those measures to projects carried out by Members under agreements concluded with the Agency, the Agency's Health and Safety Measures were revised in 1975 and approved by the Board of Governors on 25 February 1976. The Agency's Safety Standards and Measures as revised are reproduced in this document for the information of all Members.

  2. STANDARDS OF FUNCTIONAL MEASUREMENTS IN OCULAR TOXICOLOGY.

    Science.gov (United States)

    The visual system, like other sensory systems, may be a frequent target of exposure to toxic chemicals. A thorough evaluation of visual toxicity should include both structural and functional measures. Sensory evoked potentials are one set of neurophysiological procedures that...

  3. Preliminary Safety Information Document for the Standard MHTGR. Volume 1, (includes latest Amendments)

    Energy Technology Data Exchange (ETDEWEB)

    None

    1986-01-01

    With NRC concurrence, the Licensing Plan for the Standard HTGR describes an application program consistent with 10CFR50, Appendix O to support a US Nuclear Regulatory Commission (NRC) review and design certification of an advanced Standard modular High Temperature Gas-Cooled Reactor (MHTGR) design. Consistent with the NRC's Advanced Reactor Policy, the Plan also outlines a series of preapplication activities which have as an objective the early issuance of an NRC Licensability Statement on the Standard MHTGR conceptual design. This Preliminary Safety Information Document (PSID) has been prepared as one of the submittals to the NRC by the US Department of Energy in support of preapplication activities on the Standard MHTGR. Other submittals to be provided include a Probabilistic Risk Assessment, a Regulatory Technology Development Plan, and an Emergency Planning Bases Report.

  4. Standard test method for conducting potentiodynamic polarization resistance measurements

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This test method covers an experimental procedure for polarization resistance measurements which can be used for the calibration of equipment and verification of experimental technique. The test method can provide reproducible corrosion potentials and potentiodynamic polarization resistance measurements. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  5. 32 CFR 37.620 - What financial management standards do I include for nonprofit participants?

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false What financial management standards do I include... SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Award Terms Affecting Participants' Financial, Property, and Purchasing Systems Financial Matters § 37.620 What...

  6. Standardization in measurement philosophical, historical and sociological issues

    CERN Document Server

    Schlaudt, Oliver

    2015-01-01

    The application of standard measurement is a cornerstone of modern science. In this collection of essays, standardization of procedure, units of measurement and the epistemology of standardization are addressed by specialists from sociology, history and the philosophy of science.

  7. GNSS-Based Space Weather Systems Including COSMIC Ionospheric Measurements

    Science.gov (United States)

    Komjathy, Attila; Mandrake, Lukas; Wilson, Brian; Iijima, Byron; Pi, Xiaoqing; Hajj, George; Mannucci, Anthony J.

    2006-01-01

    The presentation outline includes University Corporation for Atmospheric Research (UCAR) and Jet Propulsion Laboratory (JPL) product comparisons, assimilating ground-based global positioning satellites (GPS) and COSMIC into JPL/University of Southern California (USC) Global Assimilative Ionospheric Model (GAIM), and JPL/USC GAIM validation. The discussion of comparisons examines Abel profiles and calibrated TEC. The JPL/USC GAIM validation uses Arecibo ISR, Jason-2 VTEC, and Abel profiles.

  8. Temperature measurements at the National Institute of Standards and Technology

    Science.gov (United States)

    Mangum, B. W.

    The high-precision and high-accuracy measurements involved in the calibrations of various types of thermometers at the National Institute of Standards and Technology (NIST) are described. The responsibilities of the NIST Thermometry Group include not only calibration of the standard instruments of the scales but also the calibration of base-metal and noble-metal thermocouples, industrial platinum resistance thermometers, liquid-in-glass thermometers, thermistor thermometers, and digital thermometers. General laboratory thermometer calibrations are described. Also described is a Measurement Assurance Program, which provides a direct assessment of a customer's technological competence in thermometry.

  9. Standardized measurement of quality of life after incisional hernia repair

    DEFF Research Database (Denmark)

    Jensen, Kristian K; Henriksen, Nadia A; Harling, Henrik

    2014-01-01

    BACKGROUND: Recent improvements in incisional hernia repair have led to lower rates of recurrence. As a consequence, increasing attention has been paid to patient-reported outcomes after surgery. However, there is no consensus on how to measure patients' quality of life after incisional hernia repair. The aim of this systematic review was to analyze existing standardized methods to measure quality of life after incisional hernia repair. DATA SOURCES: A PubMed and Embase search was carried out together with a cross-reference search of eligible papers, giving a total of 26 included studies. CONCLUSIONS: Different standardized methods for measurement of quality of life after incisional hernia repair are available, but no consensus on the optimal method, timing, or length of follow-up exists. International guidelines could help standardization, enabling better comparison between studies.

  10. Making and Measuring the California History Standards

    Science.gov (United States)

    Fogo, Bradley

    2011-01-01

    The California history and social science standards-based reform has been touted as the "gold standard" for state history curricula. But the standards, framework, and tests that constitute this reform provide inconsistent and contradictory criteria for teaching and assessing history and social science. An examination of the political…

  11. Evaluating a standardized measure of healthcare personnel influenza vaccination.

    Science.gov (United States)

    Lindley, Megan C; Lorick, Suchita A; Geevarughese, Anita; Lee, Soo-Jeong; Makvandi, Monear; Miller, Brady L; Nace, David A; Smith, Carmela; Ahmed, Faruque

    2013-09-01

    Methods of measuring influenza vaccination of healthcare personnel (HCP) vary substantially, as do the groups of HCP that are included in any given set of measurements. Thus, comparison of vaccination rates across healthcare facilities is difficult. The goal of the study was to determine the feasibility of implementing a standardized measure for reporting HCP influenza vaccination data in various types of healthcare facilities. A total of 318 facilities recruited in four U.S. jurisdictions agreed to participate in the evaluation, including hospitals, long-term care facilities, dialysis clinics, ambulatory surgery centers, and physician practices. HCP in participating facilities were categorized as employees, credentialed non-employees, or other non-employees using standard definitions. Data were gathered using cross-sectional web-based surveys completed at three intervals between October 2010 and May 2011; data were analyzed in February 2012. 234 facilities (74%) completed all three surveys. Most facilities could report on-site employee vaccination; almost one third could not provide complete data on HCP vaccinated outside the facility, contraindications, or declinations, primarily due to missing non-employee data. Inability to determine vaccination status of credentialed and other non-employees was cited as a major barrier to measure implementation by 24% and 27% of respondents, respectively. Using the measure to report employee vaccination status was feasible for most facilities; tracking non-employee HCP was more challenging. Based on evaluation findings, the measure was revised to limit the types of non-employees included. Although the revised measure is less comprehensive, it is more likely to produce valid vaccination coverage estimates. Use of this standardized measure can inform quality improvement efforts and facilitate comparison of HCP influenza vaccination among facilities. Published by Elsevier Inc.

  12. Weight Measurements and Standards for Soldiers, Phase 2

    Science.gov (United States)

    2016-10-01

    ... includes personalized eating, fitness, and APFT tools to help Soldiers stay fit and meet AR 600-9 and APFT standards, and 2) a promotion program designed ... Pennington Biomedical Research Center (PBRC) is delivering a program to the Louisiana Army National Guard (LANG) called Healthy Eating, Activity, and ... and the Army Physical Fitness Test (APFT), i.e., height, weight, fatness estimates, and measures of fitness, 2) assess the unique health risk ...

  13. International standards for phytosanitary measures (ISPM), publication No. 15

    CERN Multimedia

    Tom Wegelius

    2006-01-01

    GUIDELINES FOR REGULATING WOOD PACKAGING MATERIAL IN INTERNATIONAL TRADE. SCOPE: This standard describes phytosanitary measures to reduce the risk of introduction and/or spread of quarantine pests associated with wood packaging material (including dunnage), made of coniferous and non-coniferous raw wood, in use in international trade. For more information, contact the Shipping Service (FI-LS-SH) at 79947.

  14. WHO standards for biotherapeutics, including biosimilars: an example of the evaluation of complex biological products.

    Science.gov (United States)

    Knezevic, Ivana; Griffiths, Elwyn

    2017-11-01

    The most advanced regulatory processes for complex biological products have been put in place in many countries to provide appropriate regulatory oversight of biotherapeutic products in general, and similar biotherapeutics in particular. This process is still ongoing and requires regular updates to national regulatory requirements in line with scientific developments and up-to-date standards. For this purpose, strong knowledge of and expertise in evaluating biotherapeutics in general and similar biotherapeutic products, also called biosimilars, in particular is essential. Here, we discuss the World Health Organization's international standard-setting role in the regulatory evaluation of recombinant DNA-derived biotherapeutic products, including biosimilars, and provide examples that may serve as models for moving forward with nonbiological complex medicinal products. A number of scientific challenges and regulatory considerations imposed by the advent of biosimilars are described, together with the lessons learned, to stimulate future discussions on this topic. In addition, the experiences of facilitating the implementation of guiding principles for evaluation of similar biotherapeutic products into regulatory and manufacturers' practices in various countries over the past 10 years are briefly explained, with the aim of promoting further developments and regulatory convergence of complex biological and nonbiological products. © 2017 The Authors. Annals of the New York Academy of Sciences. The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.

  15. Standards for reference reactor physics measurements

    International Nuclear Information System (INIS)

    Harris, D.R.; Cokinos, D.M.; Uotinen, V.

    1990-01-01

    Reactor physics analysis methods require experimental testing and confirmation over the range of practical reactor configurations and states. This range is somewhat limited by practical fuel types, such as actinide oxides or carbides enclosed in metal cladding. On the other hand, the range continues to broaden because of the trend toward higher enrichment, if still only slightly enriched, in electric utility fuel. The need for experimental testing of reactor physics analysis methods arises in part from the continual broadening of the range of core designs and in part from the nature of the analysis methods. Reactor physics analyses are directed primarily at the determination of core reactivities and reaction rates, the former largely for reasons of reactor control and the latter largely to ensure that material limitations are not violated. Errors in these analyses can arise from numerics, from the data base, and from human factors. For numerical, data-base, and human-factor reasons, then, it is prudent and customary to qualify reactor physics analysis methods against experiments. These experiments can be treated as being at low power or at high power, and each type is subject to an American National Standards Institute standard. The purpose of these standards is to aid in improving and maintaining adequate quality in reactor physics methods, and it is from this point of view that the standards are examined here.

  16. A comparative study of performance measurement standards of railway operator

    Directory of Open Access Journals (Sweden)

    Pongjirawut Siripong

    2017-01-01

The European standard EN 13816 is one of the most widely accepted standards for measuring the quality of public passenger transport (PPT) services. EN 13816 specifies 8 measurement criteria, 29 sub-criteria, and 193 Key Performance Indicators (KPIs) to be used to measure the performance of railway operators. Additional criteria beyond EN 13816 have since been developed by various organisations. This research first explores the service performance measurement actually used by railway operators at the international level and in Thailand. After an intensive review of performance measurement standards, 9 standards are compiled and compared in terms of criteria, sub-criteria, and KPIs using a cluster analysis methodology. The comparison identified 2 sub-criteria and 91 KPIs beyond those in EN 13816. This research thus summarizes and compares different performance measurement standards for measuring the service quality of metro rail lines.
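The cluster-analysis comparison described above essentially measures overlap between the criteria and KPI sets of different standards. A minimal illustrative sketch, in which the KPI names and the use of Jaccard similarity as the overlap measure are assumptions for illustration, not details taken from the paper:

```python
def jaccard(a, b):
    """Jaccard similarity: shared items divided by all distinct items."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical KPI sets for two performance measurement standards
en13816_kpis = {"punctuality", "cleanliness", "security", "comfort"}
other_kpis = {"punctuality", "security", "energy_use"}

print(jaccard(en13816_kpis, other_kpis))  # 2 shared out of 5 distinct -> 0.4
```

A similarity matrix built this way over all 9 standards could then feed any standard clustering routine.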

  17. Relating Standardized Visual Perception Measures to Simulator Visual System Performance

    Science.gov (United States)

    Kaiser, Mary K.; Sweet, Barbara T.

    2013-01-01

    Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).

  18. Weight Measurements and Standards for Soldiers

    Science.gov (United States)

    2009-10-01

…dietetic and nutrition events. Among the topics discussed were fad diets, sport drinks, supplements, and nutrition myths. Soldiers were informed… providing a one-week view, adding calorie requirements, including a weight and diet history, and displaying the number of days until the next APFT… visual estimation of portion sizes.

  19. Development of a standard for indoor radon measurements in Australia

    Energy Technology Data Exchange (ETDEWEB)

O'Brien, R.S.; Solomon, S.B. [Australian Radiation Lab., Melbourne, VIC (Australia)

    1994-12-31

    A standard covering methodologies for the measurement of indoor radon and radon progeny concentrations in air in Australian buildings is currently under preparation as part of a set of standards covering total indoor air quality. This paper outlines the suggested methodology for radon and discusses some of the problems associated with the development of the standard. The draft standard recommends measurement of the radon concentration in air using scintillation cells, charcoal cups and solid state nuclear track detectors, and measurement of radon progeny concentration in air using the Rolle method or the Nazaroff method. 14 refs., 1 tab.

  20. Development of a standard for indoor radon measurements in Australia

    International Nuclear Information System (INIS)

    O'Brien, R.S.; Solomon, S.B.

    1994-01-01

A standard covering methodologies for the measurement of indoor radon and radon progeny concentrations in air in Australian buildings is currently under preparation as part of a set of standards covering total indoor air quality. This paper outlines the suggested methodology for radon and discusses some of the problems associated with the development of the standard. The draft standard recommends measurement of the radon concentration in air using scintillation cells, charcoal cups and solid state nuclear track detectors, and measurement of radon progeny concentration in air using the Rolle method or the Nazaroff method. 14 refs., 1 tab.

  1. 7 CFR 300.5 - International Standards for Phytosanitary Measures.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 5 2010-01-01 2010-01-01 false International Standards for Phytosanitary Measures. 300.5 Section 300.5 Agriculture Regulations of the Department of Agriculture (Continued) ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE INCORPORATION BY REFERENCE § 300.5 International Standards for Phytosanitary Measures. (a)...

  2. The use of personal values in living standards measures | Ungerer ...

    African Journals Online (AJOL)

    The Living Standards Measure (LSM), a South African marketing segmentation method, is a multivariate wealth measure based on standard of living. This article reports on whether a rationale can be found for the inclusion of psychological variables, particularly personal values, in this type of multivariate segmentation.

  3. [Investigation of the hygienic standard in two hospitals including the control of disinfection (author's transl)].

    Science.gov (United States)

    Pfanzelt, R; Schassan, H H

    1978-08-01

The hygienic standard was assessed in two surgical departments with different architectural preconditions. Under the favourable conditions of clinic B (Hosch filter, air-lock systems), the relative frequency of demonstrable bacteria was 55%; in clinic A, which lacked these provisions, it was 80%. Among the nonpathogenic bacteria, DNase-negative staphylococci were demonstrated more frequently than others; 13.4% and 18.9%, respectively, of the bacteria were DNase-positive staphylococci. Clostridium perfringens was used to detect the invasion paths of germs, the most important being leaky windows, the air conditioning, and insufficient air-lock systems. The success of disinfection was also examined: it ranged from 67% to 100%, with one control at only 42%. The results show that it is impossible to establish sterile rooms in common surgical departments, but also that a satisfactory hygienic standard cannot be achieved without air-lock systems and appropriate air conditioning.

  4. Broadening the Reach of Standardized Patients in Nurse Practitioner Education to Include the Distance Learner.

    Science.gov (United States)

    Ballman, Kathleen; Garritano, Nicole; Beery, Theresa

    2016-01-01

Using standardized patients (SPs) presenting with a specific complaint has been a mainstay of health care education. Increased use of technology has moved instruction from the on-campus classroom to distance learning for many nurse practitioner programs. Using interactive case studies provides distance learners with SP encounters. This technologically facilitated encounter gives the distance learner the opportunity for integrative thinking and for developing problem-solving and clinical reasoning skills.

  5. Modeling the wet bulb globe temperature using standard meteorological measurements.

    Science.gov (United States)

    Liljegren, James C; Carhart, Richard A; Lawday, Philip; Tschopp, Stephen; Sharp, Robert

    2008-10-01

    The U.S. Army has a need for continuous, accurate estimates of the wet bulb globe temperature to protect soldiers and civilian workers from heat-related injuries, including those involved in the storage and destruction of aging chemical munitions at depots across the United States. At these depots, workers must don protective clothing that increases their risk of heat-related injury. Because of the difficulty in making continuous, accurate measurements of wet bulb globe temperature outdoors, the authors have developed a model of the wet bulb globe temperature that relies only on standard meteorological data available at each storage depot for input. The model is composed of separate submodels of the natural wet bulb and globe temperatures that are based on fundamental principles of heat and mass transfer, has no site-dependent parameters, and achieves an accuracy of better than 1 degree C based on comparisons with wet bulb globe temperature measurements at all depots.
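The paper's submodels estimate the natural wet bulb and globe temperatures from standard meteorological data; the WBGT itself then combines them with the dry bulb temperature. A minimal sketch of that final step, assuming the standard outdoor weighting (0.7/0.2/0.1, as in ISO 7243); the submodels themselves are not reproduced here:

```python
def wbgt_outdoor(t_natural_wet_bulb, t_globe, t_dry_bulb):
    """Outdoor wet bulb globe temperature (deg C), standard 0.7/0.2/0.1 weighting."""
    return 0.7 * t_natural_wet_bulb + 0.2 * t_globe + 0.1 * t_dry_bulb

# Illustrative inputs: Tnw = 25.0, Tg = 40.0, Ta = 30.0 deg C
print(wbgt_outdoor(25.0, 40.0, 30.0))  # -> 28.5
```

The hard part, which the paper addresses, is estimating the natural wet bulb and globe temperatures accurately from routine meteorological observations.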

  6. Evaluation of Dogs with Border Collie Collapse, Including Response to Two Standardized Strenuous Exercise Protocols.

    Science.gov (United States)

    Taylor, Susan; Shmon, Cindy; Su, Lillian; Epp, Tasha; Minor, Katie; Mickelson, James; Patterson, Edward; Shelton, G Diane

    2016-01-01

    Clinical and metabolic variables were evaluated in 13 dogs with border collie collapse (BCC) before, during, and following completion of standardized strenuous exercise protocols. Six dogs participated in a ball-retrieving protocol, and seven dogs participated in a sheep-herding protocol. Findings were compared with 16 normal border collies participating in the same exercise protocols (11 retrieving, five herding). Twelve dogs with BCC developed abnormal mentation and/or an abnormal gait during evaluation. All dogs had post-exercise elevations in rectal temperature, pulse rate, arterial blood pH, PaO2, and lactate, and decreased PaCO2 and bicarbonate, as expected with strenuous exercise, but there were no significant differences between BCC dogs and normal dogs. Electrocardiography demonstrated sinus tachycardia in all dogs following exercise. Needle electromyography was normal, and evaluation of muscle biopsy cryosections using a standard panel of histochemical stains and reactions did not reveal a reason for collapse in 10 dogs with BCC in which these tests were performed. Genetic testing excluded the dynamin-1 related exercise-induced collapse mutation and the V547A malignant hyperthermia mutation as the cause of BCC. Common reasons for exercise intolerance were eliminated. Although a genetic basis is suspected, the cause of collapse in BCC was not determined.

  7. The Australian Commonwealth standard of measurement for absorbed radiation dose

    International Nuclear Information System (INIS)

    Sherlock, S.L.

    1990-06-01

This report documents the absorbed dose standard for photon beams in the range from 1 to 25 MeV. Measurements of absorbed dose in graphite irradiated by a beam of cobalt-60 gamma rays from an Atomic Energy of Canada Limited (AECL) Eldorado 6 teletherapy unit are reported. The measurements were performed using a graphite calorimeter, which is the primary standard for absorbed dose, and are used to calibrate a working standard ion chamber in terms of absorbed dose in graphite. Details of the methods, results and correction factors applied are given in the appendices. 13 refs., 6 tabs., 6 figs.

  8. International Spinal Cord Injury Core Data Set (version 2.0)-including standardization of reporting

    NARCIS (Netherlands)

    Biering-Sorensen, F.; DeVivo, M. J.; Charlifue, S.; Chen, Y.; New, P. W.; Noonan, V.; Post, M. W. M.; Vogel, L.

    Study design: The study design includes expert opinion, feedback, revisions and final consensus. Objectives: The objective of the study was to present the new knowledge obtained since the International Spinal Cord Injury (SCI) Core Data Set (Version 1.0) published in 2006, and describe the

  9. 2013 updated American Society of Clinical Oncology/Oncology Nursing Society chemotherapy administration safety standards including standards for the safe administration and management of oral chemotherapy.

    Science.gov (United States)

    Neuss, Michael N; Polovich, Martha; McNiff, Kristen; Esper, Peg; Gilmore, Terry R; LeFebvre, Kristine B; Schulmeister, Lisa; Jacobson, Joseph O

    2013-05-01

    In 2009, the American Society of Clinical Oncology (ASCO) and the Oncology Nursing Society (ONS) published standards for the safe use of parenteral chemotherapy in the outpatient setting, including issues of practitioner orders, preparation, and administration of medication. In 2011, these were updated to include inpatient facilities. In December 2011, a multistakeholder workgroup met to address the issues associated with orally administered antineoplastics, under the leadership of ASCO and ONS. The workgroup participants developed recommended standards, which were presented for public comment. Public comments informed final edits, and the final standards were reviewed and approved by the ASCO and ONS Boards of Directors. Significant newly identified recommendations include those associated with drug prescription and the necessity of ascertaining that prescriptions are filled. In addition, the importance of patient and family education regarding administration schedules, exception procedures, disposal of unused oral medication, and aspects of continuity of care across settings were identified. This article presents the newly developed standards.

  10. 2013 updated American Society of Clinical Oncology/Oncology Nursing Society chemotherapy administration safety standards including standards for the safe administration and management of oral chemotherapy.

    Science.gov (United States)

    Neuss, Michael N; Polovich, Martha; McNiff, Kristen; Esper, Peg; Gilmore, Terry R; LeFebvre, Kristine B; Schulmeister, Lisa; Jacobson, Joseph O

    2013-03-01

    In 2009, ASCO and the Oncology Nursing Society (ONS) published standards for the safe use of parenteral chemotherapy in the outpatient setting, including issues of practitioner orders, preparation, and administration of medication. In 2011, these were updated to include inpatient facilities. In December 2011, a multistakeholder workgroup met to address the issues associated with orally administered antineoplastics, under the leadership of ASCO and ONS. The workgroup participants developed recommended standards, which were presented for public comment. Public comments informed final edits, and the final standards were reviewed and approved by the ASCO and ONS Boards of Directors. Significant newly identified recommendations include those associated with drug prescription and the necessity of ascertaining that prescriptions are filled. In addition, the importance of patient and family education regarding administration schedules, exception procedures, disposal of unused oral medication, and aspects of continuity of care across settings were identified. This article presents the newly developed standards.

  11. Temperature standards, what and where: resources for effective temperature measurements

    International Nuclear Information System (INIS)

    Johnston, W.W. Jr.

    1982-01-01

    Many standards have been published to describe devices, methods, and other topics. How they are developed and by whom are briefly described, and an attempt is made to extract most of those relating to temperature measurements. A directory of temperature standards and their sources is provided

  12. Implementation of a consolidated, standardized database of environmental measurements data

    International Nuclear Information System (INIS)

    James, T.L.

    1996-10-01

    This report discusses the benefits of a consolidated and standardized database; reasons for resistance to the consolidation of data; implementing a consolidated database, including attempts at standardization, deciding what to include in the consolidated database, establishing lists of valid values, and addressing quality assurance/quality control (QA/QC) issues; and the evolution of a consolidated database, which includes developing and training a user community, resolving configuration control issues, incorporating historical data, identifying emerging standards, and developing pointers to other data. OREIS is used to illustrate these topics
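One of the standardization steps mentioned, establishing lists of valid values, amounts to screening each incoming record against a controlled vocabulary. A minimal sketch; the field names and value lists are hypothetical, not drawn from OREIS:

```python
# Hypothetical controlled vocabulary for a consolidated environmental database
VALID_VALUES = {
    "matrix": {"soil", "groundwater", "surface water", "air"},
    "units": {"mg/L", "ug/L", "pCi/L"},
}

def qc_violations(record):
    """Return the fields whose values are not in the controlled lists."""
    return [field for field, allowed in VALID_VALUES.items()
            if record.get(field) not in allowed]

print(qc_violations({"matrix": "soil", "units": "ppm"}))  # -> ['units']
```

Flagged records would then be held for correction before entering the consolidated database, which is the QA/QC role the report describes.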

  13. Portable radiation instrumentation traceability of standards and measurements

    International Nuclear Information System (INIS)

    Wiserman, A.; Walke, M.

    1995-01-01

    Portable radiation measuring instruments are used to estimate and control doses for workers. Calibration of these instruments must be sufficiently accurate to ensure that administrative and legal dose limits are not likely to be exceeded due to measurement uncertainties. An instrument calibration and management program is established which permits measurements made with an instrument to be traced to a national standard. This paper describes the establishment and maintenance of calibration standards for gamma survey instruments and an instrument management program which achieves traceability of measurement for uniquely identified field instruments. (author)

  14. Measurements and standards for bulk-explosives detection

    International Nuclear Information System (INIS)

    Hudson, Larry; Bateman, Fred; Bergstrom, Paul; Cerra, Frank; Glover, Jack; Minniti, Ronaldo; Seltzer, Stephen; Tosh, Ronald

    2012-01-01

Recent years have seen a dramatic expansion in the application of radiation and isotopes to security screening, driven primarily by increased incidents involving improvised explosive devices, their ease of assembly, and their leveraged disruption of transportation and commerce. With global expenditures for security-screening systems in the hundreds of billions of dollars, there is a pressing need to develop, apply, and harmonize standards for x-ray and gamma-ray screening systems used to detect explosives and other contraband. The National Institute of Standards and Technology has been facilitating the development of standard measurement tools that can be used to gauge the technical performance (imaging quality) and radiation safety of systems used to screen luggage, persons, vehicles, cargo, and left-behind objects. After a review of this new suite of national standard test methods, test objects, and radiation-measurement protocols, we highlight some of the technical trends that are informing the revision of baseline standards. Finally, we advocate a more intentional use of technical-performance standards by security stakeholders and outline the advantages that would accrue. Highlights: this work responds to the need for standards for x-ray screening systems used to detect explosives; it describes new measurement tools to gauge the performance and radiation safety of such systems; and it argues for a more intentional use of technical-performance standards by security stakeholders.

  15. The Australian national standard of measurement for radioactivity

    International Nuclear Information System (INIS)

    Buckman, S.M.

    1992-01-01

This contribution outlines the activities of the Radiation Standards Group at ANSTO, which is responsible, under the National Measurement Act 1960, for Australia's national standard of radioactivity measurement. The Group can make absolute measurements of radioactivity using a 4π β-γ coincidence counting system. Solutions standardised by this technique are then used to calibrate a TPA ionisation chamber, this chamber being the national working standard of activity measurement. All of the calibration factors determined for this chamber by direct measurement have been compared internationally through the Bureau International des Poids et Mesures (BIPM), either by participating in international intercomparisons organised by the BIPM or by submitting a standardised solution to the Système International de Référence. 11 refs.
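In the idealized 4π β-γ coincidence method, the detector efficiencies cancel, so the source activity follows from three measured count rates alone. A minimal sketch of that textbook relation; the numbers are illustrative, not ANSTO data:

```python
def coincidence_activity(rate_beta, rate_gamma, rate_coinc):
    """Source activity from idealized 4*pi beta-gamma coincidence counting.

    With N_beta = A*eps_b, N_gamma = A*eps_g and N_c = A*eps_b*eps_g,
    the efficiencies cancel:  A = N_beta * N_gamma / N_c.
    """
    return rate_beta * rate_gamma / rate_coinc

# e.g. 900 beta/s, 500 gamma/s, 450 coincidences/s -> 1000 Bq
print(coincidence_activity(900.0, 500.0, 450.0))
```

Real standardisations add corrections for dead time, background, and decay-scheme effects not shown here.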

  16. Developing a community-based flood resilience measurement standard

    Science.gov (United States)

    Keating, Adriana; Szoenyi, Michael; Chaplowe, Scott; McQuistan, Colin; Campbell, Karen

    2015-04-01

Given the increased attention to resilience strengthening in international humanitarian and development work, there has been concurrent interest in its measurement and in the overall accountability of resilience-strengthening initiatives. The literature is reaching beyond the polemic of defining resilience to its measurement, and donors increasingly expect organizations to go beyond claiming resilience programming to measuring and demonstrating it. Key questions must nevertheless be asked, in particular "resilience of whom and to what?". There is no one-size-fits-all solution: the approach to measuring resilience depends on the audience and the purpose of the measurement exercise, and a resilience measurement system needs to be derived from the specific question it seeks to answer. This session highlights key lessons from the Zurich Flood Resilience Alliance approach to developing a flood resilience measurement standard to measure and assess the impact of community-based flood resilience interventions and to inform decision-making to enhance the effectiveness of these interventions. We draw on experience in methodology development to date, together with lessons from application in two case-study sites in Latin America. Attention is given to the use of a consistent measurement methodology for community resilience to floods over time and place; the challenges of measuring a complex and dynamic phenomenon such as community resilience; the methodological implications of measuring community resilience versus impact on and contribution to this goal; and the use of measurement and tools such as cost-benefit analysis to prioritize and inform strategic decision-making for resilience interventions. The measurement tool follows the five categories of the Sustainable Livelihoods Framework and the 4Rs of complex adaptive systems (robustness, rapidity, redundancy and resourcefulness), a combination referred to as 5C-4R. A recent white paper by the Zurich Flood Resilience Alliance traces the

  17. Quantum-enhanced measurements: beating the standard quantum limit.

    Science.gov (United States)

    Giovannetti, Vittorio; Lloyd, Seth; Maccone, Lorenzo

    2004-11-19

    Quantum mechanics, through the Heisenberg uncertainty principle, imposes limits on the precision of measurement. Conventional measurement techniques typically fail to reach these limits. Conventional bounds to the precision of measurements such as the shot noise limit or the standard quantum limit are not as fundamental as the Heisenberg limits and can be beaten using quantum strategies that employ "quantum tricks" such as squeezing and entanglement.
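The two bounds mentioned differ in how phase precision scales with the number N of probe particles: 1/√N for the shot-noise (standard quantum) limit versus 1/N for the Heisenberg limit, which entangled strategies can approach. A minimal sketch of the scaling:

```python
import math

def shot_noise_limit(n):
    """Standard quantum (shot-noise) limit on phase precision: 1/sqrt(N)."""
    return 1.0 / math.sqrt(n)

def heisenberg_limit(n):
    """Heisenberg limit, approachable with entangled probes: 1/N."""
    return 1.0 / n

for n in (100, 10_000, 1_000_000):
    print(n, shot_noise_limit(n), heisenberg_limit(n))
```

At N = 10,000 probes the Heisenberg limit is a hundredfold improvement over the shot-noise limit, which is why squeezing and entanglement are worth the effort.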

  18. Humidity correction in the standard measurement of exposure

    International Nuclear Information System (INIS)

    Ibaraki, Yasuyuki; Katoh, Akira

    1980-01-01

This paper deals with the humidity correction to be applied to the measured ionization current in the standard measurement of exposure, in order to exclude the influence of water vapour, which is not included in the definition of exposure. First, formulae giving the humidity correction factors for a parallel-plate free-air chamber and for a cavity chamber are derived for the case where the contributions of air and water vapour to the ionization are independent. Next, for the case where the contributions are not independent, i.e., where the Jesse effect must be taken into account, a formula for the W-value of humid air is derived on the basis of Niatel's experimental results. Using this formula, formulae for the humidity correction factors of the free-air chamber and the cavity chamber are derived. The humidity correction factors calculated by the latter formulae agree well with the results of Niatel and of Guiho, respectively. (author)

  19. Standard Test Method for Measured Speed of Oil Diffusion Pumps

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1982-01-01

    1.1 This test method covers the determination of the measured speed (volumetric flow rate) of oil diffusion pumps. 1.2 The values stated in inch-pound units are to be regarded as the standard. The metric equivalents of inch-pound units may be approximate. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  20. Standard cartridges used in gamma spectrometry measurements of radioactive halogens

    International Nuclear Information System (INIS)

    Lepy, M.C.; Etcheverry, M.; Morel, J.; Chauvenet, B.

    1988-10-01

Activated charcoal cartridges are used to trap radioactive halogens contained in the gaseous effluents of nuclear facilities. Two types of standard cartridges, containing barium-133 or europium-152, are available: one model simulates a volume distribution of the radionuclides inside the cartridge, the other a surface distribution. They are characterized in terms of activity with an uncertainty lower than 5%. The conditions of use of the standard cartridges are specified and the main sources of measurement error are analyzed. The proper routine use of these standards should allow results with an accuracy better than 10% to be obtained.

  1. Performance Measurement of Management System Standards Using the Balanced Scorecard

    Directory of Open Access Journals (Sweden)

    Jan Kopia

    2017-11-01

Management system standards (MSS), such as the ISO standards, TQM, etc., are widely used standards adopted by millions of organizations worldwide. Whether these standards are beneficial for an organization, beyond the fact that they might be required or expected by law or by customers, remains an open question, and whether MSS increase the efficiency, the output, or the performance of an organization is still discussed in scientific research. One reason might be that performance measurement itself is not fully understood and is in constant development, ranging from purely financial evaluations over intellectual-capital ratings to calculations of levels of environmental, social or economic expectations known as the Triple Bottom Line. The Balanced Scorecard (BSC) is one possible solution for performance measurement at the strategic and operational levels and is therefore useful for measuring the influence of MSS within organizations. This study summarizes current research in the field of performance measurement in the context of MSS and integrated management systems (IMS) and the use of the BSC, and quantitatively and qualitatively tests the usefulness of the BSC in measuring the effect of MSS using the Execution Premium. It was found that the BSC is often used, that an average number of companies integrate the measurement initiatives of their MSS into the BSC process, and that a high integration of MSS into the BSC improves organizational performance. This research is useful for researchers and practitioners seeking to understand the benefits of using the BSC in the context of MSS or integrated management systems.

  2. 32 CFR 37.615 - What standards do I include for financial systems of for-profit firms?

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false What standards do I include for financial... SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Award Terms Affecting Participants' Financial, Property, and Purchasing Systems Financial Matters § 37.615 What...

  3. Possible standard specimens for neutron diffraction residual stress measurements

    International Nuclear Information System (INIS)

    Brand, P.C.; Prask, H.J.; Fields, R.J.; Blackburn, J.; Proctor, T.M.

    1995-01-01

Increasingly, sub-surface residual stress measurements by means of neutron diffraction are being conducted at various laboratories around the world. Unlike X-ray diffraction residual stress measurement setups, the neutron instruments in use worldwide vary widely in design, neutron flux, and level of dedication to residual stress measurements. Although confidence in the neutron technique has increased within the materials science and engineering communities, no demonstration of standardization or consistency between laboratories has been made. One step in the direction of such standardization is the development of standard specimens that have well-characterized residual stress states and that could be examined worldwide. In this paper the authors examine two options for a neutron stress standard specimen: (1) a steel ring-plug specimen with very well defined diametrical interference; (2) a spot weld in a high-strength low-alloy steel disk. The results of neutron residual stress measurements on these specimens are discussed, and conclusions as to their usefulness as neutron stress standards are presented.
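For the ring-plug option, the stress state of a same-material interference fit can be predicted from the Lamé thick-cylinder equations, which is what makes it attractive as a well-characterized standard. A minimal sketch of those classical results; the dimensions and interference below are illustrative assumptions, not the specimen's actual values:

```python
def contact_pressure(E, delta, b, c):
    """Interface pressure for a solid plug shrunk into a ring (same material).

    Lame thick-cylinder result: p = (E*delta/b) * (c^2 - b^2) / (2*c^2),
    where delta = radial interference, b = interface radius, c = ring outer radius.
    """
    return (E * delta / b) * (c**2 - b**2) / (2 * c**2)

def ring_hoop_stress_at_bore(p, b, c):
    """Tensile hoop stress in the ring at the interface (Lame solution)."""
    return p * (c**2 + b**2) / (c**2 - b**2)

# Illustrative numbers: steel E = 200 GPa, b = 25 mm, c = 50 mm,
# radial interference 10 micrometres
p = contact_pressure(200e9, 10e-6, 0.025, 0.050)
print(p / 1e6, "MPa interface pressure")          # -> 30.0 MPa
print(ring_hoop_stress_at_bore(p, 0.025, 0.050) / 1e6, "MPa hoop stress")  # -> 50.0 MPa
```

Because these stresses follow directly from a measured diametrical interference, the specimen's reference stress state is known independently of any diffraction measurement.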

  4. Constraining new physics with collider measurements of Standard Model signatures

    Energy Technology Data Exchange (ETDEWEB)

    Butterworth, Jonathan M. [Department of Physics and Astronomy, University College London,Gower St., London, WC1E 6BT (United Kingdom); Grellscheid, David [IPPP, Department of Physics, Durham University,Durham, DH1 3LE (United Kingdom); Krämer, Michael; Sarrazin, Björn [Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University,Sommerfeldstr. 16, 52056 Aachen (Germany); Yallup, David [Department of Physics and Astronomy, University College London,Gower St., London, WC1E 6BT (United Kingdom)

    2017-03-14

A new method providing general consistency constraints for Beyond-the-Standard-Model (BSM) theories, using measurements at particle colliders, is presented. The method, ‘Constraints On New Theories Using Rivet’, CONTUR, exploits the fact that particle-level differential measurements made in fiducial regions of phase-space have a high degree of model-independence. These measurements can therefore be compared to BSM physics implemented in Monte Carlo generators in a very generic way, allowing a wider array of final states to be considered than is typically the case. The CONTUR approach should be seen as complementary to the discovery potential of direct searches, being designed to eliminate inconsistent BSM proposals in a context where many (but perhaps not all) measurements are consistent with the Standard Model. We demonstrate, using a competitive simplified dark matter model, the power of this approach. The CONTUR method is highly scalable to other models and future measurements.
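CONTUR's underlying logic is that a BSM contribution large enough to spoil the agreement between a measurement and the Standard Model prediction is disfavored. A deliberately crude single-bin Gaussian sketch of that idea; CONTUR's actual statistical treatment is considerably more sophisticated:

```python
def excluded_95(signal, measured, sm_prediction, sigma):
    """Crude Gaussian test: is SM + signal more than 1.96 sigma from the data?

    Illustrates how agreement with the SM limits BSM contributions;
    not CONTUR's real likelihood-based procedure.
    """
    pull = abs((sm_prediction + signal) - measured) / sigma
    return pull > 1.96

# Data agrees with the SM (measured == prediction): a 3-sigma signal in
# this bin would be excluded at ~95% CL, a 1-sigma signal would not.
print(excluded_95(3.0, 100.0, 100.0, 1.0))  # -> True
print(excluded_95(1.0, 100.0, 100.0, 1.0))  # -> False
```

Applied across the many fiducial measurements in Rivet, this kind of test rules out regions of BSM parameter space without any dedicated search.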

  5. A Network for Standardized Ocean Color Validation Measurements

    Science.gov (United States)

Zibordi, Giuseppe; Holben, Brent; Hooker, Stanford; Melin, Frederic; Berthon, Jean-Francois; Slutsker, Ilya; Giles, David; Vandemark, Doug; Feng, Hui; Rutledge, Ken

    2006-01-01

The Aerosol Robotic Network (AERONET) was developed to support atmospheric studies at various scales with measurements from worldwide distributed autonomous sun photometers [Holben et al. 1998]. AERONET has now extended its support to marine applications through the additional capability of measuring the radiance emerging from the sea with modified sun photometers installed on offshore platforms like lighthouses, navigation aids, and oceanographic and oil towers. The functionality of this added network component, called AERONET - Ocean Color (AERONET-OC), has been verified at different sites and deployment structures over a four-year testing phase. Continuous or occasional deployment platforms (see Fig. 1) included: the Acqua Alta Oceanographic Tower (AAOT) of the Italian National Research Council in the northern Adriatic Sea since spring 2002; the Martha's Vineyard Coastal Observatory (MVCO) tower of the Woods Hole Oceanographic Institution in the Atlantic off the Massachusetts coast for different periods since spring 2004; the TOTAL Abu-Al-Bukhoosh oil platform (AABP, shown through an artistic rendition in Fig. 1) in the Persian (Arabian) Gulf in fall 2004; the Gustaf Dalén Lighthouse Tower (GDLT) of the Swedish Maritime Administration in the Baltic Sea in summer 2005; and the platform at the Clouds and the Earth's Radiant Energy System (CERES) Ocean Validation Experiment (COVE) site located in the Atlantic Ocean off the Virginia coast since fall 2005. Data collected during the network testing phase confirm the capability of AERONET-OC to support the validation of marine optical remote sensing products through standardized measurements of normalized water-leaving radiance, LWN, and aerosol optical thickness, τa, at multiple coastal sites.

  6. Overview of the Standard Model Measurements with the ATLAS Detector

    CERN Document Server

    Liu, Yanwen; The ATLAS collaboration

    2017-01-01

    The ATLAS Collaboration is engaged in precision measurements of fundamental Standard Model parameters, such as the W boson mass, the weak mixing angle and the strong coupling constant. In addition, the production cross-sections of a large variety of final states involving highly energetic jets, photons, and single and multiple vector bosons are measured multi-differentially at several center-of-mass energies. This allows perturbative QCD calculations to be tested to the highest precision. These measurements also allow tests of models beyond the SM, e.g. those leading to anomalous gauge couplings. In this talk, we give a broad overview of the Standard Model measurement campaign of the ATLAS Collaboration, with selected topics discussed in more detail.

  7. ATLAS Standard Model Measurements Using Jet Grooming and Substructure

    CERN Document Server

    Ucchielli, Giulia; The ATLAS collaboration

    2017-01-01

    Boosted topologies allow Standard Model processes to be explored in kinematic regimes never tested before. In such challenging LHC environments, standard reconstruction techniques quickly reach their limits. Targeting hadronic final states requires properly reconstructing the energy and multiplicity of the jets in the event. In order to identify the decay products of boosted objects, e.g. W bosons, $t\\bar{t}$ pairs or Higgs bosons produced in association with $t\\bar{t}$ pairs, the ATLAS experiment is currently exploiting several algorithms using jet grooming and jet substructure. This contribution mainly covers the following ATLAS measurements: the $t\\bar{t}$ differential production cross-section and the jet mass using the soft-drop procedure. Standard Model measurements offer the perfect testing ground for new jet-tagging techniques, which will become even more important in the search for new physics in highly boosted topologies.

  8. Standard Test Method for Measuring Heat Flux Using a Water-Cooled Calorimeter

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This test method covers the measurement of a steady heat flux to a given water-cooled surface by means of a system energy balance. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.
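    The system energy balance behind this test method can be sketched in a few lines; the function, variable names, and numbers below are illustrative assumptions, not values taken from the standard.

    ```python
    # Steady heat flux to a water-cooled surface from an energy balance:
    #   q'' = m_dot * c_p * (T_out - T_in) / A
    # All values below are illustrative, not from the ASTM method.

    def heat_flux(m_dot, c_p, t_in, t_out, area):
        """Heat flux (W/m^2) absorbed by the cooling water.

        m_dot : water mass flow rate (kg/s)
        c_p   : specific heat of water (J/(kg*K))
        t_in, t_out : inlet/outlet water temperatures (K or degC)
        area  : exposed, cooled surface area (m^2)
        """
        return m_dot * c_p * (t_out - t_in) / area

    # Example: 0.05 kg/s of water warming by 4 K over a 0.01 m^2 surface
    q = heat_flux(0.05, 4186.0, 293.15, 297.15, 0.01)
    print(q)  # ~83720 W/m^2
    ```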

  9. Conditional Standard Errors of Measurement for Scale Scores.

    Science.gov (United States)

    Kolen, Michael J.; And Others

    1992-01-01

    A procedure is described for estimating the reliability and conditional standard errors of measurement of scale scores incorporating the discrete transformation of raw scores to scale scores. The method is illustrated using a strong true score model, and practical applications are described. (SLD)

  10. Reconstruction of 6 MV photon spectra from measured transmission including maximum energy estimation.

    Science.gov (United States)

    Baker, C R; Peck, K K

    1997-11-01

    Photon spectra from a nominally 6 MV beam under standard clinical conditions and at higher and lower beam qualities have been derived from narrow-beam transmission measurements using a previously published three-parameter reconstruction model. Estimates of the maximum photon energy present in each spectrum were derived using a reduced number of model parameters. An estimate of the maximum contribution of background, or room, scatter to transmission measurements has been made for this study and is shown to be negligible in terms of the quality index and percentage depth-dose of the derived spectra. Percentage depth-dose data for standard beam conditions derived from the reconstructed spectrum were found to agree with direct measurements to within approximately 1% for depths of up to 25 cm in water. Quality indices expressed in terms of TPR(20,10) for all spectra were found to agree with directly measured values to within 1%. The experimental procedure and reconstruction model are therefore shown to produce photon spectra whose derived quality indices and percentage depth-dose values agree with direct measurement to within expected experimental uncertainty.

  11. Vitamin D measurement standardization: The way out of the chaos.

    Science.gov (United States)

    Binkley, N; Dawson-Hughes, B; Durazo-Arvizu, R; Thamm, M; Tian, L; Merkel, J M; Jones, J C; Carter, G D; Sempos, C T

    2017-10-01

    Substantial variability is associated with laboratory measurement of serum total 25-hydroxyvitamin D [25(OH)D]. The resulting chaos impedes development of consensus 25(OH)D values to define stages of vitamin D status. As resolving this situation requires standardized measurement of 25(OH)D, the Vitamin D Standardization Program (VDSP) developed methodology to standardize 25(OH)D measurement to the gold-standard reference measurement procedures of NIST, Ghent University and CDC. Importantly, VDSP developed protocols for standardizing 25(OH)D values from prior research based on availability of stored serum samples. The effect of such retrospective standardization on the prevalence of "low" vitamin D status in national studies, reported here for the Third National Health and Nutrition Examination Survey (NHANES III, 1988-1994) and the German Health Interview and Examination Survey for Children and Adolescents (KIGGS, 2003-2006), was such that standardized 25(OH)D values were lower than the original values in NHANES III and higher in KIGGS. In NHANES III the percentage with values below 30, 50 and 75 nmol/L increased from 4% to 6%, 22% to 31% and 55% to 71%, respectively, whereas in KIGGS after standardization the percentage below 30, 50, and 70 nmol/L decreased from 28% to 13%, 64% to 47% and 87% to 85%, respectively. Moreover, in a hypothetical example, depending on whether the 25(OH)D assay was positively or negatively biased by 12%, the 25(OH)D concentration which maximally suppressed PTH could vary from 20 to 35 ng/mL. These examples underscore the challenges (perhaps impossibility) of developing vitamin D guidelines using unstandardized 25(OH)D data. Retrospective 25(OH)D standardization can be applied to old studies where stored serum samples exist. As a way forward, we suggest an international effort to identify key prior studies with stored samples for re-analysis and standardization, initially to define the 25(OH)D level associated with vitamin D deficiency (rickets

  12. Measuring Outcomes in Adult Weight Loss Studies That Include Diet and Physical Activity: A Systematic Review

    OpenAIRE

    Millstein, Rachel A.

    2014-01-01

    Background. Measuring success of obesity interventions is critical. Several methods measure weight loss outcomes but there is no consensus on best practices. This systematic review evaluates relevant outcomes (weight loss, BMI, % body fat, and fat mass) to determine which might be the best indicator(s) of success. Methods. Eligible articles described adult weight loss interventions that included diet and physical activity and a measure of weight or BMI change and body composition change. Resu...

  13. Analysis and Comparison of Thickness and Bending Measurements from Fabric Touch Tester (FTT) and Standard Methods

    Directory of Open Access Journals (Sweden)

    Musa Atiyyah Binti Haji

    2018-03-01

    Full Text Available Fabric Touch Tester (FTT) is a relatively new device from SDL Atlas to determine the touch properties of fabrics. It simultaneously measures 13 touch-related fabric physical properties in four modules that include bending and thickness measurements. This study aims to comparatively analyze the thickness and bending measurements made by the FTT and the common standard methods used in the textile industry. The results obtained with the FTT for 11 different fabrics were compared with those of the standard methods. Despite the different measurement principles, a good correlation was found between the two methods used for the assessment of thickness and bending. As the FTT is a new tool for textile comfort measurement and no standard yet exists, these findings are essential to determine the reliability of the measurements and how they relate to the well-established standard methods.

  14. Applying OGC Standards to Develop a Land Surveying Measurement Model

    Directory of Open Access Journals (Sweden)

    Ioannis Sofos

    2017-02-01

    Full Text Available The Open Geospatial Consortium (OGC) is committed to developing quality open standards for the global geospatial community, thus enhancing the interoperability of geographic information. In the domain of sensor networks, the Sensor Web Enablement (SWE) initiative has been developed to define the necessary context by introducing modeling standards, like ‘Observation & Measurement’ (O&M), and services to provide interaction, like the ‘Sensor Observation Service’ (SOS). Land surveying measurements, on the other hand, comprise a domain where observation information structures and services have not been aligned to the OGC observation model. In this paper, an OGC-compatible model for land surveying observations, aligned to the ‘Observation and Measurements’ standard, has been developed and discussed. Furthermore, a case study instantiates the above model, and an SOS implementation has been developed based on the 52°North SOS platform. Finally, a visualization schema is used to produce ‘Web Map Service’ (WMS) observation maps. Even though there are elements that differentiate this work from classic ‘O&M’ modeling cases, the proposed model and flows are developed in order to provide the benefits of standardizing land surveying measurement data (cost reduction through reusability, higher precision, data fusion from multiple sources, access to a spatiotemporal repository of raw observations, and development of Measurement-Based GIS (MBGIS)) to the geoinformation community.

  15. Standard guide for use of modeling for passive gamma measurements

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This guide addresses the use of models with passive gamma-ray measurement systems. Mathematical models based on physical principles can be used to assist in calibration of gamma-ray measurement systems and in analysis of measurement data. Some nondestructive assay (NDA) measurement programs involve the assay of a wide variety of item geometries and matrix combinations for which the development of physical standards are not practical. In these situations, modeling may provide a cost-effective means of meeting user’s data quality objectives. 1.2 A scientific knowledge of radiation sources and detectors, calibration procedures, geometry and error analysis is needed for users of this standard. This guide assumes that the user has, at a minimum, a basic understanding of these principles and good NDA practices (see Guide C1592), as defined for an NDA professional in Guide C1490. The user of this standard must have at least a basic understanding of the software used for modeling. Instructions or further train...

  16. Standardization of Solar Mirror Reflectance Measurements - Round Robin Test: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Meyen, S.; Lupfert, E.; Fernandez-Garcia, A.; Kennedy, C.

    2010-10-01

    Within the SolarPaces Task III standardization activities, DLR, CIEMAT, and NREL have concentrated on optimizing the procedure to measure the reflectance of solar mirrors. From this work, the laboratories have developed a clear definition of the method and requirements needed of commercial instruments for reliable reflectance results. A round robin test was performed between the three laboratories with samples that represent all of the commercial solar mirrors currently available for concentrating solar power (CSP) applications. The results show surprisingly large differences in hemispherical reflectance (ρh) of 0.007 and specular reflectance (ρs) of 0.004 between the laboratories. These differences indicate the importance of minimum instrument requirements and standardized procedures. Based on these results, the optimal procedure will be formulated and validated with a new round robin test in which better accuracy is expected. Improved instruments and reference standards are needed to reach the necessary accuracy for cost and efficiency calculations.

  17. Measuring Outcomes in Adult Weight Loss Studies That Include Diet and Physical Activity: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Rachel A. Millstein

    2014-01-01

    Full Text Available Background. Measuring success of obesity interventions is critical. Several methods measure weight loss outcomes but there is no consensus on best practices. This systematic review evaluates relevant outcomes (weight loss, BMI, % body fat, and fat mass) to determine which might be the best indicator(s) of success. Methods. Eligible articles described adult weight loss interventions that included diet and physical activity and a measure of weight or BMI change and body composition change. Results. 28 full-text articles met inclusion criteria. Subjects, settings, intervention lengths, and intensities varied. All studies measured body weight (−2.9 to −17.3 kg), 9 studies measured BMI (−1.1 to −5.1 kg/m2), 20 studies measured % body fat (−0.7 to −10.2%), and 22 studies measured fat mass (−0.9 to −14.9 kg). All studies found agreement between weight or BMI and body fat mass or body fat % decreases, though there were discrepancies in degree of significance between measures. Conclusions. Nearly all weight or BMI and body composition measures agreed. Since body fat is the most metabolically harmful tissue type, it may be a more meaningful measure of health change. Future studies should consider primarily measuring % body fat, rather than or in addition to weight or BMI.

  18. Experience with local lymph node assay performance standards using standard radioactivity and nonradioactive cell count measurements.

    Science.gov (United States)

    Basketter, David; Kolle, Susanne N; Schrage, Arnhild; Honarvar, Naveed; Gamer, Armin O; van Ravenzwaay, Bennard; Landsiedel, Robert

    2012-08-01

    The local lymph node assay (LLNA) is the preferred test for identification of skin-sensitizing substances by measuring radioactive thymidine incorporation into the lymph node. To facilitate acceptance of nonradioactive variants, validation authorities have published harmonized minimum performance standards (PS) that an alternative endpoint assay must meet. In the present work, these standards were applied to a variant of the LLNA based on lymph node cell counts (LNCC), run in parallel as a control with the standard LLNA with radioactivity measurements, with threshold concentrations (EC3) being determined for the sensitizers. Of the 22 PS chemicals tested in this study, 21 yielded the same results from standard radioactivity and cell count measurements; only 2-mercaptobenzothiazole was positive by LLNA but negative by LNCC. Of the 16 PS positives, 15 were positive by LLNA and 14 by LNCC; methylmethacrylate was not identified as a sensitizer by either of the measurements. Two of the six PS negatives tested negative in our study by both LLNA and LNCC. Of the four PS negatives which were positive in our study, chlorobenzene and methyl salicylate were tested at higher concentrations than in the published PS, whereas the corresponding concentrations gave consistently negative results. Methylmethacrylate and nickel chloride tested positive within the concentration range used for the published PS. The results indicate that cell counts and radioactivity measurements are in good agreement within the same LLNA for the 22 PS test substances. Comparisons with the published PS results may, however, require balanced analysis rather than a simple checklist approach. Copyright © 2011 John Wiley & Sons, Ltd.
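    The EC3 values mentioned above are conventionally obtained by linear interpolation between the two tested concentrations whose stimulation indices bracket SI = 3; the sketch below illustrates that interpolation with hypothetical dose-response numbers, not data from this study.

    ```python
    # Conventional LLNA EC3 estimate: linearly interpolate the
    # concentration at which the stimulation index (SI) crosses 3,
    # using the two tested concentrations bracketing that threshold.
    # The dose-response values below are hypothetical.

    def ec3(conc_below, si_below, conc_above, si_above, threshold=3.0):
        """Interpolated concentration where SI reaches the threshold."""
        frac = (threshold - si_below) / (si_above - si_below)
        return conc_below + frac * (conc_above - conc_below)

    # Hypothetical data: SI = 2.1 at 5% and SI = 4.5 at 10%
    print(ec3(5.0, 2.1, 10.0, 4.5))  # ~6.875 (% concentration)
    ```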

  19. Estimating Wet Bulb Globe Temperature Using Standard Meteorological Measurements

    International Nuclear Information System (INIS)

    Hunter, C.H.

    1999-01-01

    The heat stress management program at the Department of Energy's Savannah River Site (SRS) requires implementation of protective controls on outdoor work based on observed values of wet bulb globe temperature (WBGT). To ensure continued compliance with heat stress program requirements, a computer algorithm was developed which calculates an estimate of WBGT using standard meteorological measurements. In addition, scripts were developed to generate a calculation every 15 minutes and post the results to an Intranet web site
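    The abstract does not give the SRS algorithm itself. As a hedged illustration, outdoor WBGT is conventionally a weighted sum of the natural wet bulb, globe, and dry bulb temperatures (e.g. ISO 7243); an algorithm like the one described would apply this combination after estimating the component temperatures from standard meteorological measurements. The input values below are illustrative.

    ```python
    # Conventional outdoor WBGT weighting (e.g. ISO 7243):
    #   WBGT = 0.7*T_nwb + 0.2*T_globe + 0.1*T_dry
    # This sketch shows only the final combination step, not the SRS
    # estimation of T_nwb and T_globe from standard measurements.

    def wbgt_outdoor(t_nwb, t_globe, t_dry):
        """Outdoor WBGT (degC) from natural wet bulb, globe, and
        dry bulb temperatures (degC)."""
        return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_dry

    print(wbgt_outdoor(25.0, 45.0, 32.0))  # ~29.7 degC
    ```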

  20. Development of measurement standards for verifying functional performance of surface texture measuring instruments

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, A [Life and Industrial Product Development Department Olympus Corporation, 2951 Ishikawa-machi, Hachiouji-shi, Tokyo (Japan); Suzuki, H [Industrial Marketing and Planning Department Olympus Corporation, Shinjyuku Monolith, 3-1 Nishi-Shinjyuku 2-chome, Tokyo (Japan); Yanagi, K, E-mail: a_fujii@ot.olympus.co.jp [Department of Mechanical Engineering, Nagaoka University of Technology, 1603-1 Kamitomioka-machi, Nagaoka-shi, Niigata (Japan)

    2011-08-19

    A new measurement standard is proposed for verifying the overall functional performance of surface texture measuring instruments. Its surface is composed of sinusoidal waveforms of chirp signals along horizontal cross sections of the material measure. One of the notable features is that the amplitude of each cycle in the chirp signal is geometrically modulated so that the maximum slope is kept constant. The maximum slope of the chirp-like signal is gradually decreased with movement in the lateral direction. We fabricated the measurement standard by FIB processing, and it was calibrated by AFM. We evaluated the functional performance of a laser scanning microscope against this standard in terms of amplitude response with varying slope angles. As a result, it was concluded that the proposed standard can easily evaluate the performance of surface texture measuring instruments.
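    The amplitude-modulation idea can be sketched numerically: for a local sinusoid of amplitude a and wavelength λ, the maximum slope is 2πa/λ, so holding the maximum slope at s requires a = sλ/(2π). The sketch below only illustrates this geometric principle; it is not the authors' fabrication recipe, and all numbers are assumed.

    ```python
    import math

    # For a sinusoid a*sin(2*pi*x/lam), the maximum slope is 2*pi*a/lam.
    # Holding the maximum slope at s therefore requires a = s*lam/(2*pi):
    # amplitude grows linearly with wavelength at fixed slope.

    def amplitude_for_slope(wavelength, max_slope):
        return max_slope * wavelength / (2.0 * math.pi)

    def chirp_profile(wavelengths, max_slope, samples_per_cycle=100):
        """One sinusoid cycle per wavelength, each cycle's amplitude
        chosen so the maximum slope stays at max_slope."""
        profile = []
        for lam in wavelengths:
            a = amplitude_for_slope(lam, max_slope)
            for i in range(samples_per_cycle):
                profile.append(a * math.sin(2.0 * math.pi * i / samples_per_cycle))
        return profile

    print(amplitude_for_slope(10.0, 0.5))  # ~0.7958
    ```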

  1. A spectroscopic transfer standard for accurate atmospheric CO measurements

    Science.gov (United States)

    Nwaboh, Javis A.; Li, Gang; Serdyukov, Anton; Werhahn, Olav; Ebert, Volker

    2016-04-01

    Atmospheric carbon monoxide (CO) is a precursor of essential climate variables and has an indirect effect of enhancing global warming. Accurate and reliable measurements of atmospheric CO concentration are becoming indispensable. WMO-GAW reports state a compatibility goal of ±2 ppb for atmospheric CO concentration measurements. Therefore, the EMRP-HIGHGAS (European metrology research program - high-impact greenhouse gases) project aims at developing spectroscopic transfer standards for CO concentration measurements to meet this goal. A spectroscopic transfer standard would provide results that are directly traceable to the SI, would be very useful for calibration of devices operating in the field, and could complement classical gas standards in the field, where calibration gas mixtures in bottles often are not accurate, available or stable enough [1][2]. Here, we present our new direct tunable diode laser absorption spectroscopy (dTDLAS) sensor capable of performing absolute ("calibration-free") CO concentration measurements and being operated as a spectroscopic transfer standard. To achieve the compatibility goal stated by WMO for CO concentration measurements and to ensure the traceability of the final concentration results, traceable spectral line data, especially line intensities with appropriate uncertainties, are needed. Therefore, we utilize our new high-resolution Fourier-transform infrared (FTIR) spectroscopy CO line data for the 2-0 band, with significantly reduced uncertainties, for the dTDLAS data evaluation. Further, we demonstrate the capability of our sensor for atmospheric CO measurements, discuss the uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) principles, and show that CO concentrations derived using the sensor, based on the TILSAM (traceable infrared laser spectroscopic amount fraction measurement) method, are in excellent agreement with gravimetric values. Acknowledgement Parts of this work have been
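    The essence of a "calibration-free" dTDLAS evaluation is the Beer-Lambert relation: the integrated absorbance A, line intensity S(T), and path length L give the absorber number density n = A/(S·L), which the ideal gas law converts to an amount fraction. The sketch below illustrates this chain with round, assumed numbers; it is not the TILSAM implementation or HIGHGAS data.

    ```python
    # "Calibration-free" TDLAS evaluation in a nutshell:
    #   n = A / (S(T) * L)      absorber number density (cm^-3)
    #   x = n * kB * T / p      amount fraction (mole fraction)
    # with A the integrated absorbance (cm^-1), S the line intensity
    # (cm^-1 / (molecule cm^-2)), and L the path length (cm).
    # All input values below are illustrative assumptions.

    KB = 1.380649e-23  # Boltzmann constant, J/K

    def amount_fraction(area, line_strength, path_cm, temp_k, pressure_pa):
        n_cm3 = area / (line_strength * path_cm)  # molecules per cm^3
        n_m3 = n_cm3 * 1e6                        # convert to per m^3
        return n_m3 * KB * temp_k / pressure_pa

    x = amount_fraction(area=1.0e-6, line_strength=2.0e-23,
                        path_cm=100.0, temp_k=296.0, pressure_pa=101325.0)
    print(x)  # ~2.0e-5, i.e. about 20 ppm
    ```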

  2. Spectroscopic metrology for isotope composition measurements and transfer standards

    Science.gov (United States)

    Anyangwe Nwaboh, Javis; Balslev-Harder, David; Kääriäinen, Teemu; Richmond, Craig; Manninen, Albert; Mohn, Joachim; Kiseleva, Maria; Petersen, Jan C.; Werhahn, Olav; Ebert, Volker

    2017-04-01

    The World Meteorological Organization (WMO) has identified greenhouse gases such as CO2, CH4 and N2O as critical for global climate monitoring. Other molecules, such as CO, which has an indirect effect of enhancing global warming, are also monitored. WMO has stated compatibility goals for atmospheric concentration and isotope ratio measurements of these gases, e.g. 0.1 ppm for CO2 concentration measurements in the northern hemisphere and 0.01 ‰ for δ13C-CO2. For measurements of the concentration of greenhouse gases, gas analysers are typically calibrated with static gas standards, e.g. traceable to the WMO scale or to the International System of Units (SI) through a national metrology institute. However, concentrations of target components, e.g. CO, in static gas standards have been observed to drift, and typically the gas matrix as well as the isotopic composition of the target component does not always reflect field gas composition, leading to deviations of the analyser response even after calibration. These deviations are dependent on the measurement technique. To address this issue, part of the HIGHGAS (Metrology for high-impact greenhouse gases) project [1] focused on the development of optical transfer standards (OTSs) for greenhouse gases, e.g. CO2 and CO, potentially complementing gas standards. Isotope ratio mass spectrometry (IRMS) [2] is currently used to provide state-of-the-art high-precision (in the 0.01 ‰ range) measurements of the isotopic composition of greenhouse gases. However, there is a need for field-deployable techniques, such as optical isotope ratio spectroscopy (OIRS), that can be combined with metrological measurement methods. Within the HIGHGAS project, OIRS methods and procedures based on e.g. cavity enhanced spectroscopy (CES) and tunable diode laser absorption spectroscopy (TDLAS), matched to metrological principles, have been established for the measurement of 13C/12C and 18O/16O ratios in CO2, 15N/14N ratios in N2O, and 13C/12C and 2H
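    Isotope results like the δ13C-CO2 values discussed above are reported in delta notation relative to a reference ratio; a minimal sketch follows, where the sample ratio is hypothetical and R_VPDB ≈ 0.011180 is the commonly quoted 13C/12C value of the VPDB reference.

    ```python
    # Delta notation for isotope ratios, in per mil:
    #   delta = (R_sample / R_ref - 1) * 1000
    # R_VPDB below is the commonly quoted 13C/12C reference ratio;
    # the sample ratio in the example is hypothetical.

    R_VPDB = 0.011180

    def delta13c_permil(r_sample, r_ref=R_VPDB):
        return (r_sample / r_ref - 1.0) * 1000.0

    print(delta13c_permil(0.011090))  # ~-8.05 per mil
    ```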

  3. Investigation of the language tasks to include in a short-language measure for children in the early school years.

    Science.gov (United States)

    Matov, Jessica; Mensah, Fiona; Cook, Fallon; Reilly, Sheena

    2018-02-18

    The inaccurate estimation of language difficulties by teachers suggests the benefit of a short-language measure that could be used to support their decisions about who requires referral to a speech-language therapist. While the literature indicates the potential for the development of a short-language measure, evidence is lacking about which combination of language tasks it should include. To understand the number and nature of components/language tasks that should be included in a short-language measure for children in the early school years. Eight language tasks were administered to participants of the Early Language in Victoria Study (ELVS) at ages 5 (n = 995) and 7 (n = 1217). These included six language tasks measured by an omnibus language measure (which comprised a direction-following, morphological-completion, sentence-recall, sentence-formation, syntactic-understanding and word-association task) and a non-word repetition and a receptive vocabulary task, measured by two task-specific language measures. Scores were analyzed using principal component analysis (PCA), the Bland and Altman method, and receiver operating characteristic (ROC) curve analysis. PCA revealed one main component of language that was assessed by all language tasks. The most effective combination of two tasks that measured this component was a direction-following and a sentence-recall task. It showed the greatest agreement with an omnibus language measure and exceeded the criterion for good discriminant accuracy (sensitivity = 94%, specificity = 91%, accuracy = 91%, at 1 SD (standard deviation) below the mean). Findings support the combination of a direction-following and a sentence-recall task to assess language ability effectively in the early school years. The results could justify the future production of a novel short-language measure comprising a direction-following and a sentence-recall task to use as a screening tool in schools and to assess language ability in research
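    The sensitivity and specificity figures quoted above come from a standard 2x2 screening table; the sketch below shows the computation with hypothetical counts (not ELVS data) for a cut-off such as 1 SD below the mean.

    ```python
    # Discriminant accuracy of a screening cut-off summarized by
    # sensitivity and specificity from a 2x2 table. Counts are
    # hypothetical, chosen only to illustrate the calculation.

    def sensitivity(true_pos, false_neg):
        """Proportion of truly affected children flagged by the screen."""
        return true_pos / (true_pos + false_neg)

    def specificity(true_neg, false_pos):
        """Proportion of typically developing children correctly passed."""
        return true_neg / (true_neg + false_pos)

    # Hypothetical: 94 of 100 low-language children flagged,
    # 910 of 1000 typical children correctly passed.
    print(sensitivity(94, 6))    # 0.94
    print(specificity(910, 90))  # 0.91
    ```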

  4. Alcohol intake and colorectal cancer: a comparison of approaches for including repeated measures of alcohol consumption

    DEFF Research Database (Denmark)

    Thygesen, Lau Caspar; Wu, Kana; Grønbaek, Morten

    2008-01-01

    BACKGROUND: In numerous studies, alcohol intake has been found to be positively associated with colorectal cancer risk. However, the majority of studies included only one exposure measurement, which may bias the results if long-term intake is relevant. METHODS: We compared different approaches for including repeated measures of alcohol intake among 47,432 US men enrolled in the Health Professionals Follow-up Study. Questionnaires including questions on alcohol intake had been completed in 1986, 1990, 1994, and 1998. The outcome was incident colorectal cancer during follow-up from 1986 to 2002. RESULTS: During follow-up, 868 members of the cohort experienced colorectal cancer. Baseline, updated, and cumulative average alcohol intakes were positively associated with colorectal cancer, with only minor differences among the approaches. These results support moderately increased risk for intake >30 g...
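    The three approaches compared in this study can be made concrete with a small sketch on hypothetical intake data (grams of alcohol per day reported at each questionnaire wave); the function names are ours, not from the paper.

    ```python
    # Three common ways to use repeated exposure measurements:
    # "baseline" uses only the first report, "updated" uses the most
    # recent report available at a given wave, and the cumulative
    # average uses the mean of all reports up to that wave.
    # The intake values below are hypothetical.

    def baseline(intakes):
        return intakes[0]

    def updated(intakes, wave):
        # most recent measurement available at a given wave (0-indexed)
        return intakes[wave]

    def cumulative_average(intakes, wave):
        values = intakes[: wave + 1]
        return sum(values) / len(values)

    intake = [10.0, 20.0, 30.0, 20.0]  # e.g. 1986, 1990, 1994, 1998
    print(baseline(intake))               # 10.0
    print(updated(intake, 2))             # 30.0
    print(cumulative_average(intake, 2))  # 20.0
    ```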

  5. Characterization of textile electrodes and conductors using standardized measurement setups

    International Nuclear Information System (INIS)

    Beckmann, L; Neuhaus, C; Medrano, G; Walter, M; Leonhardt, S; Jungbecker, N; Gries, T

    2010-01-01

    Textile electrodes and conductors are being developed and used in different monitoring scenarios, such as ECG or bioimpedance spectroscopy measurements. Compared to standard materials, conductive textile materials offer improved wearing comfort and enable long-term measurements. Unfortunately, the development and investigation of such materials often suffers from the non-reproducibility of the test scenarios. For example, the materials are generally tested on human skin which is difficult since the properties of human skin differ for each person and can change within hours. This study presents two test setups which offer reproducible measurement procedures for the systematic analysis of textile electrodes and conductors. The electrode test setup was designed with a special skin dummy which allows investigation of not only the electrical properties of textile electrodes but also the contact behavior between electrode and skin. Using both test setups, eight textile electrodes and five textile conductors were analyzed and compared

  6. Characterization of textile electrodes and conductors using standardized measurement setups.

    Science.gov (United States)

    Beckmann, L; Neuhaus, C; Medrano, G; Jungbecker, N; Walter, M; Gries, T; Leonhardt, S

    2010-02-01

    Textile electrodes and conductors are being developed and used in different monitoring scenarios, such as ECG or bioimpedance spectroscopy measurements. Compared to standard materials, conductive textile materials offer improved wearing comfort and enable long-term measurements. Unfortunately, the development and investigation of such materials often suffers from the non-reproducibility of the test scenarios. For example, the materials are generally tested on human skin which is difficult since the properties of human skin differ for each person and can change within hours. This study presents two test setups which offer reproducible measurement procedures for the systematic analysis of textile electrodes and conductors. The electrode test setup was designed with a special skin dummy which allows investigation of not only the electrical properties of textile electrodes but also the contact behavior between electrode and skin. Using both test setups, eight textile electrodes and five textile conductors were analyzed and compared.

  7. Accuracy of standard craniometric measurements using multiple data formats.

    Science.gov (United States)

    Richard, Adam H; Parks, Connie L; Monson, Keith L

    2014-09-01

    With continuing advancements in biomedical imaging technologies, anthropologists are increasingly making use of data derived from indirect measurement and analysis of skeletal material. To that end, the purpose of this study was to test the reliability of 26 standard craniometric measurements routinely utilized in forensic casework across several different imaging technologies. Measurements from five crania of known individuals were collected in duplicate by two anthropologists via computed tomography (CT) scans and three-dimensional (3D) laser scans of the known skulls. The laser scans were also used to create prototype models of the known skulls. These prototypes were, themselves, laser-scanned, and measurements were also collected from the prototypes and the laser scans of the prototypes. Measurement sets from each technology were then compared with one another using the previously collected osteometric measurements taken on the crania themselves as the ground truth. Results indicate that, while the majority of measurements showed no significant differences across data formats, a handful were found to be problematic for particular technologies. For instance, measurements taken in a supero-inferior direction (e.g., BBH, OBH) from CT scans were prone to greater deviation from direct measurements of the cranium than other technologies, especially for CT scans taken at 5 mm thickness and increment. Also, several measurements defined by Type 1 landmarks, particularly those occurring at complicated or indistinct suture junctures (e.g., ASB, ZMB), were found to have high variance across all technologies, while measurements based on Type 3 landmarks proved to be highly reproducible. This is contrary to measurements taken directly on crania, in which measures defined by Type 1 landmarks are typically the most reliable, likely attributable to diminished or totally obscured suture definition in the scan data. If medical imaging data are to be increasingly utilized in

  8. Search for Non-Standard Model Behavior, including CP Violation, in Higgs Production and Decay to $ZZ^{*}$

    CERN Document Server

    Webster, Jordan

    This thesis presents an ATLAS study of the tensor structure characterizing the interaction of the scalar Higgs boson with two $Z$ bosons. The measurement is based on $25~\\textrm{fb}^{-1}$ of proton-proton collision data produced at the Large Hadron Collider with center of mass energy equal to 7 and $8~\\textrm{TeV}$. Nine discriminant observables in the four lepton final state are used to search for CP-odd and CP-even components in the Lagrangian beyond those of the Standard Model. These are represented by $(\\tilde{\\kappa}_{AVV}/\\kappa_{SM})\\tan\\alpha$ and $\\tilde{\\kappa}_{HVV}/\\kappa_{SM}$, respectively. Both of these parameters are found to be consistent with their predicted Standard Model values of zero. Values outside the intervals $-3.24<(\\tilde{\\kappa}_{AVV}/\\kappa_{SM})\\tan\\alpha<0.91$ and $-0.82<\\tilde{\\kappa}_{HVV}/\\kappa_{SM}<0.87$ are excluded at 95\\% confidence level. These results are combined with a search for the same structure in the interaction of the Higgs with pairs of $W$ boson...

  9. Radio Astronomers Set New Standard for Accurate Cosmic Distance Measurement

    Science.gov (United States)

    1999-06-01

    estimate of the age of the universe. In order to do this, you need an unambiguous, absolute distance to another galaxy. We are pleased that the NSF's VLBA has for the first time determined such a distance, and thus provided the calibration standard astronomers have always sought in their quest for accurate distances beyond the Milky Way," said Morris Aizenman, Executive Officer of the National Science Foundation's (NSF) Division of Astronomical Sciences. "For astronomers, this measurement is the golden meter stick in the glass case," Aizenman added. The international team of astronomers used the VLBA to measure directly the motion of gas orbiting what is generally agreed to be a supermassive black hole at the heart of NGC 4258. The orbiting gas forms a warped disk, nearly two light-years in diameter, surrounding the black hole. The gas in the disk includes water vapor, which, in parts of the disk, acts as a natural amplifier of microwave radio emission. The regions that amplify radio emission are called masers, and work in a manner similar to the way a laser amplifies light emission. Determining the distance to NGC 4258 required measuring motions of extremely small shifts in position of these masers as they rotate around the black hole. This is equivalent to measuring an angle one ten-thousandth the width of a human hair held at arm's length. "The VLBA is the only instrument in the world that could do this," said Moran. "This work is the culmination of a 20-year effort at the Harvard Smithsonian Center for Astrophysics to measure distances to cosmic masers," said Irwin Shapiro, Director of that institution. Collection of the data for the NGC 4258 project was begun in 1994 and was part of Herrnstein's Ph.D dissertation at Harvard University. Previous observations with the VLBA allowed the scientists to measure the speed at which the gas is orbiting the black hole, some 39 million times more massive than the Sun. 
They did this by observing the amount of change in the...
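
The geometric method described in this record divides the masers' known orbital speed by their measured angular proper motion to obtain a distance (D = v / μ). A minimal sketch, assuming illustrative round numbers of the order reported for NGC 4258 (about 1,000 km/s orbital speed and about 0.03 mas/yr proper motion; these are not the published values):

```python
import math

SECONDS_PER_YEAR = 3.156e7                    # approx. one year in seconds
KM_PER_MPC = 3.086e19                         # kilometres per megaparsec
RAD_PER_MAS = math.pi / (180 * 3600 * 1000)   # radians per milliarcsecond

def maser_distance_mpc(orbital_speed_km_s, proper_motion_mas_yr):
    """Geometric distance D = v / mu: the masers' orbital speed (from
    Doppler shifts) divided by their angular drift on the sky."""
    v_km_per_yr = orbital_speed_km_s * SECONDS_PER_YEAR
    mu_rad_per_yr = proper_motion_mas_yr * RAD_PER_MAS
    return (v_km_per_yr / mu_rad_per_yr) / KM_PER_MPC

# Illustrative inputs of the right order of magnitude for NGC 4258
print(round(maser_distance_mpc(1000.0, 0.0315), 2))  # distance in Mpc
```

With numbers of this size the result lands in the several-megaparsec range, consistent with NGC 4258 being a nearby galaxy.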

  10. Biopolitics and education. Measurement, standardization and regularisation of the population

    Directory of Open Access Journals (Sweden)

    Geo SAURA

    2015-12-01

    Full Text Available This paper analyzes standardized school testing as a dispositif of educational biopolitics. It offers a theoretical review of biopolitics, an analytics of power covering the economic agencies that legitimize global education policies, and a socio-technical mapping of teacher-control technologies. The dispositif of educational biopolitics legitimizes disciplinary logic in students, perpetuates dividing lines (normality/abnormality), and establishes the hegemony of standardization as regularization of population sets. Its effects on teachers are presented through in-depth interviews from empirical research, organized around two qualities: «comparability», whose characteristics are computing/centering and relation with the whole, and «dividuality», whose properties are competitiveness and subjection.

  11. Joe Celko's Data, Measurements and Standards in SQL

    CERN Document Server

    Celko, Joe

    2009-01-01

    Joe Celko has looked deep into the code of SQL programmers and found a consistent and troubling pattern - a frightening lack of consistency between their individual encoding schemes and those of the industries in which they operate. This translates into a series of incompatible databases, each one an island unto itself that is unable to share information with others in an age of internationalization and business interdependence. Such incompatibility severely hinders information flow and the quality of company data. Data, Measurements and Standards in SQL reveals the shift these programmers need...

  12. — study of the use of two standard- and non-standard-measuring devices

    Directory of Open Access Journals (Sweden)

    Paweł Ostapkowicz

    2014-03-01

    Full Text Available This paper deals with leak detection in liquid transmission pipelines, using a diagnostic method based on negative pressure wave detection. It focuses on a variant of this method that relies on only two measurement points (devices), placed at the inlet and outlet of the pipeline. Standard transducers were used to measure the pressure signals, while a non-standard technique developed by the author was used to measure new diagnostic signals. These new signals, conventionally named signals of weak interactions, result from the operation of special devices (correctors) attached to the pipeline. To compare the two hardware solutions, key performance attributes of the analyzed leak detection method were determined on the basis of experimental tests conducted on a physical model of a pipeline. The pipeline was 380 meters long, 34 mm in internal diameter and made of polyethylene (PEHD) pipes; the medium pumped through the pipeline was water. The diagnostic procedures used in this research were developed by the author and tested in these experiments. Keywords: technical diagnostics, pipelines, leak detection
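
The two-point negative-pressure-wave method locates a leak from the difference in wave arrival times at the inlet and outlet sensors: the wave travels x/c to the inlet and (L − x)/c to the outlet, so x = (L + c·Δt)/2. A minimal sketch, assuming the 380 m pipeline described above and a hypothetical wave speed of 400 m/s (the actual wave speed in a water-filled PEHD line must be measured or computed for the pipe in question):

```python
def leak_position_from_inlet(pipe_length_m, wave_speed_m_s, t_inlet_s, t_outlet_s):
    """Locate a leak from negative-pressure-wave arrival times.
    The wave reaches the inlet sensor after x/c and the outlet sensor
    after (L - x)/c, so t_inlet - t_outlet = (2x - L)/c and
    x = (L + c * (t_inlet - t_outlet)) / 2."""
    return (pipe_length_m + wave_speed_m_s * (t_inlet_s - t_outlet_s)) / 2.0

# Hypothetical arrival times for the 380 m test rig, assumed c = 400 m/s
L, c = 380.0, 400.0
x = leak_position_from_inlet(L, c, t_inlet_s=0.25, t_outlet_s=0.60)
print(round(x, 1))  # leak position in metres from the inlet
```

Only the arrival-time difference matters, which is why just two synchronized measurement points suffice for localization.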

  13. IntelliGO: a new vector-based semantic similarity measure including annotation origin

    Directory of Open Access Journals (Sweden)

    Devignes Marie-Dominique

    2010-12-01

    Full Text Available Abstract Background The Gene Ontology (GO is a well known controlled vocabulary describing the biological process, molecular function and cellular component aspects of gene annotation. It has become a widely used knowledge source in bioinformatics for annotating genes and measuring their semantic similarity. These measures generally involve the GO graph structure, the information content of GO aspects, or a combination of both. However, only a few of the semantic similarity measures described so far can handle GO annotations differently according to their origin (i.e. their evidence codes. Results We present here a new semantic similarity measure called IntelliGO which integrates several complementary properties in a novel vector space model. The coefficients associated with each GO term that annotates a given gene or protein include its information content as well as a customized value for each type of GO evidence code. The generalized cosine similarity measure, used for calculating the dot product between two vectors, has been rigorously adapted to the context of the GO graph. The IntelliGO similarity measure is tested on two benchmark datasets consisting of KEGG pathways and Pfam domains grouped as clans, considering the GO biological process and molecular function terms, respectively, for a total of 683 yeast and human genes and involving more than 67,900 pair-wise comparisons. The ability of the IntelliGO similarity measure to express the biological cohesion of sets of genes compares favourably to four existing similarity measures. For inter-set comparison, it consistently discriminates between distinct sets of genes. Furthermore, the IntelliGO similarity measure allows the influence of weights assigned to evidence codes to be checked. Finally, the results obtained with a complementary reference technique give intermediate but correct correlation values with the sequence similarity, Pfam, and Enzyme classifications when compared to
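
The core of a vector-space measure like this is a weighted cosine between per-gene term vectors, where each term's weight combines its information content with an evidence-code weight. The sketch below is a simplified stand-in, not the published IntelliGO measure: it uses a plain cosine (IntelliGO further adapts the dot product to the GO graph structure), and the evidence-code weights and IC values are hypothetical.

```python
import math

# Hypothetical per-evidence-code weights (IntelliGO assigns customized values)
EVIDENCE_WEIGHT = {"EXP": 1.0, "IDA": 1.0, "ISS": 0.8, "IEA": 0.4}

def annotation_vector(annotations, information_content):
    """Build a sparse term -> weight vector; weight = IC(term) * evidence weight.
    If a term is annotated with several codes, keep the strongest weight."""
    vec = {}
    for term, code in annotations:
        w = information_content[term] * EVIDENCE_WEIGHT.get(code, 0.5)
        vec[term] = max(vec.get(term, 0.0), w)
    return vec

def cosine_similarity(u, v):
    """Plain cosine over sparse dict vectors (a simplification: the published
    measure generalizes the dot product over the GO graph)."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical IC values and annotations for two genes
ic = {"GO:0006915": 5.2, "GO:0008283": 4.1, "GO:0016049": 3.7}
gene_a = annotation_vector([("GO:0006915", "EXP"), ("GO:0008283", "IEA")], ic)
gene_b = annotation_vector([("GO:0006915", "ISS"), ("GO:0016049", "IDA")], ic)
print(round(cosine_similarity(gene_a, gene_b), 3))
```

Shared, information-rich, experimentally supported terms dominate the score, which is the behavior the evidence-code weighting is designed to produce.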

  14. Including health insurance in poverty measurement: The impact of Massachusetts health reform on poverty.

    Science.gov (United States)

    Korenman, Sanders D; Remler, Dahlia K

    2016-12-01

    We develop and implement what we believe is the first conceptually valid health-inclusive poverty measure (HIPM) - a measure that includes health care or insurance in the poverty needs threshold and health insurance benefits in family resources - and we discuss its limitations. Building on the Census Bureau's Supplemental Poverty Measure, we construct a pilot HIPM for the under-65 population under ACA-like health reform in Massachusetts. This pilot demonstrates the practicality, face validity and value of a HIPM. Results suggest that public health insurance benefits and premium subsidies accounted for a substantial, one-third reduction in the health inclusive poverty rate. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Standard guide for making quality nondestructive assay measurements

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

    1.1 This guide is a compendium of Quality Measurement Practices for performing measurements of radioactive material using nondestructive assay (NDA) instruments. The primary purpose of the guide is to assist users in arriving at quality NDA results, that is, results that satisfy the end user’s needs. This is accomplished by providing an acceptable and uniform basis for the collection, analysis, comparison, and application of data. The recommendations are not compulsory or prerequisites to achieving quality NDA measurements, but are considered contributory in most areas. 1.2 This guide applies to the use of NDA instrumentation for the measurement of nuclear materials by the observation of spontaneous or stimulated nuclear radiations, including photons, neutrons, or the flow of heat. Recommended calibration, operating, and assurance methods represent guiding principles based on current NDA technology. The diversity of industry-wide nuclear materials measurement applications and instrumentation precludes disc...

  16. Cryogenic flow rate measurement with a laser Doppler velocimetry standard

    Science.gov (United States)

    Maury, R.; Strzelecki, A.; Auclercq, C.; Lehot, Y.; Loubat, S.; Chevalier, J.; Ben Rayana, F.

    2018-03-01

    A very promising alternative to the state-of-the-art static volume measurements for liquefied natural gas (LNG) custody transfer processes is the dynamic principle of flow metering. As the Designated Institute (DI) of the LNE (‘Laboratoire National de métrologie et d’Essais’, being the French National Metrology Institute) for high-pressure gas flow metering, Cesame-Exadebit is involved in various research and development programs. Within the framework of the first (2010-2013) and second (2014-2017) EURAMET Joint Research Project (JRP), named ‘Metrological support for LNG custody transfer and transport fuel applications’, Cesame-Exadebit explored a novel cryogenic flow metering technology using laser Doppler velocimetry (LDV) as an alternative to ultrasonic and Coriolis flow metering. Cesame-Exadebit is trying to develop this technique as a primary standard for cryogenic flow meters. Currently, cryogenic flow meters are calibrated at ambient temperatures with water. Results are then extrapolated to be in the Reynolds number range of real applications. The LDV standard offers a unique capability to perform online calibration of cryogenic flow meters in real conditions (temperature, pressure, piping and real flow disturbances). The primary reference has been tested on an industrial process in an LNG terminal during truck refuelling. The reference can calibrate Coriolis flow meters being used daily with all the real environmental constraints, and its utilisation is transparent for LNG terminal operators. The standard is traceable to the International System of Units (SI), and the combined expanded uncertainties have been determined and estimated to be lower than 0.6% (work is ongoing to reduce the correlation-function uncertainty, which has a major impact on the uncertainty estimation).

  17. Spectral interferometry including the effect of transparent thin films to measure distances and displacements

    International Nuclear Information System (INIS)

    Hlubina, P.

    2004-01-01

    A spectral-domain interferometric technique is applied for measuring mirror distances and displacements in a dispersive Michelson interferometer when the effect of transparent thin films coated onto the interferometer beam splitter and compensator is known. We employ a low-resolution spectrometer in two experiments with different amounts of dispersion in a Michelson interferometer that includes a fused-silica optical sample. Knowing the thickness of the optical sample and the nonlinear phase function of the thin films, the positions of the interferometer mirror are determined precisely by a least-squares fitting of the theoretical spectral interferograms to the recorded ones. We compare the results of processing that includes and does not include the effect of the transparent thin films. (Author)

  18. [Measurement of CO diffusion capacity (II): Standardization and quality criteria].

    Science.gov (United States)

    Salcedo Posadas, A; Villa Asensi, J R; de Mir Messa, I; Sardón Prado, O; Larramona, H

    2015-08-01

    The diffusion capacity is the technique that measures the ability of the respiratory system for gas exchange, thus allowing a diagnosis of the malfunction of the alveolar-capillary unit. The most important parameter to assess is the CO diffusion capacity (DLCO). New methods are currently being used to measure diffusion using nitric oxide (NO). There are other methods for measuring diffusion, although this article mainly refers to the single-breath technique, as it is the most widely used and best standardized. Its complexity, its reference equations, differences in equipment, inter-patient variability and the conditions in which the DLCO is performed lead to wide inter-laboratory variability, although its standardization makes this a more reliable and reproducible method. The practical aspects of the technique are analyzed, specifying the recommendations to carry out a suitable procedure, the calibration routine, calculations and adjustments. Clinical applications are also discussed. An increase in the transfer of CO occurs in diseases in which there is an increased volume of blood in the pulmonary capillaries, such as polycythemia and pulmonary hemorrhage. There is a decrease in DLCO in patients with alveolar volume reduction or diffusion defects, caused either by an altered alveolar-capillary membrane (interstitial diseases) or by a decreased volume of blood in the pulmonary capillaries (pulmonary embolism or primary pulmonary hypertension). Other causes of decreased or increased DLCO are also highlighted. Copyright © 2014 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.
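
For orientation, the single-breath DLCO reduces to the textbook Krogh relation between alveolar volume, breath-hold time, and the fall in the alveolar CO fraction. A minimal sketch with illustrative (not normative) adult values; laboratory implementations add the corrections and acceptability criteria the standardization documents require.

```python
import math

def dlco_single_breath(va_ml, breath_hold_s, pb_mmhg, faco_initial, faco_final):
    """Textbook single-breath (Krogh) relation:
    DLCO = VA * 60 / ((PB - 47) * t) * ln(FACO_0 / FACO_t)
    VA: alveolar volume (mL), t: breath-hold time (s), PB: barometric
    pressure (mmHg); 47 mmHg is water-vapour pressure at 37 degrees C.
    Result in mL/min/mmHg."""
    return (va_ml * 60.0 / ((pb_mmhg - 47.0) * breath_hold_s)
            * math.log(faco_initial / faco_final))

# Illustrative values: VA = 5000 mL, 10 s breath hold, sea-level pressure,
# alveolar CO fraction falling from 0.30% to 0.15%
print(round(dlco_single_breath(5000, 10.0, 760.0, 0.003, 0.0015), 1))
```

Numbers in this range are of the order of a normal adult DLCO, which is why the technique is sensitive to small measurement and timing errors.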

  19. Prenatal Triclosan Exposure and Anthropometric Measures Including Anogenital Distance in Danish Infants

    DEFF Research Database (Denmark)

    Lassen, Tina Harmer; Frederiksen, Hanne; Kyhl, Henriette Boye

    2016-01-01

    BACKGROUND: Triclosan (TCS) is widely used as an antibacterial agent in consumer products such as hand soap and toothpaste, and human exposure is widespread. TCS is suspected of having endocrine-disrupting properties, but few human studies have examined the developmental effects of prenatal TCS e......, Swan SH, Main KM, Andersson AM, Lind DV, Husby S, Wohlfahrt-Veje C, Skakkebæk NE, Jensen TK. 2016. Prenatal triclosan exposure and anthropometric measures including anogenital distance in Danish infants. Environ Health Perspect 124:1261-1268; http://dx.doi.org/10.1289/ehp.1409637....

  20. Commentary on guidelines for radiation measurement and treatment of substances including naturally occurring radioactive materials

    International Nuclear Information System (INIS)

    Sakurai, Naoyuki; Ishiguro, Hideharu

    2007-01-01

    Study group on safety regulation on research reactors in Ministry of Education, Culture, Sports, Science and Technology (MEXT) reported the guidelines of 'Guidelines on radiation measurement and treatment of naturally occurring radioactive materials (NORM)' on 6 February 2006. RANDEC made the website contents 'Study on use and safety of the substances including uranium or thorium', based on the contract with MEXT to make theirs contents. This paper describes the outline of the website in MEXT homepage, background and contents of NORM guidelines in order to understand easily and visually the NORM guidelines, adding in some flowcharts and figures. (author)

  1. Evaluation of measurement reproducibility using the standard-sites data, 1994 Fernald field characterization demonstration project

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1996-02-01

    The US Department of Energy conducted the 1994 Fernald (Ohio) field characterization demonstration project to evaluate the performance of a group of both industry-standard and proposed alternative technologies in describing the nature and extent of uranium contamination in surficial soils. Detector stability and measurement reproducibility under actual operating conditions encountered in the field is critical to establishing the credibility of the proposed alternative characterization methods. Comparability of measured uranium activities to those reported by conventional, US Environmental Protection Agency (EPA)-certified laboratory methods is also required. The eleven (11) technologies demonstrated included (1) EPA-standard soil sampling and laboratory mass-spectroscopy analyses, and currently-accepted field-screening techniques using (2) sodium-iodide scintillometers, (3) FIDLER low-energy scintillometers, and (4) a field-portable x-ray fluorescence spectrometer. Proposed advanced characterization techniques included (5) alpha-track detectors, (6) a high-energy beta scintillometer, (7) electret ionization chambers, (8) and (9) a high-resolution gamma-ray spectrometer in two different configurations, (10) a field-adapted laser ablation-inductively coupled plasma-atomic emission spectroscopy (ICP-AES) technique, and (11) a long-range alpha detector. Measurement reproducibility and the accuracy of each method were tested by acquiring numerous replicate measurements of total uranium activity at each of two "standard sites" located within the main field demonstration area. Meteorological variables including temperature, relative humidity, and 24-hour rainfall quantities were also recorded in conjunction with the standard-sites measurements.

  2. Standard test method for measurement of web/roller friction characteristics

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2003-01-01

    1.1 This test method covers the simulation of a roller/web transport tribosystem and the measurement of the static and kinetic coefficient of friction of the web/roller couple when sliding occurs between the two. The objective of this test method is to provide users with web/roller friction information that can be used for process control, design calculations, and for any other function where web/roller friction needs to be known. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  3. Standard test method for measurement of corrosion potentials of Aluminum alloys

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This test method covers a procedure for measurement of the corrosion potential (see Note 1) of an aluminum alloy in an aqueous solution of sodium chloride with enough hydrogen peroxide added to provide an ample supply of cathodic reactant. Note 1—The corrosion potential is sometimes referred to as the open-circuit solution or rest potential. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  4. Standard Test Method for Measuring Heat-Transfer Rate Using a Thermal Capacitance (Slug) Calorimeter

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method describes the measurement of heat transfer rate using a thermal capacitance-type calorimeter which assumes one-dimensional heat conduction into a cylindrical piece of material (slug) with known physical properties. 1.2 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use. 1.3 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. Note 1—For information see Test Methods E 285, E 422, E 458, E 459, and E 511.
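
The slug-calorimeter data reduction described here rests on a one-line energy balance: all heat absorbed by the front face raises the slug temperature, so the heat flux follows from the temperature slope. A minimal sketch with hypothetical slug properties:

```python
def slug_heat_flux(mass_kg, specific_heat_j_kg_k, face_area_m2, dT_dt_k_s):
    """One-dimensional slug-calorimeter relation: absorbed heat raises the
    slug temperature, so q = (m * cp / A) * dT/dt, in W/m^2."""
    return mass_kg * specific_heat_j_kg_k / face_area_m2 * dT_dt_k_s

# Hypothetical copper slug: 50 g, cp ~ 385 J/(kg K), 1 cm^2 exposed face,
# temperature rising at 20 K/s during the exposure
q = slug_heat_flux(0.050, 385.0, 1.0e-4, 20.0)
print(q)  # heat flux in W/m^2
```

In practice dT/dt is taken from the linear portion of the temperature trace, after the initial transient and before side losses become significant.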

  5. Technical support document: Energy conservation standards for consumer products: Dishwashers, clothes washers, and clothes dryers including: Environmental impacts; regulatory impact analysis

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The Energy Policy and Conservation Act, as amended (P.L. 94-163), establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. This Technical Support Document presents the methodology, data and results from the analysis of the energy and economic impacts of standards on dishwashers, clothes washers, and clothes dryers. The economic impact analysis is performed in several major areas: An Engineering Analysis, which establishes technical feasibility and product attributes including costs of design options to improve appliance efficiency. A Consumer Analysis at two levels: national aggregate impacts, and impacts on individuals. The national aggregate impacts include forecasts of appliance sales, efficiencies, energy use, and consumer expenditures. The individual impacts are analyzed by Life-Cycle Cost (LCC), Payback Periods, and Cost of Conserved Energy (CCE), which evaluate the savings in operating expenses relative to increases in purchase price. A Manufacturer Analysis, which provides an estimate of manufacturers' response to the proposed standards. Their response is quantified by changes in several measures of financial performance for a firm. An Industry Impact Analysis shows financial and competitive impacts on the appliance industry. A Utility Analysis that measures the impacts of the altered energy-consumption patterns on electric utilities. An Environmental Effects analysis, which estimates changes in emissions of carbon dioxide, sulfur oxides, and nitrogen oxides, due to reduced energy consumption in the home and at the power plant. A Regulatory Impact Analysis collects the results of all the analyses into the net benefits and costs from a national perspective. 47 figs., 171 tabs. (JF)
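
The individual-impact metrics named above (payback period and cost of conserved energy) are simple formulas. A minimal sketch with hypothetical appliance numbers; the CRF-based CCE definition follows the usual energy-analysis convention and is not necessarily this document's exact method.

```python
def simple_payback_years(price_increase, annual_savings):
    """Years for operating-cost savings to repay the higher purchase price."""
    return price_increase / annual_savings

def cost_of_conserved_energy(price_increase, annual_kwh_saved,
                             discount_rate, lifetime_years):
    """CCE = extra investment annualized by the capital recovery factor,
    divided by the annual energy saved, in $/kWh:
    CRF = d / (1 - (1 + d)**-n)."""
    d, n = discount_rate, lifetime_years
    crf = d / (1.0 - (1.0 + d) ** -n)
    return price_increase * crf / annual_kwh_saved

# Hypothetical clothes washer: a $60 price increase saves 250 kWh/yr;
# 7% discount rate, 14-year product life, electricity at $0.10/kWh
print(round(simple_payback_years(60.0, 250 * 0.10), 1))        # years
print(round(cost_of_conserved_energy(60.0, 250.0, 0.07, 14), 3))  # $/kWh
```

A design option is attractive when its CCE falls below the price of the energy it displaces.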

  6. Three-dimensional hindfoot alignment measurements based on biplanar radiographs: comparison with standard radiographic measurements

    Energy Technology Data Exchange (ETDEWEB)

    Sutter, Reto; Pfirrmann, Christian W.A.; Buck, Florian M. [University Hospital Balgrist, Department of Radiology, Zurich (Switzerland); University of Zurich, Zurich (Switzerland); Espinosa, Norman [University Hospital Balgrist, Department of Orthopedic Surgery, Zurich (Switzerland); University of Zurich, Zurich (Switzerland)

    2013-04-15

    To establish a hindfoot alignment measurement technique based on low-dose biplanar radiographs and compare with hindfoot alignment measurements on long axial view radiographs, which is the current reference standard. Long axial view radiographs and low-dose biplanar radiographs of a phantom consisting of a human foot skeleton embedded in acrylic glass (phantom A) and a plastic model of a human foot in three different hindfoot positions (phantoms B1-B3) were imaged in different foot positions (20° internal to 20° external rotation). Two independent readers measured hindfoot alignment on long axial view radiographs and performed 3D hindfoot alignment measurements based on biplanar radiographs on two different occasions. Time for three-dimensional (3D) measurements was determined. Intraclass correlation coefficients (ICC) were calculated. Hindfoot alignment measurements on long axial view radiographs were characterized by a large positional variation, with a range of 14°/13° valgus to 22°/27° varus (reader 1/2 for phantom A), whereas the range of 3D hindfoot alignment measurements was 7.3°/6.0° to 9.0°/10.5° varus (reader 1/2 for phantom A), with a mean and standard deviation of 8.1° ± 0.6°/8.7° ± 1.4°, respectively. Interobserver agreement was high (ICC = 0.926 for phantom A, and ICC = 0.886 for phantoms B1-B3), and agreement between different readouts was high (ICC = 0.895-0.995 for reader 1, and ICC = 0.987-0.994 for reader 2) for 3D measurements. Mean duration of 3D measurements was 84 ± 15/113 ± 15 s for reader 1/2. Three-dimensional hindfoot alignment measurements based on biplanar radiographs were independent of foot positioning during image acquisition and reader independent. In this phantom study, the 3D measurements were substantially more precise than the standard radiographic measurements. (orig.)

  7. Standardized fluoroscopy-based technique to measure intraoperative cup anteversion.

    Science.gov (United States)

    Zingg, Matthieu; Boudabbous, Sana; Hannouche, Didier; Montet, Xavier; Boettner, Friedrich

    2017-10-01

    Direct anterior approach (DAA) with the patient lying supine has facilitated the use of intraoperative fluoroscopy and allows for standardized positioning of the patient. The current study presents a new technique to measure acetabular component anteversion using intraoperative fluoroscopy. The current paper describes a mathematical formula to calculate true acetabular component anteversion based on the acetabular component abduction angle and the c-arm tilt angle (CaT). The CaT is determined by tilting the c-arm until an external pelvic oblique radiograph with the equatorial plane of the acetabular component perpendicular to the fluoroscopy receptor is obtained; the CaT is then read directly from the c-arm device. The technique was validated using a radiopaque synbone model, comparing the described technique to computed tomography anteversion measurement. The experiment was repeated 25 times. The difference in anteversion between the two measuring techniques was on average 0.2° (range -3.0° to 3.1°). The linear regression coefficients evaluating the agreement between the experimental and control methods were 0.99 (95%CI 0.88-1.10, p < 0.001) and 0.33 (95%CI -1.53-2.20, p = 0.713) for the slope and intercept, respectively. The current study confirms that the described three-step c-arm acetabular cup measuring technique can reproducibly and reliably assess acetabular component anteversion in the supine position, as compared to CT-imaging. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 35:2307-2312, 2017.

  8. Multicenter Evaluation of Cystatin C Measurement after Assay Standardization.

    Science.gov (United States)

    Bargnoux, Anne-Sophie; Piéroni, Laurence; Cristol, Jean-Paul; Kuster, Nils; Delanaye, Pierre; Carlier, Marie-Christine; Fellahi, Soraya; Boutten, Anne; Lombard, Christine; González-Antuña, Ana; Delatour, Vincent; Cavalier, Etienne

    2017-04-01

    Since 2010, a certified reference material ERM-DA471/IFCC has been available for cystatin C (CysC). This study aimed to assess the sources of uncertainty in results for clinical samples measured using standardized assays. This evaluation was performed in 2015 and involved 7 clinical laboratories located in France and Belgium. CysC was measured in a panel of 4 serum pools using 8 automated assays and a candidate isotope dilution mass spectrometry reference measurement procedure. Sources of uncertainty (imprecision and bias) were evaluated to calculate the relative expanded combined uncertainty for each CysC assay. Uncertainty was judged against the performance specifications derived from the biological variation model. Only Siemens reagents on the Siemens systems and, to a lesser extent, DiaSys reagents on the Cobas system, provided results that met the minimum performance criterion calculated according to the intraindividual and interindividual biological variations. Although the imprecision was acceptable for almost all assays, an increase in the bias with concentration was observed for Gentian reagents, and unacceptably high biases were observed for Abbott and Roche reagents on their own systems. This comprehensive picture of the market situation since the release of ERM-DA471/IFCC shows that bias remains the major component of the combined uncertainty because of possible problems associated with the implementation of traceability. Although some manufacturers have clearly improved their calibration protocols relative to ERM-DA471, most of them failed to meet the criteria for acceptable CysC measurements. © 2016 American Association for Clinical Chemistry.
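
The relative expanded combined uncertainty referred to here is conventionally the root-sum-of-squares of the bias and imprecision components, scaled by a coverage factor (k = 2 for roughly 95% coverage). A minimal GUM-style sketch with hypothetical assay figures, not the study's actual uncertainty budget:

```python
import math

def expanded_combined_uncertainty(bias_pct, imprecision_pct, coverage_k=2.0):
    """Relative expanded combined uncertainty: root-sum-of-squares of the
    bias and imprecision components, times the coverage factor k."""
    return coverage_k * math.sqrt(bias_pct ** 2 + imprecision_pct ** 2)

# Hypothetical assay: 2.0% bias vs. the reference procedure, 1.5% CV
u = expanded_combined_uncertainty(2.0, 1.5)
print(round(u, 2))  # expanded uncertainty, %
```

Because the components add in quadrature, a large bias dominates the total even when imprecision is acceptable, which matches the study's finding that bias is the main contributor.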

  9. Standard practice for calculation of corrosion rates and related information from electrochemical measurements

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1989-01-01

    1.1 This practice covers the providing of guidance in converting the results of electrochemical measurements to rates of uniform corrosion. Calculation methods for converting corrosion current density values to either mass loss rates or average penetration rates are given for most engineering alloys. In addition, some guidelines for converting polarization resistance values to corrosion rates are provided. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.
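
The conversion this practice covers is Faraday's law applied to the measured corrosion current density; for penetration rate the commonly tabulated form is CR = 3.27×10⁻³ · i_corr · EW / ρ, giving mm/yr when i_corr is in µA/cm². A minimal sketch using representative mild-steel constants (illustrative values, not taken from the standard's tables):

```python
def corrosion_rate_mm_per_year(i_corr_uA_cm2, equivalent_weight_g, density_g_cm3):
    """Faraday's-law conversion of corrosion current density to a uniform
    penetration rate: CR [mm/yr] = 3.27e-3 * i_corr * EW / rho,
    with i_corr in uA/cm^2, EW in g/equivalent, rho in g/cm^3."""
    return 3.27e-3 * i_corr_uA_cm2 * equivalent_weight_g / density_g_cm3

# Representative mild steel: EW ~ 27.92 g/equiv, rho ~ 7.87 g/cm^3,
# with a measured corrosion current density of 10 uA/cm^2
cr = corrosion_rate_mm_per_year(10.0, 27.92, 7.87)
print(round(cr, 3))  # mm/yr
```

As a sanity check, 1 µA/cm² on steel corresponds to roughly 0.012 mm/yr, so 10 µA/cm² lands near 0.12 mm/yr.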

  10. Standard Test Method for Measuring Binocular Disparity in Transparent Parts

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

    1.1 This test method covers the amount of binocular disparity that is induced by transparent parts such as aircraft windscreens, canopies, HUD combining glasses, visors, or goggles. This test method may be applied to parts of any size, shape, or thickness, individually or in combination, so as to determine the contribution of each transparent part to the overall binocular disparity present in the total “viewing system” being used by a human operator. 1.2 This test method represents one of several techniques that are available for measuring binocular disparity, but is the only technique that yields a quantitative figure of merit that can be related to operator visual performance. 1.3 This test method employs apparatus currently being used in the measurement of optical angular deviation under Method F 801. 1.4 The values stated in inch-pound units are to be regarded as standard. The values given in parentheses are mathematical conversions to SI units that are provided for information only and are not con...

  11. Constraints on inflation revisited. An analysis including the latest local measurement of the Hubble constant

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Rui-Yun [Northeastern University, Department of Physics, College of Sciences, Shenyang (China); Zhang, Xin [Northeastern University, Department of Physics, College of Sciences, Shenyang (China); Peking University, Center for High Energy Physics, Beijing (China)

    2017-12-15

    We revisit the constraints on inflation models by using the current cosmological observations involving the latest local measurement of the Hubble constant (H{sub 0} = 73.00 ± 1.75 km s{sup -1} Mpc{sup -1}). We constrain the primordial power spectra of both scalar and tensor perturbations with the observational data including the Planck 2015 CMB full data, the BICEP2 and Keck Array CMB B-mode data, the BAO data, and the direct measurement of H{sub 0}. In order to relieve the tension between the local determination of the Hubble constant and the other astrophysical observations, we consider the additional parameter N{sub eff} in the cosmological model. We find that, for the ΛCDM+r+N{sub eff} model, the scale invariance is only excluded at the 3.3σ level, and ΔN{sub eff} > 0 is favored at the 1.6σ level. Comparing the obtained 1σ and 2σ contours of (n{sub s},r) with the theoretical predictions of selected inflation models, we find that both the convex and the concave potentials are favored at 2σ level, the natural inflation model is excluded at more than 2σ level, the Starobinsky R{sup 2} inflation model is only favored at around 2σ level, and the spontaneously broken SUSY inflation model is now the most favored model. (orig.)

  12. Measuring and diagnosing unilateral neglect: a standardized statistical procedure.

    Science.gov (United States)

    Toraldo, Alessio; Romaniello, Cristian; Sommaruga, Paolo

    Unilateral neglect is usually investigated by administering stimuli (targets) in different positions, with targets being responded to by the patient (Hit) or omitted. In spite of this homogeneity of data type, neglect indices and diagnostic criteria vary considerably, causing inconsistencies in both clinical and experimental settings. We aimed at deriving a standard analysis which would apply to all tasks sharing this data form. A priori theoretical reasoning demonstrated that the mean position of Hits in space (MPH) is an optimal index for correctly diagnosing and quantifying neglect. Crucially, MPH eliminates the confounding effects of deficits that are different from neglect (non-lateral) but which decrease Hit rate. We ran a Monte Carlo study to assess MPH's (so far overlooked) statistical behavior as a function of numbers of targets and Hits. While average MPH was indeed insensitive to non-lateral deficits, MPH's variance (like that of all other neglect indices) increased dramatically with increasing non-lateral deficits. This instability would lead to alarmingly high false-positive rates (FPRs) when applying a classical diagnostic procedure that compares one patient with a control sample. We solved the problem by developing an equation that takes into account MPH instability and provides correct cut-offs and close-to-nominal FPRs, even without control subjects. We developed a computerized program which, given the raw data, yields the MPH, a z-score and a p-value. We provided a standard method that allows clinical and experimental neuropsychologists to diagnose and measure neglect in a consistent way across the vast majority of tasks.
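As a minimal illustration of the MPH index described above (the data are invented; the authors' variance-corrected cut-off equation, z-score, and p-value are not reproduced here):

```python
# Hypothetical cancellation-task data: horizontal target positions in cm
# (negative = left of the midline) and whether each target was hit.
positions = [-8.0, -6.0, -4.0, 0.0, 2.0, 4.0, 6.0, 8.0]
hit       = [False, False, False, True, True, True, True, True]

# Mean Position of Hits (MPH): the mean location of the targets the
# patient responded to. Left-sided omissions pull MPH to the right,
# while non-lateral deficits (which remove Hits evenly) do not shift it.
hits = [p for p, h in zip(positions, hit) if h]
mph = sum(hits) / len(hits)
```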

  13. Breast composition measurements using retrospective standard mammogram form (SMF)

    International Nuclear Information System (INIS)

    Highnam, R; Pan, X; Warren, R; Jeffreys, M; Smith, G Davey; Brady, M

    2006-01-01

    The standard mammogram form (SMF) representation of an x-ray mammogram is a standardized, quantitative representation of the breast from which the volume of non-fat tissue and breast density can be easily estimated, both of which are of significant interest in determining breast cancer risk. Previous theoretical analysis of SMF had suggested that a complete and substantial set of calibration data (such as mAs and kVp) would be needed to generate realistic breast composition measures, and yet there are many interesting trials that have retrospectively collected images with no calibration data. The main contribution of this paper is to revisit our previous theoretical analysis of SMF with respect to errors in the calibration data and to show how and why that theoretical analysis did not match the results from the practical implementations of SMF. In particular, we show how, by estimating breast thickness for every image, we are effectively compensating for any errors in the calibration data. To illustrate our findings, the current implementation of SMF (version 2.2β) was run over 4028 digitized film-screen mammograms taken from six sites over the years 1988-2002, with and without using the known calibration data. Results show that the SMF implementation running without any calibration data at all generates results which display a strong relationship with those obtained when running with a complete set of calibration data and, most importantly, with an expert's visual assessment of breast composition using established techniques. SMF shows considerable promise in being of major use in large epidemiological studies related to breast cancer which require the automated analysis of large numbers of films from many years previously, where little or no calibration data is available.

  14. Future global manpower shortages in nuclear industries with special reference to india including remedial measures

    International Nuclear Information System (INIS)

    Ghosh Hazra, G.S.

    2008-01-01

    -2050. The service sector in India accounts for about 50% of GDP; it will continue to grow and will provide more, and better paid, jobs than the core industries, and the continued shift of employment choices towards the service sector will create a deep gap in the manpower available to basic and core industries. There are reports that some countries may have to abandon future projects because of the non-availability of skilled manpower in core industries. The installed capacity of nuclear power in India in the year 2052 will be about 200 GWe, a manifold increase from the present level of about 4 GWe. This will need an estimated 130,000 skilled persons, up from the present roughly 12,000 persons in the nuclear industries. Moreover, the need for competent persons in nuclear industries, because of the high safety requirements of nuclear installations, will further add to the problem. The following short-term strategies to retain and attract new employees in nuclear industries may be envisaged amongst others: - Recruit employees prior to the departure of experienced technical staff to facilitate knowledge transfer in time. - Increase compensation and the number of higher level positions. - Increase permanent entry-level intake of skilled manpower taking into account the historical turn-over rate. - Implement attractive student loan repayment programs by tying up with banks and financial institutions. - Implement well researched strategies and measures, including reassessing the practical capacity which nations including India can achieve in power generation in future, taking into account the practical aspects of manpower shortage. - Implement advanced technology which requires less manpower. - Implement higher levels of automation in nuclear industries. The paper aims to highlight the acute problems of future manpower shortages in nuclear industries globally, with special reference to India, and discusses some remedial measures which may be taken to address the issue. (author)

  15. Performance Measurement Implementation Of Minimum Service Standards For Basic Education Based On The Balanced Scorecard

    Directory of Open Access Journals (Sweden)

    Budiman Rusli

    2015-08-01

    Full Text Available The policy of Minimum Service Standards for Basic Education has been rolled out since 2002 by the minister, in accordance with Decree No. 129a/U/2004 on Minimum Service Standards in Education; it has been continually updated, most recently in Regulation of the Minister of Education and Culture No. 23 of 2013. Every district and town government was to reach 100 per cent achievement on each of the indicators listed in the minimum service standards by the end of 2014. Achievement on each indicator is just one measure of the performance of a local government's education department. Unfortunately, for the 27 announced indicators, almost all local governments, including that of Tangerang Regency, did not reach the targets. This makes it necessary to measure the performance of local authorities, particularly their education departments. One modern performance-measurement approach is the Balanced Scorecard (BSC). The Balanced Scorecard is a contemporary management tool that measures company performance comprehensively, covering not only the financial perspective but also non-financial perspectives such as the Customer Perspective, Internal Business Processes, and Learning and Growth. The approach is well suited to multinational companies; although expensive, it can measure company performance while combining long-term and short-term strategy. The Balanced Scorecard can also be used to measure the performance of public-sector services by modifying a few things, including for the Performance Measurement of Minimum Service Standards for Basic Education.

  16. Protocol of the COSMIN study: COnsensus-based Standards for the selection of health Measurement INstruments

    Directory of Open Access Journals (Sweden)

    Patrick DL

    2006-01-01

    Full Text Available Abstract Background Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and also no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health care related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist. Method An international Delphi study will be performed to reach consensus on which and how measurement properties should be assessed, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology. 
The final checklist will

  17. Quality requirements for vegetables and fruit products in the European Union : training manual, product quality standards including UN-ECE quality standards for unions

    NARCIS (Netherlands)

    Voort, van der M.P.J.; Baricicova, V.; Dandar, M.; Grzegorzewska, M.; Schoorlemmer, H.B.; Szabo, C.; Zmarlicji, K.

    2007-01-01

    This training manual is part of the pilot on agricultural quality standards. The objective of this pilot is the development and testing of a training course on quality requirements. The training manual informs growers and trainers on the basic quality requirements and the relationship of these

  18. Estimations of isoprenoid emission capacity from enclosure studies: measurements, data processing, quality and standardized measurement protocols

    Science.gov (United States)

    Niinemets, Ü.; Kuhn, U.; Harley, P. C.; Staudt, M.; Arneth, A.; Cescatti, A.; Ciccioli, P.; Copolovici, L.; Geron, C.; Guenther, A.; Kesselmeier, J.; Lerdau, M. T.; Monson, R. K.; Peñuelas, J.

    2011-08-01

    The capacity for volatile isoprenoid production under standardized environmental conditions at a certain time (ES, the emission factor) is a key characteristic in constructing isoprenoid emission inventories. However, there is large variation in published ES estimates for any given species partly driven by dynamic modifications in ES due to acclimation and stress responses. Here we review additional sources of variation in ES estimates that are due to measurement and analytical techniques and calculation and averaging procedures, and demonstrate that estimations of ES critically depend on applied experimental protocols and on data processing and reporting. A great variety of experimental setups has been used in the past, contributing to study-to-study variations in ES estimates. We suggest that past experimental data should be distributed into broad quality classes depending on whether the data can or cannot be considered quantitative based on rigorous experimental standards. Apart from analytical issues, the accuracy of ES values is strongly driven by extrapolation and integration errors introduced during data processing. Additional sources of error, especially in meta-database construction, can further arise from inconsistent use of units and expression bases of ES. We propose a standardized experimental protocol for BVOC estimations and highlight basic meta-information that we strongly recommend to report with any ES measurement. We conclude that standardization of experimental and calculation protocols and critical examination of past reports is essential for development of accurate emission factor databases.

  19. Normal standards for kidney length as measured with US in premature infants

    International Nuclear Information System (INIS)

    Schlesinger, A.E.; Hedlund, G.L.; Pierson, W.P.; Null, D.M.

    1986-01-01

    In order to develop normal standards for kidney length in premature infants, the authors measured kidney length by US imaging in 39 (to date) premature infants less than 72 hours old and without known renal disease. Kidney length was compared with four different parameters of body size, including gestational age, birth weight, birth length, and body surface area. Similar standards have been generated previously for normal renal length as measured by US imaging in full-term infants and older children. These standards have proven utility in cases of congenital and acquired disorders that abnormally increase or decrease renal size. Scatter plots of kidney length versus body weight and kidney length versus body surface area conformed well to a logarithmic distribution, with a high correlation coefficient and close-fitting 95% confidence limits (SEE = 2.05)
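A logarithmic standard of the kind described can be sketched as follows (all numbers are invented for illustration; the study's actual coefficients are not reproduced):

```python
import math

# Hypothetical (body weight kg, kidney length mm) pairs, fitted to the
# logarithmic form L = a + b*ln(W) by ordinary least squares, as in the
# scatter plots the abstract describes.
data = [(1.0, 40.0), (1.5, 43.0), (2.0, 45.0), (2.5, 46.5), (3.0, 48.0)]

x = [math.log(w) for w, _ in data]
y = [length for _, length in data]
n = len(data)
xbar, ybar = sum(x) / n, sum(y) / n
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
    / sum((xi - xbar) ** 2 for xi in x)
a = ybar - b * xbar

def predicted_length(weight_kg):
    """Predicted kidney length (mm) for a given body weight (kg)."""
    return a + b * math.log(weight_kg)
```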

  20. Slit-scanning technique using standard cell sorter instruments for analyzing and sorting nonacrocentric human chromosomes, including small ones

    NARCIS (Netherlands)

    Rens, W.; van Oven, C. H.; Stap, J.; Jakobs, M. E.; Aten, J. A.

    1994-01-01

    We have investigated the performance of two types of standard flow cell sorter instruments, a System 50 Cytofluorograph and a FACSTar PLUS cell sorter, for the on-line centromeric index (CI) analysis of human chromosomes. To optimize the results, we improved the detection efficiency for centromeres

  1. Pitfalls in the measurement of muscle mass: a need for a reference standard

    Science.gov (United States)

    Landi, Francesco; Cesari, Matteo; Fielding, Roger A.; Visser, Marjolein; Engelke, Klaus; Maggi, Stefania; Dennison, Elaine; Al‐Daghri, Nasser M.; Allepaerts, Sophie; Bauer, Jurgen; Bautmans, Ivan; Brandi, Maria Luisa; Bruyère, Olivier; Cederholm, Tommy; Cerreta, Francesca; Cherubini, Antonio; Cooper, Cyrus; Cruz‐Jentoft, Alphonso; McCloskey, Eugene; Dawson‐Hughes, Bess; Kaufman, Jean‐Marc; Laslop, Andrea; Petermans, Jean; Reginster, Jean‐Yves; Rizzoli, René; Robinson, Sian; Rolland, Yves; Rueda, Ricardo; Vellas, Bruno; Kanis, John A.

    2018-01-01

    Abstract Background All proposed definitions of sarcopenia include the measurement of muscle mass, but the techniques and threshold values used vary. Indeed, the literature does not establish consensus on the best technique for measuring lean body mass. Thus, the objective measurement of sarcopenia is hampered by limitations intrinsic to assessment tools. The aim of this study was to review the methods to assess muscle mass and to reach consensus on the development of a reference standard. Methods Literature reviews were performed by members of the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis working group on frailty and sarcopenia. Face‐to‐face meetings were organized for the whole group to make amendments and discuss further recommendations. Results A wide range of techniques can be used to assess muscle mass. Cost, availability, and ease of use can determine whether the techniques are better suited to clinical practice or are more useful for research. No one technique subserves all requirements but dual energy X‐ray absorptiometry could be considered as a reference standard (but not a gold standard) for measuring muscle lean body mass. Conclusions Based on the feasibility, accuracy, safety, and low cost, dual energy X‐ray absorptiometry can be considered as the reference standard for measuring muscle mass. PMID:29349935

  2. Detailed examination of 'standard elementary particle theories' based on measurement with Tristan

    International Nuclear Information System (INIS)

    Kamae, Tsuneyoshi

    1989-01-01

    The report discusses possible approaches to detailed analysis of 'standard elementary particle theories' on the basis of measurements made with Tristan. The first section of the report addresses the major elementary particles involved in the 'standard theories'. The nature of the gauge particles, leptons, quarks and the Higgs particle is briefly outlined. The Higgs particle and the top quark have not been discovered, though the Higgs particle is essential in the Weinberg-Salam theory. Another important issue in this field is the cause of the breaking of CP symmetry. The second section deals with problems which arise in generalizing the concept of the 'standard theories'. Requirements for solving these problems include the discovery of supersymmetric particles, the discovery of conflicts in the 'standard theories', and accurate determination of the fundamental constants used in the 'standard theories' by various different methods. The third and fourth sections address the Weinberg-Salam theory and quantum chromodynamics (QCD). There are four essential parameters for the 'standard theories', three of which are associated with the W-S theory. The mass of the W and Z bosons measured in proton-antiproton collision experiments is compared with that determined by applying the W-S theory to electron-positron experiments. For QCD, it is essential to determine the lambda constant. (N.K.)

  3. Basement Construction of Measurement Standardization for Thermal Property and Basement Preparation of Industrial Technology

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Kweon Ho; Song, Kee Chan; Park, Chang Je

    2007-02-15

    There are three main categories in this report: 1) Basement construction of measurement standardization for nuclear material thermal property, 2) Reliability evaluation of measurement instrument, and 3) Standardization and industrial propagation.

  4. Meeting the measurement uncertainty and traceability requirements of ISO/IEC standard 17025 in chemical analysis.

    Science.gov (United States)

    King, B

    2001-11-01

    The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.

  5. Abstracts from the fourth annual meeting of the council on ionizing radiation measurements and standards (CIRMS)

    International Nuclear Information System (INIS)

    Anon.

    1995-01-01

    The Council on Ionizing Radiation Measurements and Standards held its fourth annual meeting at the National Institute of Standards and Technology, Gaithersburg, Maryland on November 28-30, 1995. The organization represents thousands of users of ionizing radiation and radioactive sources engaged in industrial radiation processing and sterilization, medical radiation diagnostics and therapy, nuclear power, and worker radiation protection programs. CIRMS provides a forum for discussing ionizing radiation issues; identifying, defining and prioritizing needed work; disseminating information on standards; and organizing workshops and meetings to advance ionizing radiation technology. Over 100 participants attended the meeting, which highlighted advanced techniques in radiation dosimetry and radioactivity measurements for the different ionizing radiation communities. Representatives attended from 28 corporations, 10 federal agencies, 8 national laboratories, 12 universities, and 1 state. Advanced techniques and future measurement needs were discussed in four sessions: (I) Medical Dosimetry, Radiology and Nuclear Medicine, (II) Occupational and Radiation Protection Dosimetry, (III) Measurement Techniques for Public and Environmental Radiation Protection, and (IV) Measurement Techniques for Radiation Effects on Materials. An additional session (Session V) was added to this annual meeting on the implementation of ISO 9000 for those CIRMS members involved in instrument and product manufacturing, and those providing radiation measurement services. Abstracts are also included from the poster session (Session VI) held on the final day of the meeting. The 4th Annual Meeting was organized by the Chairman of the Science and Technology Committee, Mr. Joseph C. McDonald of the Battelle Pacific Northwest Laboratory

  6. 34 CFR 106.43 - Standards for measuring skill or progress in physical education classes.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Standards for measuring skill or progress in physical... Education Programs or Activities Prohibited § 106.43 Standards for measuring skill or progress in physical education classes. If use of a single standard of measuring skill or progress in physical education classes...

  7. ENDF/B-5 Standards Data Library (including modifications made in 1986). Summary of contents and documentation

    International Nuclear Information System (INIS)

    DayDay, N.; Lemmel, H.D.

    1986-01-01

    This document summarizes the contents and documentation of the ENDF/B-5 Standards Data Library (EN5-ST) released in September 1979. The library contains complete evaluations for all significant neutron reactions in the energy range 10^-5 eV to 20 MeV for H-1, He-3, Li-6, B-10, C-12, Au-197 and U-235 isotopes. In 1986 the files for C-12, Au-197 and U-235 were slightly modified. The entire library or selective retrievals from it can be obtained free of charge from the IAEA Nuclear Data Section. (author)

  8. A manual on methods for measuring primary production in aquatic environments: including a chapter on bacteria

    National Research Council Canada - National Science Library

    Vollenweider, Richard A; Talling, J. F; Westlake, D. F

    1969-01-01

    The present manual starts from methods used to assess standing crops of phytoplankton, periphyton and higher aquatic plants, and proceeds to techniques of rate measurement currently available for these three...

  9. Hydrocarbon gas standards at the pmol/mol level to support ambient atmospheric measurements.

    Science.gov (United States)

    Rhoderick, George C; Duewer, David L; Ning, Li; DeSirant, Kathryn

    2010-02-01

    Studies of climate change increasingly recognize the diverse influences exerted by hydrocarbons in the atmosphere, including roles in particulates and ozone formation. Measurements of key non-methane hydrocarbons (NMHCs) suggest atmospheric concentrations ranging from low pmol/mol to nmol/mol, depending on location and compound. To accurately establish concentration trends and to relate measurement records from many laboratories and researchers, it is essential to have good calibration standards. Several of the world's National Metrology Institutes (NMIs) are developing primary and secondary reference gas standards at the nmol/mol level. While the U.S. NMI, the National Institute of Standards and Technology (NIST), has developed pmol/mol standards for halocarbons and some volatile organics, the feasibility of preparing well-characterized, stable standards for NMHCs at the pmol/mol level is not yet established. NIST recently developed a suite of primary standards by gravimetric dilution that contains 18 NMHCs covering the concentration range of 60 pmol/mol to 230 pmol/mol. Taking into account the small but chemically significant contribution of NMHCs in the high-purity diluent nitrogen used in their preparation, the relative concentrations and short-term stability (2 to 3 months) of these NMHCs in the primary standards have been confirmed by chromatographic analysis. The gravimetric values assigned from the methods used to prepare the materials and the analytical concentrations determined from chromatographic analysis generally agree to within +/-2 pmol/mol. However, anomalous results for several of the compounds reflect the difficulties inherent in avoiding contamination and making accurate measurements at these very low levels.
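The gravimetric value assignment described above, including the correction for the small but chemically significant NMHC content of the diluent nitrogen, can be sketched as follows (function name and all numbers are hypothetical, and mole ratios are approximated by mass ratios for nitrogen-balanced mixtures):

```python
# Hedged sketch of gravimetric value assignment for a single-step
# dilution of a parent gas mixture, with a correction for residual
# analyte in the high-purity diluent nitrogen. All numbers invented.

def diluted_fraction_pmol_mol(parent_nmol_mol, m_parent_g, m_total_g,
                              diluent_impurity_pmol_mol=0.0):
    """Amount fraction after gravimetric dilution, in pmol/mol.

    Approximates mole ratios by mass ratios, which holds when parent
    and diluent have nearly identical molar masses (both N2-balanced).
    """
    w = m_parent_g / m_total_g                  # mass fraction of parent mix
    parent_pmol_mol = parent_nmol_mol * 1000.0  # nmol/mol -> pmol/mol
    return w * parent_pmol_mol + (1.0 - w) * diluent_impurity_pmol_mol

# e.g. 50 nmol/mol parent, 1.5 g diluted to 750 g total,
# 1.2 pmol/mol residual analyte in the diluent nitrogen
x = diluted_fraction_pmol_mol(50.0, 1.5, 750.0, 1.2)
```

Ignoring the diluent term here would bias the assigned value low by about 1.2 pmol/mol, comparable to the +/-2 pmol/mol agreement quoted in the abstract.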

  10. PDF uncertainties in precision electroweak measurements, including the W mass, in ATLAS

    CERN Document Server

    Cooper-Sarkar, Amanda; The ATLAS collaboration

    2015-01-01

    Now that the Higgs mass is known, all the parameters of the SM are known, but with what accuracy? Precision EW measurements test the self-consistency of the SM and thus can give hints of BSM physics. Precision measurements of $\sin^2\theta_W$ and the W mass are limited by PDF uncertainties. This contribution discusses these uncertainties and what can be done to improve them.

  11. Estimations of isoprenoid emission capacity from enclosure studies: measurements, data processing, quality and standardized measurement protocols

    Directory of Open Access Journals (Sweden)

    Ü. Niinemets

    2011-08-01

    Full Text Available The capacity for volatile isoprenoid production under standardized environmental conditions at a certain time (ES, the emission factor is a key characteristic in constructing isoprenoid emission inventories. However, there is large variation in published ES estimates for any given species partly driven by dynamic modifications in ES due to acclimation and stress responses. Here we review additional sources of variation in ES estimates that are due to measurement and analytical techniques and calculation and averaging procedures, and demonstrate that estimations of ES critically depend on applied experimental protocols and on data processing and reporting. A great variety of experimental setups has been used in the past, contributing to study-to-study variations in ES estimates. We suggest that past experimental data should be distributed into broad quality classes depending on whether the data can or cannot be considered quantitative based on rigorous experimental standards. Apart from analytical issues, the accuracy of ES values is strongly driven by extrapolation and integration errors introduced during data processing. Additional sources of error, especially in meta-database construction, can further arise from inconsistent use of units and expression bases of ES. We propose a standardized experimental protocol for BVOC estimations and highlight basic meta-information that we strongly recommend to report with any ES measurement. We conclude that standardization of experimental and calculation protocols and critical examination of past reports is essential for development of accurate emission factor databases.

  12. Effect of measurement conditions on three-dimensional roughness values, and development of measurement standard

    International Nuclear Information System (INIS)

    Fabre, A; Brenier, B; Raynaud, S

    2011-01-01

    Friction or corrosion behaviour, fatigue lifetime for mechanical components are influenced by their boundary and subsurface properties. The surface integrity is studied on mechanical component in order to improve the service behaviour of them. Roughness is one of the main geometrical properties, which is to be qualified and quantified. Components can be obtained using a complex process: forming, machining and treatment can be combined to realize parts with complex shape. Then, three-dimensional roughness is needed to characterize these parts with complex shape and textured surface. With contact or non-contact measurements (contact stylus, confocal microprobe, interferometer), three-dimensional roughness is quantified using the calculation of pertinent parameters defined by the international standard PR EN ISO 25178-2:2008. An analysis will identify the influence of measurement conditions on three-dimensional parameters. The purpose of this study is to analyse the variation of roughness results using contact stylus or optical apparatus. The second aim of this work is to develop a measurement standard well adapted to qualify the contact and non-contact apparatus.
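Two of the areal parameters defined in ISO 25178-2, which the abstract cites, can be sketched directly (the height grid is invented, and a real evaluation also involves filtering and form-removal steps not shown):

```python
# Minimal sketch of two areal roughness parameters from ISO 25178-2:
# Sa (arithmetical mean height) and Sq (root-mean-square height).
# Heights are a hypothetical grid in micrometres.

heights = [
    [0.2, -0.1, 0.4],
    [-0.3, 0.1, -0.2],
    [0.5, -0.4, -0.2],
]

z = [v for row in heights for v in row]
mean = sum(z) / len(z)
dev = [v - mean for v in z]                       # remove the mean plane
sa = sum(abs(d) for d in dev) / len(dev)          # arithmetical mean height
sq = (sum(d * d for d in dev) / len(dev)) ** 0.5  # root-mean-square height
```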

  13. Effect of measurement conditions on three-dimensional roughness values, and development of measurement standard

    Energy Technology Data Exchange (ETDEWEB)

    Fabre, A; Brenier, B [Arts et Metiers ParisTech, MecaSurf Laboratory, 2, Cours des Arts et Metiers, 13617 Aix-en-Provence (France); Raynaud, S, E-mail: agnes.fabre@ensam.eu [INSA Lyon, MIP2 Laboratory, 27 Avenue Jean Capelle, Bat Jacquard, 69100 Villeurbanne (France)

    2011-08-19

    Friction or corrosion behaviour, fatigue lifetime for mechanical components are influenced by their boundary and subsurface properties. The surface integrity is studied on mechanical component in order to improve the service behaviour of them. Roughness is one of the main geometrical properties, which is to be qualified and quantified. Components can be obtained using a complex process: forming, machining and treatment can be combined to realize parts with complex shape. Then, three-dimensional roughness is needed to characterize these parts with complex shape and textured surface. With contact or non-contact measurements (contact stylus, confocal microprobe, interferometer), three-dimensional roughness is quantified using the calculation of pertinent parameters defined by the international standard PR EN ISO 25178-2:2008. An analysis will identify the influence of measurement conditions on three-dimensional parameters. The purpose of this study is to analyse the variation of roughness results using contact stylus or optical apparatus. The second aim of this work is to develop a measurement standard well adapted to qualify the contact and non-contact apparatus.

  14. Including Pressure Measurements in Supervision of Energy Efficiency of Wastewater Pump Systems

    DEFF Research Database (Denmark)

    Larsen, Torben; Arensman, Mareike; Nerup-Jensen, Ole

    2016-01-01

    energy). This article presents a method for a continuous supervision of the performance of both the pump and the pipeline in order to maintain the initial specific energy consumption as close as possible to the original value from when the system was commissioned. The method is based on pressure...... measurements only. The flow is determined indirectly from pressure fluctuations during pump run-up....
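The supervision idea above rests on tracking the specific energy consumption (energy per pumped volume) against its value at commissioning. A minimal sketch, with all names and numbers invented for illustration:

```python
# Hypothetical sketch: supervising specific energy consumption of a pump.
# All names and numbers are illustrative assumptions, not from the paper.

def specific_energy(energy_kwh: float, volume_m3: float) -> float:
    """Specific energy consumption in kWh per cubic metre pumped."""
    if volume_m3 <= 0:
        raise ValueError("pumped volume must be positive")
    return energy_kwh / volume_m3

# Compare current operation against the commissioning baseline.
baseline = specific_energy(energy_kwh=120.0, volume_m3=800.0)   # 0.15 kWh/m3
current = specific_energy(energy_kwh=150.0, volume_m3=800.0)    # 0.1875 kWh/m3

# Flag degradation (e.g. pump wear or partial pipe blockage) when the
# specific energy drifts more than 10 % above the commissioning value.
degraded = current > 1.10 * baseline
print(f"baseline={baseline:.4f} kWh/m3, current={current:.4f} kWh/m3, degraded={degraded}")
```

The 10 % threshold is an arbitrary choice for the sketch; in practice the alarm limit would follow from the uncertainty of the pressure-based flow estimate.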

  15. Using the PhenX Toolkit to Add Standard Measures to a Study.

    Science.gov (United States)

    Hendershot, Tabitha; Pan, Huaqin; Haines, Jonathan; Harlan, William R; Marazita, Mary L; McCarty, Catherine A; Ramos, Erin M; Hamilton, Carol M

    2015-07-01

    The PhenX (consensus measures for Phenotypes and eXposures) Toolkit (https://www.phenxtoolkit.org/) offers high-quality, well-established measures of phenotypes and exposures for use by the scientific community. The goal is to promote the use of standard measures, enhance data interoperability, and help investigators identify opportunities for collaborative and translational research. The Toolkit contains 395 measures drawn from 22 research domains (fields of research), along with additional collections of measures for Substance Abuse and Addiction (SAA) research, Mental Health Research (MHR), and Tobacco Regulatory Research (TRR). Additional measures for TRR that are expected to be released in 2015 include Obesity, Eating Disorders, and Sickle Cell Disease. Measures are selected by working groups of domain experts using a consensus process that includes input from the scientific community. The Toolkit provides a description of each PhenX measure, the rationale for including it in the Toolkit, protocol(s) for collecting the measure, and supporting documentation. Users can browse measures in the Toolkit or can search the Toolkit using the Smart Query Tool or a full text search. PhenX Toolkit users select measures of interest to add to their Toolkit. Registered Toolkit users can save their Toolkit and return to it later to revise or complete. They then have options to download a customized Data Collection Worksheet that specifies the data to be collected, and a Data Dictionary that describes each variable included in the Data Collection Worksheet. The Toolkit also has a Register Your Study feature that facilitates cross-study collaboration by allowing users to find other investigators using the same PhenX measures. Copyright © 2015 John Wiley & Sons, Inc.

  16. Calibration Standards for Surface Topography Measuring Systems down to Nanometric Range

    DEFF Research Database (Denmark)

    Trumpold, H.; De Chiffre, Leonardo; Andreasen, Jan Lasson

    Background For the precise and accurate measurement of surface topography a whole range of surface detection systems is available. With their application in research and production problems arise due to the lack of traceable standard artefacts for the instrument calibration in X, Y and Z directions...... to be replicated during all stages of the replication processes. Procedures for cleaning glass, PVC, PC, PMM and Ni-surfaces have been developed and tested. Calibration procedures for calibration standards and for calibrating instruments in X-, Y- and Z-direction have been developed and tested. Proposals...... in industries ranging from automotive manufacture to ultraprecision manufacture of data storage systems and compact discs. Objectives The project is concerned with developing calibration standards including their production methods and calibration procedures as a consistent means of calibrating different types...

  17. Standard test method for measurement of 235U fraction using enrichment meter principle

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method covers the quantitative determination of the fraction of 235U in uranium using measurement of the 185.7 keV gamma-ray produced during the decay of 235U. 1.2 This test method is applicable to items containing homogeneous uranium-bearing materials of known chemical composition in which the compound is considered infinitely thick with respect to 185.7 keV gamma-rays. 1.3 This test method can be used for the entire range of 235U fraction as a weight percent, from depleted (0.2 % 235U) to highly enriched (97.5 % 235U). 1.4 Measurement of items that have not reached secular equilibrium between 238U and 234Th may not produce the stated bias when low-resolution detectors are used with the computational method listed in Annex A2. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard may involve hazardous materials, operations, and equipment. This standard does not purport to address all of the safety co...
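The enrichment-meter principle behind this test method (for infinitely thick items the net 185.7 keV count rate is proportional to the 235U fraction) can be sketched as a linear calibration against standards of known enrichment; all count rates here are invented:

```python
# Hedged sketch of the enrichment-meter principle: the net 185.7 keV count
# rate from an "infinitely thick" uranium item is proportional to the 235U
# fraction.  Calibration standards and rates below are invented numbers.

def calibrate(rates, enrichments):
    """Least-squares line enrichment = m * rate + b from calibration standards."""
    n = len(rates)
    mean_r = sum(rates) / n
    mean_e = sum(enrichments) / n
    cov = sum((r - mean_r) * (e - mean_e) for r, e in zip(rates, enrichments))
    var = sum((r - mean_r) ** 2 for r in rates)
    m = cov / var
    b = mean_e - m * mean_r
    return m, b

# Two standards of known enrichment (weight % 235U) and their net rates (counts/s).
m, b = calibrate([40.0, 900.0], [0.72, 20.0])

def enrichment(net_rate):
    """235U fraction (wt %) of an unknown item from its net 185.7 keV rate."""
    return m * net_rate + b

print(f"{enrichment(470.0):.3f} wt% 235U")
```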

  18. Standard Test Method for Measuring Fast-Neutron Reaction Rates by Radioactivation of Nickel

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method covers procedures for measuring reaction rates by the activation reaction 58Ni(n,p)58Co. 1.2 This activation reaction is useful for measuring neutrons with energies above approximately 2.1 MeV and for irradiation times up to about 200 days in the absence of high thermal neutron fluence rates (for longer irradiations, see Practice E 261). 1.3 With suitable techniques, fission-neutron fluence rates above 10⁷ cm⁻²·s⁻¹ can be determined. 1.4 Detailed procedures for other fast-neutron detectors are referenced in Practice E 261. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use. Note—The burnup corrections were com...
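Converting a measured 58Co activity into a reaction rate per target atom requires a correction for decay during irradiation. A hedged sketch of the standard activation relation; the sample activity and irradiation time are invented, while the half-life and isotopic abundance are nominal literature values:

```python
# Hypothetical sketch of inferring the 58Ni(n,p)58Co reaction rate from a
# measured 58Co activity; the activity and irradiation time are invented.
import math

HALF_LIFE_58CO_S = 70.86 * 24 * 3600      # 58Co half-life, ~70.86 days
LAMBDA = math.log(2) / HALF_LIFE_58CO_S   # decay constant (1/s)

def reaction_rate(activity_bq: float, n_target_atoms: float, t_irr_s: float) -> float:
    """Reactions per target atom per second, correcting for decay during irradiation."""
    saturation = 1.0 - math.exp(-LAMBDA * t_irr_s)
    return activity_bq / (n_target_atoms * saturation)

# 1 g of natural nickel: ~68.1 % 58Ni abundance, molar mass ~58.69 g/mol.
n58 = 0.681 * 6.022e23 / 58.69
r = reaction_rate(activity_bq=5.0e4, n_target_atoms=n58, t_irr_s=30 * 24 * 3600)
print(f"reaction rate = {r:.3e} per atom per second")
```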

  19. [Mobile Health: IEEE Standard for Wearable Cuffless Blood Pressure Measuring Devices].

    Science.gov (United States)

    Zhou, Xia; Wu, Wenli; Bao, Shudi

    2015-07-01

    IEEE Std 1708-2014 breaks through the traditional standards for cuff-based blood pressure measuring devices and establishes a normative definition of wearable cuffless blood pressure measuring devices together with an objective performance evaluation for this kind of device. This study first introduces the background of the new standard, then describes the standard in detail and addresses the impact of cuffless blood pressure measuring devices under the new standard on manufacturers and end users.

  20. Multiple shooting applied to robust reservoir control optimization including output constraints on coherent risk measures

    DEFF Research Database (Denmark)

    Codas, Andrés; Hanssen, Kristian G.; Foss, Bjarne

    2017-01-01

    . In this work, we propose a new formulation for robust optimization of reservoir well controls. It is inspired by the multiple shooting (MS) method which permits a broad range of parallelization opportunities and output constraint handling. This formulation exploits coherent risk measures, a concept...... traditionally used in finance, to bound the risk on constraint violation. We propose a reduced sequential quadratic programming (rSQP) algorithm to solve the underlying optimization problem. This algorithm exploits the structure of the coherent risk measures, thus a large set of constraints are solved within...... sub-problems. Moreover, a variable elimination procedure allows solving the optimization problem in a reduced space and an iterative active-set method helps to handle a large set of inequality constraints. Finally, we demonstrate the application of constraints to bound the risk of water production...
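Conditional value-at-risk (CVaR) is a typical coherent risk measure of the kind used here to bound the risk of constraint violation across an ensemble of reservoir realizations. A minimal sketch on invented data:

```python
# Minimal sketch of a coherent risk measure, CVaR (the mean of the worst
# alpha-fraction of outcomes), used to bound constraint violation over an
# ensemble of geological realizations.  All data below are invented.

def cvar(samples, alpha=0.1):
    """Conditional value-at-risk: mean of the worst alpha-fraction (largest values)."""
    ordered = sorted(samples, reverse=True)
    k = max(1, int(round(alpha * len(ordered))))
    return sum(ordered[:k]) / k

# Water production (m3/day) over 10 geological realizations.
water = [120, 95, 180, 110, 240, 130, 105, 160, 100, 90]

# A risk-bounded output constraint: require CVaR_10% of water production <= 250.
risk = cvar(water, alpha=0.1)
print(f"CVaR(10%) = {risk} m3/day, feasible = {risk <= 250}")
```

Unlike a plain percentile, CVaR is coherent (in particular subadditive), which is what makes it tractable inside the rSQP machinery the abstract describes.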

  1. A methodological evaluation of volumetric measurement techniques including three-dimensional imaging in breast surgery

    OpenAIRE

    HOEFFELIN, Harry; JACQUEMIN, Denise; Defaweux, Valérie; NIZET, Jean-Luc

    2014-01-01

    Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. Materials and methods. - We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings ("in situ" in cadaveric di...

  2. A Methodological Evaluation of Volumetric Measurement Techniques including Three-Dimensional Imaging in Breast Surgery

    OpenAIRE

    H. Hoeffelin; D. Jacquemin; V. Defaweux; J L. Nizet

    2014-01-01

    Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings (in situ in cadaveric dissection, of control prosthese...

  3. pH-Free Measurement of Relative Acidities, Including Isotope Effects.

    Science.gov (United States)

    Perrin, Charles L

    2017-01-01

    A powerful pH-free multicomponent NMR titration method can measure relative acidities, even of closely related compounds, with excellent accuracy. The history of the method is presented, along with details of its implementation and a comparison with earlier NMR titrations using a pH electrode. Many of its areas of applicability are described, especially equilibrium isotope effects. The advantages of the method, some practical considerations, and potential pitfalls are considered. © 2017 Elsevier Inc. All rights reserved.

  4. NedWind 25 Blade Testing at NREL for the European Standards Measurement and Testing Program

    Energy Technology Data Exchange (ETDEWEB)

    Larwood, S.; Musial, W.; Freebury, G.; Beattie, A.G.

    2001-04-19

    In the mid-90s the European community initiated the Standards, Measurements, and Testing (SMT) program to harmonize testing and measurement procedures in several industries. Within the program, a project was carried out called the European Wind Turbine Testing Procedure Development. The second part of that project, called Blade Test Methods and Techniques, included the United States and was devised to help blade-testing laboratories harmonize their testing methods. This report provides the results of those tests conducted by the National Renewable Energy Laboratory.

  5. The Standardization of Bra Cup Measurements: Redefining Bra Sizing Language.

    Science.gov (United States)

    Bengtson, Bradley P; Glicksman, Caroline A

    2015-10-01

    There are many challenges in developing a standardized bra cup system, the most significant being that bra cup sizes are a continuum. Women's breasts occur as a fluid range of shapes, sizes, and volumes. Patients have specific expectations regarding bra cup size, and failure to achieve expectations remains the leading cause of patient dissatisfaction. Implant selection that determines eventual bra cup size is critical in patient education and management of patient expectations; however, this is not achievable until all speak the same bra cup language. Patient and surgeon perceptions may never be exact, but it is important to establish guidelines and standards to bridge this gap. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. INSECTS INCLUDED IN THE RED BOOK OF MOLDOVA: LIMITATION FACTORS AND PROTECTION MEASURES

    Directory of Open Access Journals (Sweden)

    Asea M. Timuş

    2017-07-01

    This paper presents an analysis of the insect species of Moldova with a rarity status: vulnerable, critically endangered and endangered, officially included in the "Red Book of the Republic of Moldova" in two editions: 37 species in the 2nd edition (2001) and 80 species in the 3rd edition (2015). The 80 insects of the 3rd edition belong to 8 orders (Odonatoptera, Mantodea, Orthoptera, Coleoptera, Neuroptera, Lepidoptera, Hymenoptera, Diptera). These species are classified according to rarity status: vulnerable (VU, 33 species), critically endangered (CR, 39 species) and endangered (EN, 8 species). The third edition also contains 35 species not included in the previous editions, which obtained a rarity status for the first time: VU, 16 species; CR, 17; and EN, 2 (2 species of the order Odonatoptera, 1 of Mantodea, 1 of Orthoptera, 10 of Coleoptera, 18 of Lepidoptera, and 3 of Hymenoptera).

  7. Simultaneous measurements of work function and H‒ density including caesiation of a converter surface

    Science.gov (United States)

    Cristofaro, S.; Friedl, R.; Fantz, U.

    2017-08-01

    Negative hydrogen ion sources rely on the surface conversion of neutral atomic hydrogen and positive hydrogen ions to H-. The efficiency of this process depends on the actual work function of the converter surface. By introducing caesium into the source the work function decreases, enhancing the negative ion yield. In order to study the impact of the work function on the H- surface production at similar conditions to the ones in ion sources for fusion devices like ITER and DEMO, fundamental investigations are performed in a flexible laboratory experiment. The work function of the converter surface can be absolutely measured by photoelectric effect, while a newly installed cavity ring-down spectroscopy system (CRDS) measures the H- density. The CRDS is firstly tested and characterized by investigations on H- volume production. Caesiation of a stainless steel sample is then performed in vacuum and the plasma effect on the Cs layer is investigated also for long plasma-on times. A minimum work function of (1.9±0.1) eV is reached after some minutes of plasma treatment, resulting in a reduction by a value of 0.8 eV compared to vacuum measurements. The H- density above the surface is (2.1±0.5)×10¹⁵ m⁻³. With further plasma exposure of the caesiated surface, the work function increases up to 3.75 eV, due to the impinging plasma particles which gradually remove the Cs layer. As a result, the H- density decreases by a factor of at least 2.
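Cavity ring-down spectroscopy yields an absolute density from the difference of decay rates with and without the absorber in the cavity. A hedged sketch of the standard CRDS relation; the ring-down times and geometry are invented, and the photodetachment cross section is an approximate literature value:

```python
# Hedged sketch of extracting an absolute H- density from cavity ring-down
# times.  The formula is the standard CRDS relation; all numbers below are
# illustrative assumptions, not values from the paper.

C = 2.998e8          # speed of light (m/s)
SIGMA_PD = 3.5e-21   # approx. H- photodetachment cross section at 1064 nm (m^2)

def crds_density(tau_s, tau0_s, cavity_len_m, absorber_len_m):
    """Absorber density from ring-down times with (tau) and without (tau0) plasma."""
    return (cavity_len_m / (C * SIGMA_PD * absorber_len_m)) * (1.0 / tau_s - 1.0 / tau0_s)

n_hminus = crds_density(tau_s=19.8e-6, tau0_s=20.0e-6,
                        cavity_len_m=1.0, absorber_len_m=0.2)
print(f"H- density ~ {n_hminus:.2e} m^-3")
```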

  8. Human calcium metabolism including bone resorption measured with {sup 41}Ca tracer

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, S.P.H.T. [Lawrence Livermore National Lab., CA (United States); King, J.C. [California Univ., Berkeley, CA (United States). Dept. of Nutritional Science; Vieira, N.E. [National Inst. of Child Health and Human Development, Bethesda, MD (United States); Woodhouse, L.R. [California Univ., Berkeley, CA (United States). Dept. of Nutritional Science; Yergey, A.L. [National Inst. of Child Health and Human Development, Bethesda, MD (United States)

    1996-08-01

    Accelerator mass spectrometry is so sensitive to small quantities of {sup 41}Ca that it might be used as a tracer in the study of human calcium kinetics to generate unique kinds of data. In contrast with the use of other Ca isotopic tracers, {sup 41}Ca tracer can be so administered that the tracer movements between the various body pools achieve a quasi steady state. Resorbing bone may thus be directly measured. We have tested such a protocol against a conventional stable isotope experiment with good agreement.

  9. A methodological evaluation of volumetric measurement techniques including three-dimensional imaging in breast surgery.

    Science.gov (United States)

    Hoeffelin, H; Jacquemin, D; Defaweux, V; Nizet, J L

    2014-01-01

    Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings (in situ in cadaveric dissection, of control prostheses, and in clinical patients) and we compared this system to other techniques (CT scanning and Archimedes' principle) under the same conditions. We were able to identify the benefits (feasibility, safety, portability, and low patient stress) and limitations (underestimation of the in situ volume, subjectivity of contouring, and patient selection) of the LifeViz 3D system, concluding that the results are comparable with other measurement techniques. The prospects of this technology seem promising in numerous applications in clinical practice to limit the subjectivity of breast surgery.

  10. A Methodological Evaluation of Volumetric Measurement Techniques including Three-Dimensional Imaging in Breast Surgery

    Directory of Open Access Journals (Sweden)

    H. Hoeffelin

    2014-01-01

    Breast surgery currently remains very subjective and each intervention depends on the ability and experience of the operator. To date, no objective measurement of this anatomical region can codify surgery. In this light, we wanted to compare and validate a new technique for 3D scanning (LifeViz 3D) and its clinical application. We tested the use of the 3D LifeViz system (Quantificare) to perform volumetric calculations in various settings (in situ in cadaveric dissection, of control prostheses, and in clinical patients) and we compared this system to other techniques (CT scanning and Archimedes' principle) under the same conditions. We were able to identify the benefits (feasibility, safety, portability, and low patient stress) and limitations (underestimation of the in situ volume, subjectivity of contouring, and patient selection) of the LifeViz 3D system, concluding that the results are comparable with other measurement techniques. The prospects of this technology seem promising in numerous applications in clinical practice to limit the subjectivity of breast surgery.

  11. Practical estimation of the uncertainty of analytical measurement standards

    NARCIS (Netherlands)

    Peters, R.J.B.; Elbers, I.J.W.; Klijnstra, M.D.; Stolker, A.A.M.

    2011-01-01

    Nowadays, a lot of time and resources are used to determine the quality of goods and services. As a consequence, the quality of measurements themselves, e.g., the metrological traceability of the measured quantity values is essential to allow a proper evaluation of the results with regard to

  12. Atlantooccipital junction: standards for measurement in normal children.

    Science.gov (United States)

    Kaufman, R A; Carroll, C D; Buncher, C R

    1987-01-01

    This study describes a simple method for measuring the distance between the occiput and atlas when a distraction-dislocation injury is suspected in a child. Measurements were made at five evenly spaced locations along the atlantooccipital joint on cross-table lateral skull radiographs in 100 normal children. These data were compared with similar measurements in eight patients with proved atlantooccipital dislocation. The mean normal measurement fell between 1.96 and 2.63 mm for all five points. For boys or girls aged 1-15 years, the normal distance should not exceed 5 mm at any point in the joint. The likelihood that any normal child will have a measurement greater than or equal to 4.5 mm at any point is between 0.4% and 5.85% (the expected false-positive rate).
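The reported limits translate directly into a screening check. A minimal sketch; the function name and sample values are my own, not from the study:

```python
# Illustrative screening helper based on the reported normal limits: in
# children aged 1-15, the occiput-atlas distance should not exceed 5 mm at
# any of the five measured points.  Function name and data are invented.

NORMAL_LIMIT_MM = 5.0

def flag_atlantooccipital(distances_mm):
    """Return the indices (0-4) of points whose joint distance exceeds 5 mm."""
    if len(distances_mm) != 5:
        raise ValueError("expected five evenly spaced measurements")
    return [i for i, d in enumerate(distances_mm) if d > NORMAL_LIMIT_MM]

print(flag_atlantooccipital([2.1, 2.4, 2.6, 2.2, 2.0]))  # -> []
print(flag_atlantooccipital([2.1, 5.6, 7.0, 2.2, 2.0]))  # -> [1, 2]
```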

  13. Measurement standards and the general problem of reference points in chemical analysis

    International Nuclear Information System (INIS)

    Richter, W.; Dube, G.

    2002-01-01

    Besides the measurement standards available in general metrology in the form of realisations of the units of measurement, measurement standards of chemical composition are needed for the vast field of chemical measurement (measurements of chemical composition), because the main aim of such measurements is to quantify non-isolated substances, often in complicated matrices, to which the 'classical' measurement standards and their lower-level derivatives are not directly applicable. At present, material artefacts as well as standard measurement devices serve as chemical measurement standards. These are measurement standards in the full metrological sense, however, only if they are firmly linked to the SI unit in which the composition represented by the standard is expressed. This requirement has the consequence that only a very restricted number of really reliable chemical measurement standards exists at present. Since it is very difficult and time-consuming to increase this number substantially and, on the other hand, reliable reference points are increasingly needed for all kinds of chemical measurements, primary methods of measurement and high-level reference measurements will play an increasingly important role in the establishment of worldwide comparability and hence mutual acceptance of chemical measurement results. (author)

  14. Design and Optimization of Capacitated Supply Chain Networks Including Quality Measures

    Directory of Open Access Journals (Sweden)

    Krystel K. Castillo-Villar

    2014-01-01

    This paper presents (1) a novel capacitated model for supply chain network design which considers manufacturing, distribution, and quality costs (named the SCND-COQ model) and (2) five combinatorial optimization methods, based on nonlinear optimization, heuristic, and metaheuristic approaches, which are used to solve realistic instances of practical size. The SCND-COQ model is a mixed-integer nonlinear problem which can be used at a strategic planning level to design a supply chain network that maximizes the total profit subject to meeting an overall quality level of the final product at minimum cost. The SCND-COQ model computes the quality-related costs for the whole supply chain network, considering the interdependencies among business entities. The effectiveness of the proposed solution approaches is shown using numerical experiments. These methods allow solving more realistic (capacitated) supply chain network design problems including quality-related costs (inspections, rework, opportunity costs, and others) within a reasonable computational time.

  15. Cost and benefit including value of life, health and environmental damage measured in time units

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Friis-Hansen, Peter

    2009-01-01

    Key elements of the authors' work on money equivalent time allocation to costs and benefits in risk analysis are put together as an entity. This includes the data supported dimensionless analysis of an equilibrium relation between total population work time and gross domestic product leading...... of this societal value over the actual costs, used by the owner for economically optimizing an activity, motivates a simple risk accept criterion suited to be imposed on the owner by the public. An illustration is given concerning allocation of economical means for mitigation of loss of life and health on a ferry...... in fire. Finally a definition is suggested for a nature preservation willingness index, which by an invariance postulate leads to a rational format for allocating means to avoid pollution accidents....

  16. Standards for measurements and testing of wind turbine power quality

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, P. [Risoe National Lab., Roskilde (Denmark); Gerdes, G.; Klosse, R.; Santjer, F. [DEWI, Wilhelmshaven (Germany); Robertson, N.; Davy, W. [NEL, Glasgow (United Kingdom); Koulouvari, M.; Morfiadakis, E. [CRES, Pikermi (Greece); Larsson, Aa. [Chalmers Univ. of Technology, Goeteborg (Sweden)

    1999-03-01

    The present paper describes the work done in the power quality sub-task of the project 'European Wind Turbine Testing Procedure Developments' funded by the EU SMT program. The objective of the power quality sub-task has been to make analyses and new recommendations for the standardisation of measurement and verification of wind turbine power quality. The work has been organised in three major activities. The first activity has been to propose measurement procedures and to verify existing and new measurement procedures; this activity has also involved a comparison of the measurements and data processing of the participating partners. The second activity has been to investigate the influence of terrain, grid properties and wind farm summation on the power quality of wind turbines with constant rotor speed. The third activity has been to investigate the same influences for wind turbines with variable rotor speed. (au)

  17. Standardized voluntary force measurement in a lower extremity rehabilitation robot

    OpenAIRE

    Bolliger, M; Banz, R; Dietz, V; Lünenburger, L

    2008-01-01

    Background Isometric force measurements in the lower extremity are widely used in rehabilitation of subjects with neurological movement disorders (NMD) because walking ability has been shown to be related to muscle strength. Therefore muscle strength measurements can be used to monitor and control the effects of training programs. A new method to assess isometric muscle force was implemented in the driven gait orthosis (DGO) Lokomat. To evaluate the capabilities of this new measureme...

  18. An assessment of PCB degradation by microorganisms including methods for measuring mineralization

    International Nuclear Information System (INIS)

    Hadden, C.; Edenborn, H.; Osborne, T.; Holdsworth, G.; Revis, N.

    1990-01-01

    These studies sought to isolate and identify organism(s) from PCB-contaminated soil and sediment that degrade PCB; to provide information on the potential of organisms in soil samples taken from a PCB-contaminated area to mineralize or dechlorinate PCB congeners; to assess potential enhancement of PCB biodegradation as a result of nutritional amendment of the samples; and to carry out analyses of successive lysimeter samples to determine whether field treatments have had an effect on the capacity of soil microbes to mineralize PCBs. We have expended considerable effort to validate the fractionation procedure used to assess mineralization and conversion of PCB substrates. The assessment relies on the ability to measure [14C]-labeled CO2 in the presence of potentially volatile [14C]-labeled PCB and degradation products; to differentiate between volatile and non-volatile [14C]-labeled compounds, between water-soluble products of metabolism and a mixture of unchanged substrate and other water-insoluble products, and between metabolism and loss or non-extractability of the substrate.

  19. Simulation and Evaluation of Urban Growth for Germany Including Climate Change Mitigation and Adaptation Measures

    Directory of Open Access Journals (Sweden)

    Jana Hoymann

    2016-06-01

    Decision-makers in the fields of urban and regional planning in Germany face new challenges. High rates of urban sprawl need to be reduced by increased inner-urban development, while settlements have to adapt to climate change and contribute to the reduction of greenhouse gas emissions at the same time. In this study, we analyze conflicts in the management of urban areas and develop integrated sustainable land use strategies for Germany. The spatially explicit land use change model Land Use Scanner is used to simulate alternative scenarios of land use change for Germany for 2030. A multi-criteria analysis is set up based on these scenarios and on a set of indicators. They are used to measure whether the mitigation and adaptation objectives can be achieved and to uncover conflicts between these aims. The results show that built-up and transport area development can be influenced both in terms of magnitude and spatial distribution to contribute to climate change mitigation and adaptation. Strengthening inner-urban development is particularly effective in terms of reducing built-up and transport area development. It is possible to reduce built-up and transport area development to approximately 30 ha per day in 2030, which matches the sustainability objective of the German Federal Government for the year 2020. In the case of adaptation to climate change, the inclusion of extreme flood events in spatial planning requirements may contribute to a reduction of the damage potential.

  20. An assessment of PCB degradation by microorganisms including methods for measuring mineralization

    Energy Technology Data Exchange (ETDEWEB)

    Hadden, C.; Edenborn, H.; Osborne, T.; Holdsworth, G.; Revis, N.

    1990-12-31

    These studies sought to isolate and identify organism(s) from PCB-contaminated soil and sediment that degrade PCB; to provide information on the potential of organisms in soil samples taken from a PCB-contaminated area to mineralize or dechlorinate PCB congeners; to assess potential enhancement of PCB biodegradation as a result of nutritional amendment of the samples; and to carry out analyses of successive lysimeter samples to determine whether field treatments have had an effect on the capacity of soil microbes to mineralize PCBs. We have expended considerable effort to validate the fractionation procedure used to assess mineralization and conversion of PCB substrates. The assessment relies on the ability to measure [{sup 14}C]-labeled CO{sub 2} in the presence of potentially volatile [{sup 14}C]-labeled PCB and degradation products; to differentiate between volatile and non-volatile [{sup 14}C]-labeled compounds, between water-soluble products of metabolism and a mixture of unchanged substrate and other water-insoluble products, and between metabolism and loss or non-extractability of the substrate.

  1. Electronic trigger for capacitive touchscreen and extension of ISO 15781 standard time lag measurements to smartphones

    Science.gov (United States)

    Bucher, François-Xavier; Cao, Frédéric; Viard, Clément; Guichard, Frédéric

    2014-03-01

    We present in this paper a novel capacitive device that stimulates the touchscreen interface of a smartphone (or of any imaging device equipped with a capacitive touchscreen) and synchronizes triggering with the DxO LED Universal Timer to measure shooting time lag and shutter lag according to ISO 15781:2013. The device and protocol extend the time lag measurement beyond the standard by including negative shutter lag, a phenomenon that is more and more commonly found in smartphones. The device is computer-controlled, and this feature, combined with measurement algorithms, makes it possible to automate a large series of captures so as to provide more refined statistical analyses when, for example, the shutter lag of "zero shutter lag" devices is limited by the frame time, as our measurements confirm.
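Reducing a series of trigger and exposure timestamps to lag statistics, keeping negative lags rather than clipping them, can be sketched as follows (all timestamps are invented):

```python
# Sketch of reducing a series of timer readings to shutter-lag statistics.
# Negative lags (frame captured before the trigger, as in "zero shutter
# lag" devices) are kept.  Timestamps in milliseconds are invented.

from statistics import mean, stdev

trigger_ms = [0.0, 100.0, 200.0, 300.0, 400.0]     # touchscreen stimulation
exposure_ms = [-12.0, 96.0, 185.0, 310.0, 395.0]   # start of captured frame

lags = [e - t for e, t in zip(exposure_ms, trigger_ms)]  # may be negative
print(f"mean lag = {mean(lags):.1f} ms, spread = {stdev(lags):.1f} ms, "
      f"negative fraction = {sum(l < 0 for l in lags) / len(lags):.0%}")
```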

  2. Measuring enzyme activities under standardized in vivo-like conditions for Systems Biology

    NARCIS (Netherlands)

    van Eunen, K.; Bouwman, J.; Daran-Lapujade, P.A.L.; Postmus, J.; Canelas, A.; Mensonides, F.I.C.; Orij, R.; Tuzun, I.; van der Brink, J.; Smits, G.J.; van Gulik, W.M.; Brul, S.; Heijnen, J.J.; de Winde, J.H.; Teixeira de Mattos, M.J.; Kettner, C.; Nielsen, J.; Westerhoff, H.V.; Bakker, B.M.

    2010-01-01

    Realistic quantitative models require data from many laboratories. Therefore, standardization of experimental systems and assay conditions is crucial. Moreover, standards should be representative of the in vivo conditions. However, most often, enzyme-kinetic parameters are measured under assay

  3. Measuring enzyme activities under standardized in vivo-like conditions for systems biology

    NARCIS (Netherlands)

    van Eunen, Karen; Bouwman, Jildau; Daran-Lapujade, Pascale; Postmus, Jarne; Canelas, Andre B.; Mensonides, Femke I. C.; Orij, Rick; Tuzun, Isil; van den Brink, Joost; Smits, Gertien J.; van Gulik, Walter M.; Brul, Stanley; de Winde, Johannes H.; de Mattos, M. J. Teixeira; Kettner, Carsten; Nielsen, Jens; Westerhoff, Hans V.; Bakker, Barbara M.; Heijnen, J.J.

    Realistic quantitative models require data from many laboratories. Therefore, standardization of experimental systems and assay conditions is crucial. Moreover, standards should be representative of the in vivo conditions. However, most often, enzyme-kinetic parameters are measured under assay

  4. Hygroscopic growth of common organic aerosol solutes, including humic substances, as derived from water activity measurements

    Science.gov (United States)

    Zamora, Idania R.; Tabazadeh, Azadeh; Golden, David M.; Jacobson, Mark Z.

    2011-12-01

    Studies have shown that organic matter often constitutes up to 50% by mass of tropospheric aerosols. These organics may considerably affect the water uptake properties of these aerosols, impacting Earth's climate and atmosphere. However, considerable uncertainties still exist about hygroscopic properties of organic carbon (OC) in particles. In this study, we have assembled an apparatus to measure equilibrium water vapor pressure over bulk solutions. We used these results to calculate the hygroscopic growth curve and deliquescence relative humidity (DRH) of representative compounds in three OC categories: saccharides, mono/dicarboxylic acids, and HULIS (Humic-Like Substances). To our knowledge, this is the first study to examine the hygroscopic growth of HULIS by means of a bulk method on representative compounds such as fulvic and humic acids. We also explored the temperature effect on hygroscopic growth within the 0°C-30°C temperature range and found no effect. The DRH and hygroscopic growth obtained were in excellent agreement with published tandem differential mobility analyzer (TDMA), electrodynamic balance, and bulk data for sodium chloride, ammonium sulfate, d-glucose, levoglucosan, succinic acid, and glutaric acid. However, we found a hygroscopic growth factor of 1.0 at a relative humidity of 90% for phthalic, oxalic, humic, and two fulvic acids; these results disagree with various TDMA studies. The TDMA is used widely to study water uptake of organic particles but can be affected by particle microstructural arrangements before the DRH and by the inability to fully dry particles. Thus, in the future it will be important to confirm TDMA data for nondeliquescent organic particles with alternate methods.

  5. Optimising the Number of Replicate- Versus Standard Measurements for Carbonate Clumped Isotope Thermometry

    Science.gov (United States)

    Kocken, I.; Ziegler, M.

    2017-12-01

    Clumped isotope measurements on carbonates are a quickly developing and promising palaeothermometry proxy [1-3]. Developments in the field have brought down the necessary sample amount and improved the precision and accuracy of the measurements. These developments have included inter-laboratory comparison and the introduction of an absolute reference frame [4], determination of acid fractionation effects [5], correction for the pressure baseline [6], improved temperature calibrations [2], and, most recently, new approaches to improve efficiency in terms of sample gas usage [7]. However, large-scale application of clumped isotope thermometry is still hampered by the large sample amounts required and by the time-consuming analysis. In general, a great deal of time goes into the measurement of standards. Here we present a study on the optimal ratio between standard and sample measurements using the Kiel Carbonate Device method. We also consider the optimal initial signal intensity. We analyse ETH-standard measurements from several months to determine the measurement regime with the highest precision and optimised measurement time management.
    References: 1. Eiler, J. M. Earth Planet. Sci. Lett. 262, 309-327 (2007). 2. Kelson, J. R., et al. Geochim. Cosmochim. Acta 197, 104-131 (2017). 3. Kele, S. et al. Geochim. Cosmochim. Acta 168, 172-192 (2015). 4. Dennis, K. J. et al. Geochim. Cosmochim. Acta 75, 7117-7131 (2011). 5. Müller, I. A. et al. Chem. Geol. 449, 1-14 (2017). 6. Meckler, A. N. et al. Rapid Commun. Mass Spectrom. 28, 1705-1715 (2014). 7. Hu, B. et al. Rapid Commun. Mass Spectrom. 28, 1413-1425 (2014).

  6. The sensitivity of standard radiographic foot measures to misalignment.

    Science.gov (United States)

    Willauer, Patrick; Sangeorzan, Bruce J; Whittaker, Eric C; Shofer, Jane B; Ledoux, William R

    2014-12-01

    The purpose of this study was to identify the effects that X-ray source misalignment has on common measurements made from anteroposterior (AP) and medial-lateral (ML) view foot radiographs. A cadaveric foot model was used to obtain ML radiographs with ±25 degree transverse plane misalignment. From these images the calcaneal pitch angle (CPA) and lateral talometatarsal angle (LTMA) were measured. AP images were captured with up to 30 degree sagittal plane misalignment as well as ±15 degree misalignment in the transverse plane at each sagittal angle. From these images the talonavicular coverage angle (TNCA) and talometatarsal angle (TMA) were measured. On the ML images, the CPA was sensitive to transverse plane misalignment from -10 to -25 degrees and from 15 to 25 degrees (P plane misalignment. On the AP images, the TNCA and TMA were not sensitive to sagittal plane misalignment alone. However, at 0, 10, and 15 degrees sagittal misalignment the TNCA showed sensitivity to transverse plane misalignment (P foot radiographic parameters, especially the CPA when there is transverse plane misalignment and the TNCA when there is both sagittal and transverse plane misalignment. The LTMA and TMA can be measured reliably, even with significant misalignment present. If a researcher or clinician is interested in measuring the CPA or TNCA, the current best practices guidelines for obtaining ML and AP images should be closely followed. © The Author(s) 2014.

  7. Standardized voluntary force measurement in a lower extremity rehabilitation robot.

    Science.gov (United States)

    Bolliger, Marc; Banz, Raphael; Dietz, Volker; Lünenburger, Lars

    2008-10-28

    Isometric force measurements in the lower extremity are widely used in rehabilitation of subjects with neurological movement disorders (NMD) because walking ability has been shown to be related to muscle strength. Therefore, muscle strength measurements can be used to monitor and control the effects of training programs. A new method to assess isometric muscle force was implemented in the driven gait orthosis (DGO) Lokomat. To evaluate the capabilities of this new measurement method, inter- and intra-rater reliability were assessed. Reliability was assessed in subjects with and without NMD. Subjects were tested twice on the same day by two different therapists to test inter-rater reliability and on two separate days by the same therapist to test intra-rater reliability. Results showed fair to good reliability for the new measurement method to assess isometric muscle force of lower extremities. In subjects without NMD, intraclass correlation coefficients (ICC) for inter-rater reliability ranged from 0.72 to 0.97 and intra-rater reliability from 0.71 to 0.90. In subjects with NMD, ICC ranged from 0.66 to 0.97 for inter-rater and from 0.50 to 0.96 for intra-rater reliability. Inter- and intra-rater reliability of an assessment method for measuring maximal voluntary isometric muscle force of lower extremities was demonstrated. We suggest that this method is a valuable tool for documenting and controlling the rehabilitation process in patients using a DGO.
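
The intraclass correlation coefficients reported above can be illustrated with a minimal sketch. The abstract does not state which ICC form was used, so a one-way random-effects ICC(1,1) is assumed here, and the force values are invented for illustration:

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_raters) array.

    ICC(1,1) = (MSB - MSW) / (MSB + (k - 1) * MSW),
    where MSB and MSW are the between- and within-subject mean squares.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    subject_means = scores.mean(axis=1)
    grand_mean = scores.mean()
    msb = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
    msw = np.sum((scores - subject_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical data: two therapists measuring maximal isometric
# force (N) in five subjects (one row per subject)
forces = [[105, 102], [88, 91], [120, 118], [76, 80], [99, 97]]
print(round(icc_oneway(forces), 2))  # 0.98 -> "good" reliability
```

Values above roughly 0.75 are conventionally read as good reliability, which is how ranges such as 0.72-0.97 in the abstract are interpreted.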

  8. Advisory Committee for the Calibration Standards of Ionizing Radiation Measurement: Section 3. Neutron measurements

    International Nuclear Information System (INIS)

    1982-01-01

    Section III (Mesures neutroniques) of the Comite Consultatif pour les Etalons de Mesure des Rayonnements Ionisants held its fifth meeting in May 1981. Recent work carried out at BIPM in the field of neutron measurements was reported. The status of a full-scale 252Cf neutron source intercomparison (10^7 s^-1) and of several restricted comparisons was discussed. Intercomparisons of fast neutron fluence rates are in progress (115In(n,n')115mIn; Nb/Zr) or will take place in the near future (115In(n,γ)116mIn; 235U and 238U fission chambers). An intercomparison of neutron dosimetry standards by circulating tissue-equivalent ion chambers will be prepared and organized by BIPM. Finally, there was a broad exchange of information on work in progress at the various laboratories represented at the meeting [fr]

  9. Various approaches to standardization and the importance of measurement accuracy

    NARCIS (Netherlands)

    Gram, J.; Jespersen, J.; Kluft, C.; Declerck, P.

    1996-01-01

    Biochemical measurements of quantities, i.e. analytes, of the haemostatic system are the basis of evaluating patients with potentially serious or lifethreatening disorders. Therefore, there is a need of a high level of certainty of the results. Experience based on the comprehensive international

  10. Concurrent Validity of Standardized Measures of Written Expression.

    Science.gov (United States)

    Riccio, Cynthia A.; Boan, Candace H.; Staniszewski, Deborah; Hynd, George W.

    1997-01-01

    A study involving 120 school-aged children that investigated the concurrent validity of measures of written language found that the Wechsler Individual Achievement Test Written Expression subtest correlates moderately with the Written Expression subtest of the Peabody Individual Achievement Test-Revised and the Spontaneous Writing Quotient of the…

  11. Conditional Standard Errors of Measurement for Composite Scores Using IRT

    Science.gov (United States)

    Kolen, Michael J.; Wang, Tianyou; Lee, Won-Chan

    2012-01-01

    Composite scores are often formed from test scores on educational achievement test batteries to provide a single index of achievement over two or more content areas or two or more item types on that test. Composite scores are subject to measurement error, and as with scores on individual tests, the amount of error variability typically depends on…

  12. Standard Test Method for Measuring Fast-Neutron Reaction Rates by Radioactivation of Titanium

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method covers procedures for measuring reaction rates by the activation reactions 46Ti(n,p)46Sc + 47Ti(n,np)46Sc. Note 1—Since the cross section for the (n,np) reaction is relatively small for energies less than 12 MeV and is not easily distinguished from that of the (n,p) reaction, this test method will refer to the (n,p) reaction only. 1.2 The reaction is useful for measuring neutrons with energies above approximately 4.4 MeV and for irradiation times up to about 250 days (for longer irradiations, see Practice E 261). 1.3 With suitable techniques, fission-neutron fluence rates above 10^9 cm^-2·s^-1 can be determined. However, in the presence of a high thermal-neutron fluence rate, 46Sc depletion should be investigated. 1.4 Detailed procedures for other fast-neutron detectors are referenced in Practice E 261. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all...

  13. Standardizing measurement, sampling and reporting for public exposure assessments

    Energy Technology Data Exchange (ETDEWEB)

    Rochedo, Elaine R.R. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/No. CEP 22780-160 Rio de Janeiro, RJ (Brazil)], E-mail: elaine@ird.gov.br

    2008-11-15

    UNSCEAR assesses worldwide public exposure from natural and man-made sources of ionizing radiation based on information submitted to UNSCEAR by United Nations Member States and from peer reviewed scientific literature. These assessments are used as a basis for radiation protection programs of international and national regulatory and research organizations. Although UNSCEAR describes its assessment methodologies, the data are based on various monitoring approaches. In order to reduce uncertainties and improve confidence in public exposure assessments, it would be necessary to harmonize the methodologies used for sampling, measuring and reporting of environmental results.

  14. MEASURES REFERENCES OF ABNT: Instrument for the standardization of clothing products

    Directory of Open Access Journals (Sweden)

    Maicon Douglas Livramento Nishimura

    2016-12-01

    Full Text Available In order to standardize clothing measurement units, ABNT (the Brazilian Association of Technical Standards) conducts research to develop national reference measures for the children's, men's, and women's segments. The standards for children's and men's measures are already in place, while work continues on improving the women's standard. Regarding the standardization of Brazilian clothing measures, this study sought to identify the methodology adopted in developing the existing measurement standards and to analyze how this information is absorbed by the fashion market. The research was based on a bibliographic and documentary survey, considering mainly the Brazilian reference-measure standards and their drafts, which included little or no anthropometric study in their methodology. Consequently, a certain fragility was noted in the reference measures currently in use; however, greater accuracy is in sight, given the studies under development and the results that will soon be available for analysis.

  15. NASA'S Standard Measures During Bed Rest: Adaptations in the Cardiovascular System

    Science.gov (United States)

    Lee, Stuart M. C.; Feiveson, Alan H.; Martin, David S.; Cromwell, Roni L.; Platts, Steven H.; Stenger, Michael B.

    2016-01-01

    Bed rest is a well-accepted analog of space flight that has been used extensively to investigate physiological adaptations in a larger number of subjects in a shorter amount of time than can be studied with space flight and without the confounding effects associated with normal mission operations. However, comparisons across studies of different bed rest durations, between sexes, and between various countermeasure protocols have been hampered by dissimilarities in bed rest conditions, measurement protocols, and testing schedules. To address these concerns, NASA instituted standard bed rest conditions and standard measures for all physiological disciplines participating in studies conducted at the Flight Analogs Research Unit (FARU) at the University of Texas-Medical Branch. Investigators for individual studies employed their own targeted study protocols to address specific hypothesis-driven questions, but standard measures tests were conducted within these studies on a non-interference basis to maximize data availability while reducing the need to implement multiple bed rest studies to understand the effects of a specific countermeasure. When possible, bed rest standard measures protocols were similar to tests nominally used for medically-required measures or research protocols conducted before and after Space Shuttle and International Space Station missions. Specifically, bed rest standard measures for the cardiovascular system implemented before, during, and after bed rest at the FARU included plasma volume (carbon monoxide rebreathing), cardiac mass and function (2D, 3D and Doppler echocardiography), and orthostatic tolerance testing (15- or 30-minutes of 80 degree head-up tilt). Results to-date indicate that when countermeasures are not employed, plasma volume decreases and the incidence of presyncope during head-up tilt is more frequent even after short-duration bed rest, while reductions in cardiac function and mass are progressive as bed rest duration increases.

  16. Standard Measurement & Verification Plan for Lighting Equipment Retrofit or Replacement Projects

    Energy Technology Data Exchange (ETDEWEB)

    Richman, Eric E.

    2009-11-04

    This document provides a framework for a standard Measurement and Verification (M&V) plan for lighting projects. It was developed to support cost-effective retrofits (partial and complete replacements) of lighting systems and is intended to provide a foundation for an M&V plan for a lighting retrofit utilizing a "best practice" approach, and to provide guidance to site owners, contractors, and other involved organizations on what is essential for a robust M&V plan for lighting projects. This document provides examples of appropriate elements of an M&V plan, including the calculation of expected energy savings. The standard M&V plan, as provided, also allows for consistent comparison with other similar lighting projects. Although intended for lighting retrofit applications, M&V plans developed per this framework document may also be used for other non-lighting technology retrofits and new installations.

  17. Comparing Multiple Evapotranspiration-calculating Methods, Including Eddy Covariance and Surface Renewal, Using Empirical Measurements from Alfalfa Fields in the Sacramento-San Joaquin River Delta

    Science.gov (United States)

    Clay, J.; Kent, E. R.; Leinfelder-Miles, M.; Lambert, J. J.; Little, C.; Paw U, K. T.; Snyder, R. L.

    2016-12-01

    Eddy covariance and surface renewal measurements were used to estimate evapotranspiration (ET) over a variety of crop fields in the Sacramento-San Joaquin River Delta during the 2016 growing season; however, the comparison and evaluation of multiple measurement systems and methods for determining ET focused on a single alfalfa site. The eddy covariance systems included two systems for direct measurement of latent heat flux: one using a separate sonic anemometer and an open-path infrared gas analyzer, and another using a combined system (Campbell Scientific IRGASON). For these methods, eddy covariance was used with measurements from the Campbell Scientific CSAT3, the LI-COR 7500a, the Campbell Scientific IRGASON, and an additional R.M. Young sonic anemometer. In addition to those direct measures, the surface renewal approach included several energy-balance residual methods in which net radiation, ground heat flux, and sensible heat flux (H) were measured. H was measured using several systems and methods, including multiple fast-response thermocouple measurements and the temperatures measured by the sonic anemometers. The energy available for ET was then calculated as the residual of the surface energy balance equation. Differences in ET values were analyzed between the eddy covariance and surface renewal methods, using the IRGASON-derived values of ET as the standard for accuracy.
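
The energy-balance residual step described above (latent heat flux as what remains of net radiation after subtracting ground and sensible heat fluxes) can be sketched as follows; the flux values and the latent-heat constant are illustrative assumptions, not data from the study:

```python
# Latent heat flux as the residual of the surface energy balance,
#   LE = Rn - G - H,
# then converted to an equivalent ET depth. All fluxes in W/m^2.

LAMBDA_V = 2.45e6  # latent heat of vaporization, J/kg (approx., ~20 C)
RHO_W = 1000.0     # density of water, kg/m^3

def et_mm_per_day(rn, g, h):
    """ET rate (mm/day) implied by the energy-balance residual LE = Rn - G - H."""
    le = rn - g - h                 # latent heat flux, W/m^2
    kg_per_m2_s = le / LAMBDA_V     # evaporated water mass flux, kg/m^2/s
    m_per_s = kg_per_m2_s / RHO_W   # equivalent water depth per second
    return m_per_s * 86400 * 1000   # seconds/day and m -> mm

# Illustrative midday fluxes: Rn = 500, G = 50, H = 150 W/m^2 -> LE = 300 W/m^2
print(round(et_mm_per_day(500, 50, 150), 1))  # 10.6
```

Note that this expresses an instantaneous flux as a per-day rate; in practice the residual is integrated over the averaging period of the flux measurements.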

  18. Application of the dynamic control rod reactivity measurement method to Korea standard nuclear power plants

    International Nuclear Information System (INIS)

    Lee, E. K.; Shin, H. C.; Bae, S. M.; Lee, Y. G.

    2004-01-01

    To measure and validate the worth of control banks and shutdown banks, the dynamic control rod reactivity measurement (DCRM) technique has been developed and applied to six Low Power Physics Tests of PWRs, including the Korea Standard Nuclear Power plant (KSNP) based on the CE System 80 NSSS. Using the DORT results for each of the two ex-core detector responses and three-dimensional core transient simulations of rod movements, the key parameters of the DCRM method were determined for implementation in the Direct Digital Reactivity Computer System (DDRCS). A total of 9 bank worths at two KSNP plants were measured and compared with the worths obtained by the conventional rod worth measurement method. The results show that the average error of the DCRM method is nearly the same as that of the conventional Rod Swap and Boron Dilution methods, but with a lower standard deviation. It takes about twenty minutes from the beginning of rod movement to the final estimation of the integral static worth of a control bank. (authors)

  19. Clarifying the use of aggregated exposures in multilevel models: self-included vs. self-excluded measures.

    Directory of Open Access Journals (Sweden)

    Etsuji Suzuki

    Full Text Available Multilevel analyses are ideally suited to assess the effects of ecological (higher level) and individual (lower level) exposure variables simultaneously. In applying such analyses to measures of ecologies in epidemiological studies, individual variables are usually aggregated into the higher level unit. Typically, the aggregated measure includes responses of every individual belonging to that group (i.e. it constitutes a self-included measure). More recently, researchers have developed an aggregate measure which excludes the response of the individual to whom the aggregate measure is linked (i.e. a self-excluded measure). In this study, we clarify the substantive and technical properties of these two measures when they are used as exposures in multilevel models. Although the differences between the two aggregated measures are mathematically subtle, distinguishing between them is important in terms of the specific scientific questions to be addressed. We then show how these measures can be used in two distinct types of multilevel models, a self-included model and a self-excluded model, and interpret the parameters in each model by imposing hypothetical interventions. The concept is tested on empirical data of workplace social capital and employees' systolic blood pressure. Researchers assume group-level interventions when using a self-included model, and individual-level interventions when using a self-excluded model. Analytical re-parameterizations of these two models highlight their differences in parameter interpretation. Cluster-mean centered self-included models enable researchers to decompose the collective effect into its within- and between-group components. The benefit of the cluster-mean centering procedure is further discussed in terms of hypothetical interventions. When investigating the potential roles of aggregated variables, researchers should carefully explore which type of model, self-included or self-excluded, is suitable for a given situation.

  20. Clarifying the use of aggregated exposures in multilevel models: self-included vs. self-excluded measures.

    Science.gov (United States)

    Suzuki, Etsuji; Yamamoto, Eiji; Takao, Soshi; Kawachi, Ichiro; Subramanian, S V

    2012-01-01

    Multilevel analyses are ideally suited to assess the effects of ecological (higher level) and individual (lower level) exposure variables simultaneously. In applying such analyses to measures of ecologies in epidemiological studies, individual variables are usually aggregated into the higher level unit. Typically, the aggregated measure includes responses of every individual belonging to that group (i.e. it constitutes a self-included measure). More recently, researchers have developed an aggregate measure which excludes the response of the individual to whom the aggregate measure is linked (i.e. a self-excluded measure). In this study, we clarify the substantive and technical properties of these two measures when they are used as exposures in multilevel models. Although the differences between the two aggregated measures are mathematically subtle, distinguishing between them is important in terms of the specific scientific questions to be addressed. We then show how these measures can be used in two distinct types of multilevel models-self-included model and self-excluded model-and interpret the parameters in each model by imposing hypothetical interventions. The concept is tested on empirical data of workplace social capital and employees' systolic blood pressure. Researchers assume group-level interventions when using a self-included model, and individual-level interventions when using a self-excluded model. Analytical re-parameterizations of these two models highlight their differences in parameter interpretation. Cluster-mean centered self-included models enable researchers to decompose the collective effect into its within- and between-group components. The benefit of cluster-mean centering procedure is further discussed in terms of hypothetical interventions. When investigating the potential roles of aggregated variables, researchers should carefully explore which type of model-self-included or self-excluded-is suitable for a given situation, particularly
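
The two aggregated measures discussed in the record above can be sketched with a few lines of code; the scores and group labels are invented for illustration, and the self-excluded mean uses the standard leave-one-out formula:

```python
import numpy as np

def aggregate_measures(x, groups):
    """Self-included and self-excluded (leave-one-out) group means.

    x: individual responses; groups: group label per individual.
    The self-excluded mean for person i is (group sum - x_i) / (n_g - 1),
    so each group needs at least two members.
    """
    x = np.asarray(x, dtype=float)
    groups = np.asarray(groups)
    self_inc = np.empty_like(x)
    self_exc = np.empty_like(x)
    for g in np.unique(groups):
        idx = groups == g
        n, total = idx.sum(), x[idx].sum()
        self_inc[idx] = total / n                  # same value for all members
        self_exc[idx] = (total - x[idx]) / (n - 1) # varies across members
    return self_inc, self_exc

# Hypothetical workplace social capital scores in two workplaces
x = [1.0, 2.0, 3.0, 4.0, 6.0]
groups = ["A", "A", "A", "B", "B"]
inc, exc = aggregate_measures(x, groups)
print(inc.tolist())  # [2.0, 2.0, 2.0, 5.0, 5.0]
print(exc.tolist())  # [2.5, 2.0, 1.5, 6.0, 4.0]

# Cluster-mean centering: the within-group deviation used to decompose
# the collective effect into within- and between-group components
print((np.asarray(x) - inc).tolist())  # [-1.0, 0.0, 1.0, -1.0, 1.0]
```

In a multilevel model, the group mean (self-included or self-excluded) enters as the higher-level exposure, while the centered deviation carries the within-group component.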

  1. Standard test method for measurement of fatigue crack growth rates

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2015-01-01

    1.1 This test method covers the determination of fatigue crack growth rates from near-threshold to Kmax controlled instability. Results are expressed in terms of the crack-tip stress-intensity factor range (ΔK), defined by the theory of linear elasticity. 1.2 Several different test procedures are provided, the optimum test procedure being primarily dependent on the magnitude of the fatigue crack growth rate to be measured. 1.3 Materials that can be tested by this test method are not limited by thickness or by strength so long as specimens are of sufficient thickness to preclude buckling and of sufficient planar size to remain predominantly elastic during testing. 1.4 A range of specimen sizes with proportional planar dimensions is provided, but size is variable to be adjusted for yield strength and applied force. Specimen thickness may be varied independent of planar size. 1.5 The details of the various specimens and test configurations are shown in Annex A1-Annex A3. Specimen configurations other than t...

  2. The misinterpretation of the standard error of measurement in medical education: a primer on the problems, pitfalls and peculiarities of the three different standard errors of measurement.

    Science.gov (United States)

    McManus, I C

    2012-01-01

    In high-stakes assessments in medical education, such as final undergraduate examinations and postgraduate assessments, an attempt is frequently made to set confidence limits on the probable true score of a candidate. Typically, this is carried out using what is referred to as the standard error of measurement (SEM). However, it is often the case that the wrong formula is applied, there actually being three different formulae for use in different situations. The aim here is to explain and clarify the calculation of the SEM, and to differentiate three separate standard errors, which here are called the standard error of measurement (SEmeas), the standard error of estimation (SEest) and the standard error of prediction (SEpred). Most accounts describe the calculation of SEmeas. For most purposes, though, what is required is the standard error of estimation (SEest), which has to be applied not to a candidate's actual score but to their estimated true score after taking into account the regression to the mean that occurs due to the unreliability of an assessment. A third formula, the standard error of prediction (SEpred) is less commonly used in medical education, but is useful in situations such as counselling, where one needs to predict a future actual score on an examination from a previous actual score on the same examination. The various formulae can produce predictions that differ quite substantially, particularly when reliability is not particularly high, and the mark in question is far removed from the average performance of candidates. That can have important, unintended consequences, particularly in a medico-legal context.
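
The three standard errors distinguished above have standard closed forms in classical test theory; the sketch below uses those textbook formulae with invented exam statistics (score SD `sd`, reliability `r`), so the numbers are illustrative only:

```python
import math

# Classical test theory forms of the three standard errors:
#   SEmeas = sd * sqrt(1 - r)          error band around a true score
#   SEest  = sd * sqrt(r * (1 - r))    error of the estimated true score
#   SEpred = sd * sqrt(1 - r**2)       error predicting a future actual score

def se_meas(sd, r):
    return sd * math.sqrt(1 - r)

def se_est(sd, r):
    return sd * math.sqrt(r * (1 - r))

def se_pred(sd, r):
    return sd * math.sqrt(1 - r ** 2)

def estimated_true_score(x, mean, r):
    """Regression to the mean: shrink the observed score toward the cohort mean."""
    return mean + r * (x - mean)

# Hypothetical exam: mean 60, SD 10, reliability 0.8; a candidate scores 75
sd, r = 10.0, 0.8
print(round(se_meas(sd, r), 2))               # 4.47
print(round(se_est(sd, r), 2))                # 4.0
print(round(se_pred(sd, r), 2))               # 6.0
print(round(estimated_true_score(75, 60, r), 2))  # 72.0
```

As the abstract notes, the three values diverge most when reliability is modest and the score is far from the cohort mean, and confidence limits must be centred on the estimated true score (here 72), not the observed 75.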

  3. Standard Test Method for Measuring Optical Angular Deviation of Transparent Parts Using the Double-Exposure Method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers the measurement of the optical angular deviation of a light ray imposed by flat transparent parts such as a commercial or military aircraft windshield, canopy or cabin window. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.2.1 Exceptions—The values given in parentheses are for information only. Also, print size is provided in inch-pound measurements. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  4. Standards for Clinical Trials in Male and Female Sexual Dysfunction: II. Patient-Reported Outcome Measures.

    Science.gov (United States)

    Fisher, William A; Gruenwald, Ilan; Jannini, Emmanuele A; Lev-Sagie, Ahinoam; Lowenstein, Lior; Pyke, Robert E; Reisman, Yakov; Revicki, Dennis A; Rubio-Aurioles, Eusebio

    2016-12-01

    The second article in this series, Standards for Clinical Trials in Male and Female Sexual Dysfunction, focuses on measurement of patient-reported outcomes (PROs). Together with the design of appropriate phase I to phase IV clinical trials, the development, validation, choice, and implementation of valid PRO measurements-the focus of the present article-form the foundation of research on treatments for male and female sexual dysfunctions. PRO measurements are assessments of any aspect of a patient's health status that come directly from the patient (ie, without the interpretation of the patient's responses by a physician or anyone else). PROs are essential for assessing male and female sexual dysfunction and treatment response, including symptom frequency and severity, personal distress, satisfaction, and other measurements of sexual and general health-related quality of life. Although there are some relatively objective measurements of sexual dysfunction (ie, intravaginal ejaculatory latency time, frequency of sexual activity, etc), these measurements do not comprehensively assess the occurrence and extent of sexual dysfunction or treatment on the patient's symptoms, functioning, and well-being. Data generated by a PRO instrument can provide evidence of a treatment benefit from the patient's perspective. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  5. Determination of soil properties from standard penetration test complemented by torque measurement (SPT-T

    Directory of Open Access Journals (Sweden)

    Anna S. P. Peixoto

    2014-09-01

    Full Text Available The major challenge in geotechnical work is to ensure that no settlements occur during the life cycle of the construction. This involves proper design of foundations and their bearing capacity. The Brazilian standard for the design and execution of foundations, ABNT (2010) NBR 6122, requires the use of field tests when designing building foundations. The Standard Penetration Test (SPT), ABNT (2001) NBR 6484, is still the most common in-situ test for these purposes. Ranzini (1988) suggested supplementing the conventional SPT with a measurement of the torque (SPT-T) required to turn the split spoon after driving, in order to provide a 'static' component to a 'dynamic' test. The adhesion between the soil and the sampler, obtained by the torque measurement, can be used to calculate the lateral skin friction of piles. This paper describes the SPT-T procedure, including both the supplementary equipment and practical aspects. It also presents an accurate torque measurement, a prediction method that uses the SPT-T test to calculate the bearing capacity of piles in building foundations, and a comparison of the estimated bearing capacities with instrumented load tests in order to validate the method.

  6. Standard and routine metabolic rates of juvenile sandbar sharks (Carcharhinus plumbeus), including the effects of body mass and acute temperature change

    OpenAIRE

    Dowd, William Wesley; Brill, R W; Bushnell, P G; Musick, J A

    2006-01-01

    Standard and routine metabolic rates (SMRs and RMRs, respectively) of juvenile sandbar sharks (Carcharhinus plumbeus) were measured over a range of body sizes (n=34) and temperatures normally associated with western Atlantic coastal nursery areas. The mean SMR Q(10) (increase in metabolic rate with temperature) was 2.9 +/- 0.2. Heart rate decreased with increasing body mass but increased with temperature at a Q(10) of 1.8-2.2. Self-paired measures of SMR and RMR were obtained for 15 individua...
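
The Q(10) values reported above follow the standard temperature-coefficient relation, the factor by which a rate increases over a 10 °C rise. A minimal sketch (function and variable names are illustrative, not taken from the study):

```python
def q10(rate1, rate2, t1, t2):
    """Temperature coefficient Q10: the factor by which a physiological rate
    increases per 10 degree C rise, computed from rates measured at two
    temperatures t1 and t2 (degrees C, with t2 > t1)."""
    return (rate2 / rate1) ** (10.0 / (t2 - t1))

# A rate doubling over a 10 degree C step (e.g. 24 -> 34 degrees C) gives Q10 = 2.0
print(q10(100.0, 200.0, 24.0, 34.0))
```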

  7. The Frontlines of Medicine Project: a proposal for the standardized communication of emergency department data for public health uses including syndromic surveillance for biological and chemical terrorism.

    Science.gov (United States)

    Barthell, Edward N; Cordell, William H; Moorhead, John C; Handler, Jonathan; Feied, Craig; Smith, Mark S; Cochrane, Dennis G; Felton, Christopher W; Collins, Michael A

    2002-04-01

    The Frontlines of Medicine Project is a collaborative effort of emergency medicine (including emergency medical services and clinical toxicology), public health, emergency government, law enforcement, and informatics. This collaboration proposes to develop a nonproprietary, "open systems" approach for reporting emergency department patient data. The common element is a standard approach to sending messages from individual EDs to regional oversight entities that could then analyze the data received. ED encounter data could be used for various public health initiatives, including syndromic surveillance for chemical and biological terrorism. The interlinking of these regional systems could also permit public health surveillance at a national level based on ED patient encounter data. Advancements in the Internet and Web-based technologies could allow the deployment of these standardized tools in a rapid time frame.

  8. The Czech national long distances measuring standard Koštice - State of play

    Directory of Open Access Journals (Sweden)

    Ladislav Červinka

    2009-11-01

    Full Text Available This article gives information about the new Czech national standard for long-distance measurement, which has been prepared at the distance base near the Koštice village. The submitter of the project is the Czech Office for Standards, Metrology and Testing. Research and document preparation for the creation of the measuring standard were ensured by the Research Institute of Geodesy, Topography and Cartography. Interlaboratory comparisons were made by staff of the Bundeswehr University in Munich. The paper reports on the work that will be carried out on the national standard in the second half of this year; the purpose of this work is to improve the accuracy characteristics of the national etalon.

  9. Research standardization tools: pregnancy measures in the PhenX Toolkit.

    Science.gov (United States)

    Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M

    2017-09-01

    Only through concerted and well-executed research endeavors can we gain the requisite knowledge to advance pregnancy care and have a positive impact on maternal and newborn health. Yet the heterogeneity inherent in individual studies limits our ability to compare and synthesize study results, thus impeding the capacity to draw meaningful conclusions that can be trusted to inform clinical care. The PhenX Toolkit (http://www.phenxtoolkit.org), supported since 2007 by the National Institutes of Health, is a web-based catalog of standardized protocols for measuring phenotypes and exposures relevant for clinical research. In 2016, a working group of pregnancy experts recommended 15 measures for the PhenX Toolkit that are highly relevant to pregnancy research. The working group followed the established PhenX consensus process to recommend protocols that are broadly validated, well established, nonproprietary, and have a relatively low burden for investigators and participants. The working group considered input from the pregnancy experts and the broader research community and included measures addressing the mode of conception, gestational age, fetal growth assessment, prenatal care, the mode of delivery, gestational diabetes, behavioral and mental health, and environmental exposure biomarkers. These pregnancy measures complement the existing measures for other established domains in the PhenX Toolkit, including reproductive health, anthropometrics, demographic characteristics, and alcohol, tobacco, and other substances. The preceding domains influence a woman's health during pregnancy. For each measure, the PhenX Toolkit includes data dictionaries and data collection worksheets that facilitate incorporation of the protocol into new or existing studies. The measures within the pregnancy domain offer a valuable resource to investigators and clinicians and are well poised to facilitate collaborative pregnancy research with the goal to improve patient care. 

  10. Technical support document: Energy efficiency standards for consumer products: Refrigerators, refrigerator-freezers, and freezers including draft environmental assessment, regulatory impact analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended by the National Appliance Energy Conservation Act of 1987 (P.L. 100-12), by the National Appliance Energy Conservation Amendments of 1988 (P.L. 100-357), and by the Energy Policy Act of 1992 (P.L. 102-486), provides energy conservation standards for 12 of the 13 types of consumer products covered by the Act, and authorizes the Secretary of Energy to prescribe amended or new energy standards for each type (or class) of covered product. The assessment of the proposed standards for refrigerators, refrigerator-freezers, and freezers presented in this document is designed to evaluate their economic impacts according to the criteria in the Act. It includes an engineering analysis of the cost and performance of design options to improve the efficiency of the products; forecasts of the number and average efficiency of products sold, the amount of energy the products will consume, and their prices and operating expenses; a determination of change in investment, revenues, and costs to manufacturers of the products; a calculation of the costs and benefits to consumers, electric utilities, and the nation as a whole; and an assessment of the environmental impacts of the proposed standards.

  11. Solution standards for quality control of nuclear-material analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.

    1981-01-01

    Analytical chemistry measurement control depends upon reliable solution standards. At the Savannah River Plant Control Laboratory over a thousand analytical measurements are made daily for process control, product specification, accountability, and nuclear safety. Large quantities of solution standards are required for a measurement quality control program covering the many different analytical chemistry methods. Savannah River Plant produced uranium, plutonium, neptunium, and americium metals or oxides are dissolved to prepare stock solutions for working or Quality Control Standards (QCS). Because extensive analytical effort is required to characterize or confirm these solutions, they are prepared in large quantities. These stock solutions are diluted and blended with different chemicals and/or each other to synthesize QCS that match the matrices of different process streams. The target uncertainty of a standard's reference value is 10% of the limit of error of the methods used for routine measurements. Standard Reference Materials from NBS are used according to special procedures to calibrate the methods used in measuring the uranium and plutonium standards so traceability can be established. Special precautions are required to minimize the effects of temperature, radiolysis, and evaporation. Standard reference values are periodically corrected to eliminate systematic errors caused by evaporation or decay products. Measurement control is achieved by requiring analysts to analyze a blind QCS during each shift in which a measurement system is used on plant samples. Computer evaluation determines whether or not a measurement is within the ±3σ control limits. Monthly evaluations of the QCS measurements are made to determine current bias correction factors for accountability measurements and to detect significant changes in the bias and precision statistics. The evaluations are also used to plan activities for improving the reliability of the analytical chemistry measurements.
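
The shift-wise control decision and monthly bias evaluation described above can be sketched as follows; the function names and the simple mean-based bias factor are illustrative assumptions, not the plant's actual procedure:

```python
def in_control(measured, reference_value, sigma):
    """Shewhart-style check: a blind QCS measurement is accepted when it
    falls within +/- 3 standard deviations of the standard's reference value."""
    return abs(measured - reference_value) <= 3.0 * sigma

def bias_correction_factor(qcs_measurements, reference_value):
    """Monthly bias factor: the reference value divided by the mean of the
    month's blind QCS results, applied to accountability measurements."""
    mean = sum(qcs_measurements) / len(qcs_measurements)
    return reference_value / mean
```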

  12. Standards on Noise Measurements, Rating Schemes, and Definitions: A Compilation. NBS Special Publication 386, 1976 Edition.

    Science.gov (United States)

    Quindry, Thomas L., Ed.

    This compilation deals with material assembled from the various standards set forth by industrial and trade organizations, or technical and scientific societies concerned with acoustics. There has been no attempt to review or evaluate the standards, but rather just to list documents covering measurement techniques, calibration methods,…

  13. 38 CFR 21.7670 - Measurement of courses leading to a standard, undergraduate college degree.

    Science.gov (United States)

    2010-07-01

    ... leading to a standard, undergraduate college degree. 21.7670 Section 21.7670 Pensions, Bonuses, and... leading to a standard, undergraduate college degree. Except as provided in § 21.7672, VA will measure a...) Other requirements. Notwithstanding any other provision of this section, in administering benefits...

  14. A novel framework for validating and applying standardized small area measurement strategies

    Directory of Open Access Journals (Sweden)

    Murray Christopher JL

    2010-09-01

    Full Text Available Abstract Background Local measurements of health behaviors, diseases, and use of health services are critical inputs into local, state, and national decision-making. Small area measurement methods can deliver more precise and accurate local-level information than direct estimates from surveys or administrative records, where sample sizes are often too small to yield acceptable standard errors. However, small area measurement requires careful validation using approaches other than conventional statistical methods such as in-sample or cross-validation methods, because they do not solve the problem of validating estimates in data-sparse domains. Methods A new general framework for small area estimation and validation is developed and applied to estimate Type 2 diabetes prevalence in US counties using data from the Behavioral Risk Factor Surveillance System (BRFSS). The framework combines the three conventional approaches to small area measurement: (1) pooling data across time by combining multiple survey years; (2) exploiting spatial correlation by including a spatial component; and (3) utilizing structured relationships between the outcome variable and domain-specific covariates to define four increasingly complex model types, coined the Naive, Geospatial, Covariate, and Full models. The validation framework uses direct estimates of prevalence in large domains as the gold standard and compares model estimates against it using (i) all available observations for the large domains and (ii) systematically reduced sample sizes obtained through random sampling with replacement. At each sampling level, the model is rerun repeatedly, and the validity of the model estimates from the four model types is then determined by calculating the average concordance correlation coefficient (CCC) and average root mean squared error (RMSE) against the gold standard. The CCC is closely related to the intraclass correlation coefficient and can be used when the units are
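
The concordance correlation coefficient used in the validation step can be computed directly from its moment definition (Lin's CCC). A minimal sketch using population moments; the function and variable names are illustrative:

```python
def ccc(x, y):
    """Lin's concordance correlation coefficient between gold-standard
    values x and model estimates y. Equals 1 only for perfect agreement;
    penalizes both poor correlation and location/scale shifts."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)
```

Note that a constant offset lowers the CCC even when the Pearson correlation is perfect, which is what makes it suitable for validating estimates against a gold standard.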

  15. Best practices in selecting performance measures and standards for effective asset management.

    Science.gov (United States)

    2011-06-01

    "This report assesses and provides guidance on best practices in performance measurement, management and standards : setting for effective Transportation Asset Management (TAM). The study is conducted through a literature review, a : survey of the 50...

  16. Measuring the activity of BioBrick promoters using an in vivo reference standard

    Directory of Open Access Journals (Sweden)

    Kelly Jason R

    2009-03-01

    Full Text Available Abstract Background The engineering of many-component, synthetic biological systems is being made easier by the development of collections of reusable, standard biological parts. However, the complexity of biology makes it difficult to predict the extent to which such efforts will succeed. As a first practical example, the Registry of Standard Biological Parts started at MIT now maintains and distributes thousands of BioBrick™ standard biological parts. However, BioBrick parts are only standardized in terms of how individual parts are physically assembled into multi-component systems, and most parts remain uncharacterized. Standardized tools, techniques, and units of measurement are needed to facilitate the characterization and reuse of parts by independent researchers across many laboratories. Results We found that the absolute activity of BioBrick promoters varies across experimental conditions and measurement instruments. We chose one promoter (BBa_J23101) to serve as an in vivo reference standard for promoter activity. We demonstrated that, by measuring the activity of promoters relative to BBa_J23101, we could reduce variation in reported promoter activity due to differences in test conditions and measurement instruments by ~50%. We defined a Relative Promoter Unit (RPU) in order to report promoter characterization data in compatible units and developed a measurement kit so that researchers might more easily adopt RPU as a standard unit for reporting promoter activity. We distributed a set of test promoters to multiple labs and found good agreement in the reported relative activities of promoters so measured. We also characterized the relative activities of a reference collection of BioBrick promoters in order to further support adoption of RPU-based measurement standards. Conclusion Relative activity measurements based on an in vivo reference standard enable improved measurement of promoter activity given variation in measurement
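
The RPU conversion itself is a simple ratio against the reference promoter measured under the same conditions on the same instrument, which is how instrument- and condition-specific factors cancel. A hedged sketch (names illustrative):

```python
def relative_promoter_units(test_activity, reference_activity):
    """Promoter activity in Relative Promoter Units (RPU): the absolute
    activity of the test promoter divided by that of the in vivo reference
    standard (BBa_J23101) measured under identical conditions, so that
    shared instrument and condition factors cancel out of the ratio."""
    return test_activity / reference_activity

# A promoter twice as active as the reference measures 2.0 RPU
print(relative_promoter_units(300.0, 150.0))
```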

  17. Design Of Measurements For Evaluating Readiness Of Technoware Components To Meet The Required Standard Of Products

    Science.gov (United States)

    Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad

    2018-03-01

    Although government is able to make mandatory standards that must be obeyed by the industry, the respective industries themselves often have difficulties to fulfil the requirements described in those standards. This is especially true in many small and medium sized enterprises that lack the required capital to invest in standard-compliant equipment and machineries. This study aims to develop a set of measurement tools for evaluating the level of readiness of production technology with respect to the requirements of a product standard based on the quality function deployment (QFD) method. By combining the QFD methodology, UNESCAP Technometric model [9] and Analytic Hierarchy Process (AHP), this model is used to measure a firm’s capability to fulfill government standard in the toy making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes are collected and processed to find out the technological capabilities that should be improved by the firm to fulfill the existing standard. This study showed that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of technoware technological component in a particular firm.

  18. Development of a Primary Standard for Calibration of [18F]FDG Activity Measurement Systems

    International Nuclear Information System (INIS)

    Capogni, M; Felice, P De; Fazio, A; Simonelli, F; Abbas, K

    2006-01-01

    The 18F national primary standard was developed by the INMRI-ENEA using the 4πβ Liquid Scintillation Spectrometry Method with 3H-Standard Efficiency Tracing. Measurements were performed at JRC-Ispra under a scientific collaboration between the Institute for Health and Consumer Protection, Amersham Health, and the National Institute for Occupational Safety and Prevention (ISPESL). The goal of the work was to calibrate, with minimum uncertainty, the INMRI-ENEA transfer standard portable well-type ionisation chamber, as well as other JRC-Ispra and Amersham Health reference ionisation chambers used for FDG activity measurement.

  19. Development of a Primary Standard for Calibration of [{sup 18}F]FDG Activity Measurement Systems

    Energy Technology Data Exchange (ETDEWEB)

    Capogni, M [ENEA Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (INMRI), Centro Ricerche Casaccia, I-00060 Rome (Italy); Felice, P De [ENEA Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (INMRI), Centro Ricerche Casaccia, I-00060 Rome (Italy); Fazio, A [ENEA Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (INMRI), Centro Ricerche Casaccia, I-00060 Rome (Italy); Simonelli, F [Institute for Health and Consumer Protection, Joint Research Centre (JRC), European Commission, I-21020 Ispra (Italy); D'Ursi, V [Amersham Health Srl (AH), I-13040 Saluggia (Italy); Pecorale, A [Amersham Health Srl (AH), I-13040 Saluggia (Italy); Giliberti, C [Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro (ISPESL), I-00184 Rome (Italy); Abbas, K [Institute for Health and Consumer Protection, Joint Research Centre (JRC), European Commission, I-21020 Ispra (Italy)

    2006-05-15

    The 18F national primary standard was developed by the INMRI-ENEA using the 4πβ Liquid Scintillation Spectrometry Method with 3H-Standard Efficiency Tracing. Measurements were performed at JRC-Ispra under a scientific collaboration between the Institute for Health and Consumer Protection, Amersham Health, and the National Institute for Occupational Safety and Prevention (ISPESL). The goal of the work was to calibrate, with minimum uncertainty, the INMRI-ENEA transfer standard portable well-type ionisation chamber, as well as other JRC-Ispra and Amersham Health reference ionisation chambers used for FDG activity measurement.

  20. Secondary standards (non-activation) for neutron data measurements above 20 MeV

    International Nuclear Information System (INIS)

    Haight, R.C.

    1991-01-01

    In addition to H(n,p) scattering and 235,238 U(n,f) reactions, secondary standards for neutron flux determination may be useful for neutron energies above 20 MeV. For experiments where gamma rays are detected, reference gamma-ray production cross sections are relevant. For neutron-induced charged particle production, standard (n,p) and (n,alpha) cross sections would be helpful. Total cross section standards would serve to check the accuracy of these measurements. These secondary standards are desirable because they can be used with the same detector systems employed in measuring the quantities of interest. Uncertainties due to detector efficiency, geometrical effects, timing and length of flight paths can therefore be significantly reduced. Several secondary standards that do not depend on activation techniques are proposed. 14 refs

  1. Associations Between Physician Empathy, Physician Characteristics, and Standardized Measures of Patient Experience.

    Science.gov (United States)

    Chaitoff, Alexander; Sun, Bob; Windover, Amy; Bokar, Daniel; Featherall, Joseph; Rothberg, Michael B; Misra-Hebert, Anita D

    2017-10-01

    To identify correlates of physician empathy and determine whether physician empathy is related to standardized measures of patient experience. Demographic, professional, and empathy data were collected during 2013-2015 from Cleveland Clinic Health System physicians prior to participation in mandatory communication skills training. Empathy was assessed using the Jefferson Scale of Empathy. Data were also collected for seven measures (six provider communication items and overall provider rating) from the visit-specific and 12-month Consumer Assessment of Healthcare Providers and Systems Clinician and Group (CG-CAHPS) surveys. Associations between empathy and provider characteristics were assessed by linear regression, ANOVA, or a nonparametric equivalent. Significant predictors were included in a multivariable linear regression model. Correlations between empathy and CG-CAHPS scores were assessed using Spearman rank correlation coefficients. In bivariable analysis (n = 847 physicians), female sex was associated with higher empathy scores, and this association persisted in multivariable analysis. Of the seven CG-CAHPS measures, scores on five for the 583 physicians with visit-specific data and on three for the 277 physicians with 12-month data were positively correlated with empathy. Specialty and sex were independently associated with physician empathy. Empathy was correlated with higher scores on multiple CG-CAHPS items, suggesting that improving physician empathy might play a role in improving patient experience.

  2. Standard Test Method for Measuring Heat Flux Using Surface-Mounted One-Dimensional Flat Gages

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

    1.1 This test method describes the measurement of the net heat flux normal to a surface using flat gages mounted onto the surface. Conduction heat flux is not the focus of this standard. Conduction applications related to insulation materials are covered by Test Method C 518 and Practices C 1041 and C 1046. The sensors covered by this test method all use a measurement of the temperature difference between two parallel planes normal to the surface to determine the heat that is exchanged to or from the surface in keeping with Fourier’s Law. The gages operate by the same principles for heat transfer in either direction. 1.2 This test method is quite broad in its field of application, size and construction. Different sensor types are described in detail in later sections as examples of the general method for measuring heat flux from the temperature gradient normal to a surface (1). Applications include both radiation and convection heat transfer. The gages have broad application from aerospace to biomedical en...
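
The gage principle described above, Fourier's law applied across two parallel temperature-sensing planes normal to the surface, reduces to a one-line relation. A minimal sketch with illustrative names (the real gages fold the layer properties into a calibration constant):

```python
def heat_flux(k, t_hot, t_cold, thickness):
    """One-dimensional Fourier's law across the gage's sensing layer:
    q = k * (T_hot - T_cold) / d, giving W/m^2 for SI inputs
    (k in W/(m*K), temperatures in K or degrees C, thickness in m)."""
    return k * (t_hot - t_cold) / thickness

# A 5 K drop across a 1 mm layer with k = 0.2 W/(m*K) gives 1000 W/m^2
print(heat_flux(0.2, 305.0, 300.0, 0.001))
```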

  3. Standard test method for measurement of oxidation-reduction potential (ORP) of soil

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

    1.1 This test method covers a procedure and related test equipment for measuring oxidation-reduction potential (ORP) of soil samples removed from the ground. 1.2 The procedure in Section 9 is appropriate for field and laboratory measurements. 1.3 Accurate measurement of oxidation-reduction potential aids in the analysis of soil corrosivity and its impact on buried metallic structure corrosion rates. 1.4 The values stated in inch-pound units are to be regarded as standard. The values given in parentheses are mathematical conversions to SI units that are provided for information only and are not considered standard. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  4. Development of Standardized Mobile Tracer Correlation Approach for Large Area Emission Measurements (DRAFT UNDER EPA REVIEW)

    Science.gov (United States)

    Foster-wittig, T. A.; Thoma, E.; Green, R.; Hater, G.; Swan, N.; Chanton, J.

    2013-12-01

    Improved understanding of air emissions from large area sources such as landfills, waste water ponds, open-source processing, and agricultural operations is a topic of increasing environmental importance. In many cases, the size of the area source, coupled with spatial heterogeneity, makes direct (on-site) emission assessment difficult; methane emissions from landfills, for example, can be particularly complex [Thoma et al, 2009]. Recently, whole-facility (remote) measurement approaches based on tracer correlation have been utilized [Scheutz et al, 2011]. The approach uses a mobile platform to simultaneously measure a metered release of a conservative gas (the tracer) along with the target compound (methane in the case of landfills). The known-rate tracer release provides a measure of atmospheric dispersion at the downwind observing location, allowing the area source emission to be determined by a ratio calculation [Green et al, 2010]. Although powerful in concept, the approach has been somewhat limited to research applications due to the complexities and cost of the high-sensitivity measurement equipment required to quantify the part-per-billion levels of tracer and target gas at kilometer-scale distances. The advent of compact, robust, and easy-to-use near-infrared optical measurement systems (such as cavity ring-down spectroscopy) allows the tracer correlation approach to be investigated for wider use. Over the last several years, Waste Management Inc., the U.S. EPA, and collaborators have conducted method evaluation activities to determine the viability of a standardized approach through execution of a large number of field measurement trials at U.S. landfills. As opposed to previous studies [Scheutz et al, 2011] conducted at night (optimal plume transport conditions), the current work evaluated realistic use scenarios; these scenarios include execution by non-scientist personnel, daylight operation, and the full range of atmospheric conditions (all plume transport
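
The ratio calculation at the core of tracer correlation can be sketched as follows; the function and variable names are illustrative, and real analyses integrate background-corrected concentrations along the same downwind plume transect for both gases:

```python
def area_source_emission(tracer_release_rate, target_excess, tracer_excess):
    """Tracer correlation estimate of an unknown area-source emission rate:
    the known, metered tracer release rate scaled by the ratio of the
    integrated above-background (excess) concentrations of target gas and
    tracer, both measured along the same downwind transect so that
    atmospheric dispersion cancels out of the ratio."""
    return tracer_release_rate * sum(target_excess) / sum(tracer_excess)

# Target plume twice the tracer plume implies twice the tracer release rate
print(area_source_emission(1.0, [2.0, 4.0], [1.0, 2.0]))
```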

  5. Precision and accuracy, two steps towards the standardization of XRPD measurements

    Energy Technology Data Exchange (ETDEWEB)

    Berti, G. [Pisa Univ. (Italy). Dept. of Earth Sciences

    1996-09-01

    Any standardization process requires results that are comprehensible, reproducible, and traceable. Precision and accuracy of the measurements play a key role in meeting these requirements. The adoption of either physical (standard) or mathematical models allows the whole diffraction measurement process to be described with the necessary physical significance. On the other hand, the adoption of procedures that are capable of controlling the measurement process renders it reproducible and traceable. Failure to meet these requirements makes it difficult to transfer or replicate elsewhere experiences that may give even excellent results in a given laboratory.

  6. [Blood pressure measurement by primary care physicians: comparison with the standard method].

    Science.gov (United States)

    Asai, Y; Kawamoto, R; Nago, N; Kajii, E

    2000-04-01

    To examine the usual methods of blood pressure (BP) measurement by primary care physicians and to compare them with the standard methods. Cross-sectional survey by self-administered questionnaire. Primary care physicians who graduated from Jichi Medical School and were working at clinics. Each standard method for 20 items was defined as the one that was most frequently recommended by 6 guidelines (USA 3, UK 1, Canada 1, Japan 1) and a recent comprehensive review about BP measurement. Of 333 physicians, 190 (58%) responded (median age 33, range 26 to 45 years). Standard methods and percentages of physicians who follow them are: [BP measurement, 17 items] supported arm 96%; measurement to 2 mmHg 91%; sitting position 86%; mercury sphygmomanometer 83%; waiting ≥ 1 minute between readings 58%; palpation to assess systolic BP before auscultation 57%; check accuracy of home BP monitor 56%; Korotkoff Phase V for diastolic BP 51%; bilateral measurements on initial visit 44%; small cuff available 41%; ≥ 2 readings in patients with atrial fibrillation 38%; ≥ 2 readings on one visit 20%; cuff deflation rate of 2 mmHg/pulse 14%; large cuff available 13%; check accuracy of monitor used for home visits 8%; waiting time ≥ 5 minutes 3%; readings from the arm with the higher BP 1%. [Knowledge about BP monitor, 2 items] appropriate size bladder: length 11%; width 11%. [Check of sphygmomanometer for leakage, inflate to 200 mmHg then close valve for 1 minute] leakage < 2 mmHg 6%; median 10 (range 0-200) mmHg. The average percentage across all 20 items was 39%. Number of methods physicians followed as standard: median 8 (range 4 to 15); this number did not correlate with any background characteristics of the physicians. Furthermore, we also obtained information on methods not compared with the standard. Fifty-four percent of physicians used more standard methods when deciding to start or change treatment than when measuring the BP of patients with good control.
About 80% of

  7. Including indigestible carbohydrates in the evening meal of healthy subjects improves glucose tolerance, lowers inflammatory markers, and increases satiety after a subsequent standardized breakfast

    DEFF Research Database (Denmark)

    Nilsson, A.C.; Ostman, E.M.; Holst, Jens Juul

    2008-01-01

    tolerance and related variables after a subsequent standardized breakfast in healthy subjects (n = 15). At breakfast, blood was sampled for 3 h for analysis of blood glucose, serum insulin, serum FFA, serum triacylglycerides, plasma glucagon, plasma gastric-inhibitory peptide, plasma glucagon-like peptide-1...... (GLP-1), serum interleukin (IL)-6, serum IL-8, and plasma adiponectin. Satiety was subjectively rated after breakfast and the gastric emptying rate (GER) was determined using paracetamol as a marker. Breath hydrogen was measured as an indicator of colonic fermentation. Evening meals with barley kernel......-kernel bread compared with WWB. Breath hydrogen correlated positively with satiety (r = 0.27; P metabolic risk variables at breakfast...

  8. Including indigestible carbohydrates in the evening meal of healthy subjects improves glucose tolerance, lowers inflammatory markers, and increases satiety after a subsequent standardized breakfast

    DEFF Research Database (Denmark)

    Nilsson, Anne C; Ostman, Elin M; Holst, Jens Juul

    2008-01-01

    tolerance and related variables after a subsequent standardized breakfast in healthy subjects (n = 15). At breakfast, blood was sampled for 3 h for analysis of blood glucose, serum insulin, serum FFA, serum triacylglycerides, plasma glucagon, plasma gastric-inhibitory peptide, plasma glucagon-like peptide-1...... (GLP-1), serum interleukin (IL)-6, serum IL-8, and plasma adiponectin. Satiety was subjectively rated after breakfast and the gastric emptying rate (GER) was determined using paracetamol as a marker. Breath hydrogen was measured as an indicator of colonic fermentation. Evening meals with barley kernel...... based bread (ordinary, high-amylose- or beta-glucan-rich genotypes) or an evening meal with white wheat flour bread (WWB) enriched with a mixture of barley fiber and resistant starch improved glucose tolerance at the subsequent breakfast compared with unsupplemented WWB (P breakfast...

  9. Nanomechanical motion measured with an imprecision below that at the standard quantum limit.

    Science.gov (United States)

    Teufel, J D; Donner, T; Castellanos-Beltran, M A; Harlow, J W; Lehnert, K W

    2009-12-01

    Nanomechanical oscillators are at the heart of ultrasensitive detectors of force, mass and motion. As these detectors progress to even better sensitivity, they will encounter measurement limits imposed by the laws of quantum mechanics. If the imprecision of a measurement of the displacement of an oscillator is pushed below a scale set by the standard quantum limit, the measurement must perturb the motion of the oscillator by an amount larger than that scale. Here we show a displacement measurement with an imprecision below the standard quantum limit scale. We achieve this imprecision by measuring the motion of a nanomechanical oscillator with a nearly shot-noise limited microwave interferometer. As the interferometer is naturally operated at cryogenic temperatures, the thermal motion of the oscillator is minimized, yielding an excellent force detector with a sensitivity of 0.51 aN Hz(-1/2). This measurement is a critical step towards observing quantum behaviour in a mechanical object.
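The standard quantum limit scale referred to here is set by the oscillator's zero-point motion, Δx_zpf = √(ħ/2mΩ). A minimal sketch of that relation, with assumed oscillator parameters (not values from the paper):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def zero_point_fluctuation(mass_kg: float, freq_hz: float) -> float:
    """Zero-point displacement amplitude sqrt(hbar / (2*m*Omega)),
    the scale against which measurement imprecision is compared."""
    omega = 2 * math.pi * freq_hz  # angular resonance frequency
    return math.sqrt(HBAR / (2 * mass_kg * omega))

# Hypothetical nanomechanical beam: 1 pg effective mass, 1 MHz resonance
x_zpf = zero_point_fluctuation(1e-15, 1e6)  # on the order of 1e-13 m
```

Heavier or stiffer oscillators have smaller zero-point motion, which is why approaching this scale demands near shot-noise-limited readout.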

  10. Advisory Committee for the Calibration Standards of Ionizing Radiation Measurement. Section 2. Radionuclide Measurement

    International Nuclear Information System (INIS)

    1982-01-01

    Section II (Mesure des radionucleides) of the Comite Consultatif pour les Etalons de Mesure des Rayonnements Ionisants held its sixth meeting in May 1981. The results of an international comparison of 55 Fe, organized by the National Physical Laboratory, and of a trial comparison of 133 Ba were discussed. A full-scale comparison of 137 Cs activity measurements and a repetition of the 133 Ba trial comparison are to take place within the next two years. A trial comparison of 109 Cd is also proposed. Recent work in radioactivity carried out at BIPM was reported. The usefulness of the international reference system for measuring the activity of gamma-ray emitters was generally acknowledged. The new ''selective sampling'' method which avoids measuring coincidences attracted much attention. The Working Party reports and a new monograph (BIPM-3) were presented. Finally, there was a broad exchange of information on work in progress at the various laboratories represented at the meeting [fr

  11. Absolute standard hydrogen electrode potential measured by reduction of aqueous nanodrops in the gas phase.

    Science.gov (United States)

    Donald, William A; Leib, Ryan D; O'Brien, Jeremy T; Bush, Matthew F; Williams, Evan R

    2008-03-19

    In solution, half-cell potentials are measured relative to those of other half cells, thereby establishing a ladder of thermochemical values that are referenced to the standard hydrogen electrode (SHE), which is arbitrarily assigned a value of exactly 0 V. Although there has been considerable interest in, and efforts toward, establishing an absolute electrochemical half-cell potential in solution, there is no general consensus regarding the best approach to obtain this value. Here, ion-electron recombination energies resulting from electron capture by gas-phase nanodrops containing individual [M(NH3)6]3+, M = Ru, Co, Os, Cr, and Ir, and Cu2+ ions are obtained from the number of water molecules that are lost from the reduced precursors. These experimental data combined with nanodrop solvation energies estimated from Born theory and solution-phase entropies estimated from limited experimental data provide absolute reduction energies for these redox couples in bulk aqueous solution. A key advantage of this approach is that solvent effects well past two solvent shells, that are difficult to model accurately, are included in these experimental measurements. By evaluating these data relative to known solution-phase reduction potentials, an absolute value for the SHE of 4.2 +/- 0.4 V versus a free electron is obtained. Although not achieved here, the uncertainty of this method could potentially be reduced to below 0.1 V, making this an attractive method for establishing an absolute electrochemical scale that bridges solution and gas-phase redox chemistry.
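The Born-theory estimate of solvation energy used to extrapolate from nanodrops to bulk can be sketched as below; the ion charge and cavity radius are hypothetical illustration values, and the continuum Born model is only an approximation of real solvation:

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge, C
EPS0 = 8.8541878128e-12     # vacuum permittivity, F/m
N_A = 6.02214076e23         # Avogadro constant, 1/mol

def born_solvation_energy(z: int, radius_m: float, eps_r: float = 78.4) -> float:
    """Born-model free energy (J/mol) of transferring a point charge z*e
    in a spherical cavity of the given radius into a dielectric continuum
    (eps_r ~ 78.4 for bulk water at room temperature)."""
    per_ion = -(z**2 * E_CHARGE**2) / (8 * math.pi * EPS0 * radius_m) * (1 - 1 / eps_r)
    return per_ion * N_A

# Hypothetical 3+ complex ion with a 0.5 nm effective cavity radius
dG = born_solvation_energy(3, 0.5e-9)  # strongly negative (favorable)
```

The quadratic dependence on z is why the multiply charged [M(NH3)6]3+ couples probe solvation so strongly.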

  12. The primary exposure standard for Co-60 gamma radiation: characteristics and measurements procedures

    International Nuclear Information System (INIS)

    Laitano, R.F.; Toni, M.P.

    1983-01-01

    A description is given of a cavity ionization chamber used as a primary exposure standard at the Laboratorio di Metrologia delle Radiazioni Ionizzanti of the ENEA in Italy. The primary standard is designed to make absolute measurements of exposure due to Co-60 gamma radiation. The procedures for the realization of the exposure unit are also described. Finally, results of some international comparisons are reported

  13. Standardization and quality assurance in fluorescence measurements I state-of-the art and future challenges

    CERN Document Server

    Resch-Genger, Ute

    2008-01-01

    The validation and standardization of fluorescence methods is still in its infancy as compared to other prominent analytical and bioanalytical methods. Appropriate quality assurance standards are however a prerequisite for applications in highly regulated fields such as medical diagnostics, drug development, or food analysis. For the first time, a team of recognized international experts has documented the present status of quality assurance in fluorescence measurements, and outlines concepts for establishing standards in this field. This first of two volumes covers basic aspects and various techniques such as steady-state and time-resolved fluorometry, polarization techniques, and fluorescent chemical sensors

  14. A standardization of the physical tests for external irradiation measuring detectors

    International Nuclear Information System (INIS)

    1977-05-01

    This report is the result of standardization work, carried out within the Radioprotection Services of the A.E.C., on the physical tests for detectors measuring external irradiation. Among the various tests mentioned, calibration and the establishment of the relative spectral response are treated in detail. As far as calibration is concerned, the standardization refers to: the reference detector, the reference radiation source, and the installation and calibration procedure. As for the relative spectral response, the standardization refers to: the reference detector and the radiation sources to be used. High-flux detectors and those for pulsed electromagnetic radiation are also dealt with [fr

  15. INTERNATIONAL STANDARDS ON FOOD AND ENVIRONMENTAL RADIOACTIVITY MEASUREMENT FOR RADIOLOGICAL PROTECTION: STATUS AND PERSPECTIVES.

    Science.gov (United States)

    Calmet, D; Ameon, R; Bombard, A; Brun, S; Byrde, F; Chen, J; Duda, J-M; Forte, M; Fournier, M; Fronka, A; Haug, T; Herranz, M; Husain, A; Jerome, S; Jiranek, M; Judge, S; Kim, S B; Kwakman, P; Loyen, J; LLaurado, M; Michel, R; Porterfield, D; Ratsirahonana, A; Richards, A; Rovenska, K; Sanada, T; Schuler, C; Thomas, L; Tokonami, S; Tsapalov, A; Yamada, T

    2017-04-01

    Radiological protection is a matter of concern for members of the public, and thus national authorities are more likely to trust the quality of radioactivity data provided by accredited laboratories using common standards. A normative approach based on international standards aims to ensure the accuracy and validity of test results through calibrations and measurements traceable to the International System of Units. This approach guarantees that radioactivity test results on the same types of samples are comparable over time and space, as well as between different testing laboratories. Today, testing laboratories involved in radioactivity measurement have a set of more than 150 international standards to help them perform their work. Most of them are published by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). This paper reviews the most essential ISO standards that give guidance to testing laboratories at different stages, from sampling planning to the transmission of the test report to their customers, summarizes recent activities and achievements, and presents perspectives on new standards under development by the ISO Working Groups dealing with radioactivity measurement in connection with radiological protection.

  16. Standardized Approach to Quantitatively Measure Residual Limb Skin Health in Individuals with Lower Limb Amputation

    Science.gov (United States)

    Rink, Cameron L.; Wernke, Matthew M.; Powell, Heather M.; Tornero, Mark; Gnyawali, Surya C.; Schroeder, Ryan M.; Kim, Jayne Y.; Denune, Jeffrey A.; Albury, Alexander W.; Gordillo, Gayle M.; Colvin, James M.; Sen, Chandan K.

    2017-01-01

    Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health as a critical component of prosthesis rehabilitation for individuals with lower limb amputation is understudied, in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function. PMID:28736682

  17. Standardized Approach to Quantitatively Measure Residual Limb Skin Health in Individuals with Lower Limb Amputation.

    Science.gov (United States)

    Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K

    2017-07-01

    Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health as a critical component of prosthesis rehabilitation for individuals with lower limb amputation is understudied, in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.

  18. Acoustics. Measurement of sound insulation in buildings and of building elements. Laboratory measurements of the reduction of transmitted impact noise by floor coverings on a heavyweight standard floor

    CERN Document Server

    British Standards Institution. London

    1998-01-01

    Acoustics. Measurement of sound insulation in buildings and of building elements. Laboratory measurements of the reduction of transmitted impact noise by floor coverings on a heavyweight standard floor

  19. [Measurement of the knee range of motion: standard goniometer or smartphone?].

    Science.gov (United States)

    Rwakabayiza, Sylvia; Pereira, Luis Carlos; Lécureux, Estelle; Jolles-Haeberli, Brigitte

    2013-12-18

    The universal standard goniometer is an essential tool for measuring joint range of motion (ROM). In this era of technological advances and increasing smartphone use, new measurement tools are appearing as dedicated smartphone applications. This article compares the iOS application "Knee Goniometer" with the universal standard goniometer for assessing knee ROM. To our knowledge, this is the first study to use a goniometer application in a clinical context. The purpose of this study is to determine whether this application could be used in clinical practice.

  20. Comparing biomarker measurements to a normal range: when to use standard error of the mean (SEM) or standard deviation (SD) confidence intervals tests.

    Science.gov (United States)

    Pleil, Joachim D

    2016-01-01

    This commentary is the second of a series outlining one specific concept in interpreting biomarkers data. In the first, an observational method was presented for assessing the distribution of measurements before making parametric calculations. Here, the discussion revolves around the next step, the choice of using standard error of the mean or the calculated standard deviation to compare or predict measurement results.
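The distinction the commentary draws can be made concrete with a short sketch (hypothetical biomarker data; the 1.96 multiplier assumes approximate normality):

```python
import math
import statistics

values = [4.1, 3.8, 4.5, 4.0, 3.9, 4.3, 4.2, 3.7]  # hypothetical biomarker results
n = len(values)
mean = statistics.mean(values)
sd = statistics.stdev(values)      # spread of individual measurements
sem = sd / math.sqrt(n)            # uncertainty of the estimated mean

# SD-based interval: where ~95% of individual results are expected to fall
normal_range = (mean - 1.96 * sd, mean + 1.96 * sd)
# SEM-based interval: 95% confidence interval for the mean itself
ci_of_mean = (mean - 1.96 * sem, mean + 1.96 * sem)
```

Comparing a single new measurement against `ci_of_mean` would be the mistake this distinction guards against: individual results vary on the SD scale, while only estimates of the group mean are judged on the SEM scale.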

  1. The effect of slurry treatment including ozonation on odorant reduction measured by in-situ PTR-MS

    Science.gov (United States)

    Liu, Dezhao; Feilberg, Anders; Adamsen, Anders P. S.; Jonassen, Kristoffer E. N.

    2011-07-01

    The emission of odorous compounds from intensive pig production facilities is a nuisance for neighbors. Slurry ozonation for odor abatement has previously been demonstrated in laboratory scale. In this study, the effect of slurry ozonation (combined with solid-liquid pre-separation and acidification) on emissions of odorous compounds was tested in an experimental full-scale growing pig facility using Proton-Transfer-Reaction Mass Spectrometry (PTR-MS) for online analysis of odorants. The measurements were performed to gain a better understanding of the effects of ozone treatment on emissions of odorous compounds and to identify potential options for optimization of ozone treatment. The compounds monitored included volatile sulfur compounds, amines, carboxylic acids, ketones, phenols and indoles. Measurements were performed during nearly a one-month period in summertime. The compounds with the highest concentrations observed in the ventilation exhaust duct were acetic acid, hydrogen sulfide, propanoic acid and butanoic acid. The compounds with the highest removal efficiencies were hydrogen sulfide, 3-methyl-indole, phenol and acetic acid. Based on odor threshold values, methanethiol, butanoic acid, 4-methylphenol, hydrogen sulfide and C5 carboxylic acids are estimated to contribute significantly to the odor nuisance. Emissions of odorous compounds were observed to be strongly correlated with temperature, with the exception of hydrogen sulfide. Emission peaks of sulfur compounds were seen during slurry handling activities. Discharging of the slurry pit led to reduced hydrogen sulfide emissions, but emissions of most other odorants were not affected. The results indicate that emissions of odorants other than hydrogen sulfide mainly originate from sources other than the treated slurry, which limits the potential for further optimization. The PTR-MS measurements are demonstrated to provide a quantitative, accurate and detailed evaluation of ozone treatment for emission

  2. Mathematical modeling of HIV prevention measures including pre-exposure prophylaxis on HIV incidence in South Korea.

    Science.gov (United States)

    Kim, Sun Bean; Yoon, Myoungho; Ku, Nam Su; Kim, Min Hyung; Song, Je Eun; Ahn, Jin Young; Jeong, Su Jin; Kim, Changsoo; Kwon, Hee-Dae; Lee, Jeehyun; Smith, Davey M; Choi, Jun Yong

    2014-01-01

    Multiple prevention measures have the possibility of impacting HIV incidence in South Korea, including early diagnosis, early treatment, and pre-exposure prophylaxis (PrEP). We investigated how each of these interventions could impact the local HIV epidemic, especially among men who have sex with men (MSM), who have become the major risk group in South Korea. A mathematical model was used to estimate the effects of each of these interventions on the HIV epidemic in South Korea over the next 40 years, as compared to the current situation. We constructed a mathematical model of HIV infection among MSM in South Korea, dividing the MSM population into seven groups, and simulated the effects of early antiretroviral therapy (ART), early diagnosis, PrEP, and combination interventions on the incidence and prevalence of HIV infection, as compared to the current situation that would be expected without any new prevention measures. Overall, the model suggested that the most effective prevention measure would be PrEP. Even though PrEP effectiveness could be lessened by increased unsafe sex behavior, PrEP use was still more beneficial than the current situation. In the model, early diagnosis of HIV infection also effectively decreased HIV incidence. However, early ART did not show considerable effectiveness. As expected, it would be most effective if all interventions (PrEP, early diagnosis and early treatment) were implemented together. This model suggests that PrEP and early diagnosis could be a very effective way to reduce HIV incidence in South Korea among MSM.
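The compartmental reasoning behind such models can be sketched with a toy two-compartment version (assumed rates, yearly Euler steps; the paper's actual model divides the MSM population into seven groups and is far more detailed):

```python
def simulate_prevalence(years=40, beta=0.12, mu=0.05,
                        prep_coverage=0.0, prep_efficacy=0.9):
    """Toy susceptible-infected model of HIV spread in a closed population.
    beta: per-year transmission rate; mu: per-year removal rate.
    PrEP scales down the effective transmission rate for the covered
    fraction of susceptibles. Illustrative only."""
    s, i = 0.95, 0.05  # initial susceptible / infected fractions
    eff_beta = beta * (1 - prep_coverage * prep_efficacy)
    for _ in range(years):
        new_infections = eff_beta * s * i
        s -= new_infections
        i += new_infections - mu * i  # removal: death, out-migration, etc.
    return i

no_prep = simulate_prevalence()
with_prep = simulate_prevalence(prep_coverage=0.5)  # lower final prevalence
```

Even this crude sketch reproduces the qualitative finding: reducing the effective transmission rate (as PrEP does) lowers prevalence relative to the no-intervention baseline.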

  3. Measurement Methods for Humeral Retroversion Using Two-Dimensional Computed Tomography Scans: Which Is Most Concordant with the Standard Method?

    Science.gov (United States)

    Oh, Joo Han; Kim, Woo; Cayetano, Angel A

    2017-06-01

    Humeral retroversion is variable among individuals, and there are several measurement methods. This study was conducted to compare the concordance and reliability between the standard method and 5 other measurement methods on two-dimensional (2D) computed tomography (CT) scans. CT scans from 21 patients who underwent shoulder arthroplasty (19 women and 2 men; mean age, 70.1 years [range, 42 to 81 years]) were analyzed. The elbow transepicondylar axis was used as a distal reference. Proximal reference points included the central humeral head axis (standard method), the axis of the humeral center to 9 mm posterior to the posterior margin of the bicipital groove (method 1), the central axis of the bicipital groove -30° (method 2), the base axis of the triangular shaped metaphysis +2.5° (method 3), the distal humeral head central axis +2.4° (method 4), and contralateral humeral head retroversion (method 5). Measurements were conducted independently by two orthopedic surgeons. The mean humeral retroversion was 31.42° ± 12.10° using the standard method, and 29.70° ± 11.66° (method 1), 30.64° ± 11.24° (method 2), 30.41° ± 11.17° (method 3), 32.14° ± 11.70° (method 4), and 34.15° ± 11.47° (method 5) for the other methods. Interobserver reliability and intraobserver reliability exceeded 0.75 for all methods. In the test evaluating the equality of the standard method and the other methods, the intraclass correlation coefficients (ICCs) of method 2 and method 4 were different from the ICC of the standard method in surgeon A ( p method 2 and method 3 were different from the ICC of the standard method in surgeon B ( p method 1) would be most concordant with the standard method even though all 5 methods showed excellent agreement.

  4. Advisory Committee for the calibration standards of ionizing radiation measurement. Section 2. Radionuclide measurement

    International Nuclear Information System (INIS)

    1980-01-01

    Section II (Mesure des radionucleides) of the Comite Consultatif pour les Etalons de Mesure des Rayonnements Ionisants held its fifth meeting in April 1979. The results of a full-scale international comparison of 134 Cs and two limited comparisons of 137 Cs and 55 Fe were discussed. No other comparison is planned for the immediate future, but a working group has been set up to study new projects. The BIPM international reference system for measuring the activity of gamma-ray emitters is working satisfactorily. Three working parties presented reports and a monograph in the process of completion. BIPM described work recently carried out in its field of interest, and finally there was a brief exchange of information on the work in progress at the various laboratories represented at the meeting [fr

  5. Comparison of the gold standard of hemoglobin measurement with the clinical standard (BGA) and noninvasive hemoglobin measurement (SpHb) in small children: a prospective diagnostic observational study.

    Science.gov (United States)

    Wittenmeier, Eva; Bellosevich, Sophia; Mauff, Susanne; Schmidtmann, Irene; Eli, Michael; Pestel, Gunther; Noppens, Ruediger R

    2015-10-01

    Collecting a blood sample is usually necessary to measure hemoglobin levels in children. Especially in small children, noninvasively measuring the hemoglobin level could be extraordinarily helpful, but its precision and accuracy in the clinical environment remain unclear. In this study, noninvasive hemoglobin measurement and blood gas analysis were compared to hemoglobin measurement in a clinical laboratory. In 60 healthy preoperative children (0.2-7.6 years old), hemoglobin was measured using a noninvasive method (SpHb; Radical-7 Pulse Co-Oximeter), a blood gas analyzer (clinical standard, BGAHb; ABL 800 Flex), and a laboratory hematology analyzer (reference method, labHb; Siemens Advia). Agreement between the results was assessed by Bland-Altman analysis and by determining the percentage of outliers. Sixty SpHb measurements, 60 labHb measurements, and 59 BGAHb measurements were evaluated. In 38% of the children, the location of the SpHb sensor had to be changed more than twice for the signal quality to be sufficient. The bias/limits of agreement between SpHb and labHb were -0.65/-3.4 to 2.1 g·dl(-1). Forty-four percent of the SpHb values differed from the reference value by more than 1 g·dl(-1). Age, difficulty of measurement, and the perfusion index (PI) had no influence on the accuracy of SpHb. The bias/limits of agreement between BGAHb and labHb were 1.14/-1.6 to 3.9 g·dl(-1). Furthermore, 66% of the BGAHb values differed from the reference values by more than 1 g·dl(-1). The absolute mean difference between SpHb and labHb (1.1 g·dl(-1)) was smaller than the absolute mean difference between BGAHb and labHb (1.5 g·dl(-1)/P = 0.024). Noninvasive measurement of hemoglobin agrees more closely with the reference method than the measurement of hemoglobin using a blood gas analyzer. However, both methods can show clinically relevant differences from the reference method (ClinicalTrials.gov: NCT01693016).
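The bias and limits of agreement quoted above are the standard Bland-Altman quantities; a minimal sketch with hypothetical paired hemoglobin readings (not the study's data):

```python
import statistics

def bland_altman(method_a, method_b):
    """Mean difference (bias) and 95% limits of agreement
    between two sets of paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired readings (g/dl): noninvasive device vs laboratory
sphb = [11.2, 12.0, 10.5, 13.1, 11.8, 12.4]
labhb = [11.8, 12.3, 11.4, 13.0, 12.5, 12.9]
bias, limits = bland_altman(sphb, labhb)  # negative bias: device reads low
```

Reporting both the bias and the spread of the limits, as the study does, matters clinically: a small mean bias can coexist with individual disagreements large enough to change treatment decisions.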

  6. Reliability of linear distance measurement for dental implant length with standardized periapical radiographs

    International Nuclear Information System (INIS)

    Wakoh, Mamoru; Harada, Takuya; Otonari, Takamichi

    2006-01-01

    The purpose of this study was to investigate the accuracy of distance measurements of implant length based on periapical radiographs compared with that of other modalities. We carried out an experimental trial to compare precision in distance measurement. Dental implant fixtures were buried in the canine and first molar regions. These were then subjected to periapical (PE) radiography, panoramic (PA) radiography, conventional tomography (CV), and medical computed tomography (CT). The length of the implant fixture on each film was measured by nine observers and the degree of precision was statistically analyzed. The precision of PE radiographs and CT tomograms was the highest. Standardized PE radiography, in particular, was superior to CT tomography in the first molar region. This suggests that standardized PE radiographs should be utilized as a reliable modality for longitudinal and linear distance measurement, depending on the implant length at the local implantation site. (author)

  7. Uncertainty analysis of standardized measurements of random-incidence absorption and scattering coefficients.

    Science.gov (United States)

    Müller-Trapet, Markus; Vorländer, Michael

    2015-01-01

    This work presents an analysis of the effect of some uncertainties encountered when measuring absorption or scattering coefficients in the reverberation chamber according to International Organization for Standardization/American Society for Testing and Materials standards. This especially relates to the uncertainty due to spatial fluctuations of the sound field. By analyzing the mathematical definition of the respective coefficient, a relationship between the properties of the chamber and the test specimen and the uncertainty in the measured quantity is determined and analyzed. The validation of the established equations is presented through comparisons with measurement data. This study analytically explains the main sources of error and provides a method to obtain the product of the necessary minimum number of measurement positions and the band center frequency to achieve a given maximum uncertainty in the desired quantity. It is shown that this number depends on the ratio of room volume to sample surface area and the reverberation time of the empty chamber.
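The absorption coefficient whose uncertainty is analyzed here is itself computed from empty- and loaded-chamber reverberation times via Sabine's relation (the form used in ISO 354); a sketch with hypothetical chamber values:

```python
def sabine_absorption(volume_m3, sample_area_m2, t_empty_s, t_sample_s, c=343.0):
    """Random-incidence absorption coefficient from the reverberation
    times of the empty chamber and of the chamber with the specimen."""
    return (55.3 * volume_m3 / (c * sample_area_m2)) * (1.0 / t_sample_s - 1.0 / t_empty_s)

# Hypothetical 200 m^3 chamber with a 10 m^2 test specimen
alpha = sabine_absorption(200.0, 10.0, t_empty_s=5.0, t_sample_s=2.5)  # ~0.64
```

Spatial fluctuations of the sound field enter through the measured reverberation times, which is why the number of required source/microphone positions grows as the target uncertainty in alpha shrinks.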

  8. Laboratory Evaluation of Air Flow Measurement Methods for Residential HVAC Returns for New Instrument Standards

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Iain [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Chris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-01

    This project improved the accuracy of air flow measurements used in commissioning California heating and air conditioning systems under Title 24 (Building and Appliance Efficiency Standards), thereby improving system performance and efficiency of California residences. The research team at Lawrence Berkeley National Laboratory addressed the issue that typical tools used by contractors in the field to test air flows may not be accurate enough to measure return flows used in Title 24 applications. The team developed guidance on the performance of current diagnostics as well as a draft test method for use in future evaluations. The study team prepared a draft test method through ASTM International to determine the uncertainty of air flow measurements at residential heating, ventilation, and air conditioning returns and other terminals. This test method, when finalized, can be used by the Energy Commission and other entities to specify the required accuracy of measurement devices used to show compliance with standards.

  9. Standard test method for measurement of soil resistivity using the two-electrode soil box method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This test method covers the equipment and a procedure for the measurement of soil resistivity, for samples removed from the ground, for use in the control of corrosion of buried structures. 1.2 Procedures allow for this test method to be used in the field or in the laboratory. 1.3 The test method procedures are for the resistivity measurement of soil samples in the saturated condition and in the as-received condition. 1.4 The values stated in SI units are to be regarded as the standard. The values given in parentheses are for information only. Soil resistivity values are reported in ohm-centimeter. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and to determine the applicability of regulatory limitations prior to use.
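The underlying calculation is the usual geometric scaling of resistance to resistivity; a sketch with hypothetical soil-box dimensions (the standard itself defines the actual box geometry and electrode arrangement):

```python
def soil_resistivity_ohm_cm(resistance_ohm, cross_section_cm2, electrode_spacing_cm):
    """Soil resistivity from a two-electrode soil-box reading,
    rho = R * A / L, reported in ohm-centimeter per the standard."""
    return resistance_ohm * cross_section_cm2 / electrode_spacing_cm

# Hypothetical box: 4 cm x 4 cm cross section, end electrodes 10 cm apart
rho = soil_resistivity_ohm_cm(1500.0, 16.0, 10.0)  # 2400.0 ohm-cm
```

Lower resistivity means a more corrosive soil environment for buried structures, which is why the measurement feeds corrosion-control decisions.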

  10. Standard Test Method for Electronic Measurement for Hydrogen Embrittlement From Cadmium-Electroplating Processes

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1996-01-01

    1.1 This test method covers an electronic hydrogen detection instrument procedure for measurement of plating permeability to hydrogen. This method measures a variable related to hydrogen absorbed by steel during plating and to the hydrogen permeability of the plate during post plate baking. A specific application of this method is controlling cadmium-plating processes in which the plate porosity relative to hydrogen is critical, such as cadmium on high-strength steel. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use. For specific hazard statement, see Section 8. 1.2 The values stated in SI units are to be regarded as the standard. The values given in parentheses are for information only.

  11. Standardization in dust emission measurement; Mesure des emissions de poussieres normalisation

    Energy Technology Data Exchange (ETDEWEB)

    Perret, R. [INERIS, 60 - Verneuil-en-Halatte (France)

    1996-12-31

    The European Standardization Committee (CEN TC 264 WG5) is developing a new reference method for measuring particulate emissions, suitable for concentrations below 20 mg/m{sup 3} and especially for concentrations around 5 mg/m{sup 3}; the measuring method should be applicable to waste incinerator effluents and, more generally, to industrial effluents. Testing protocols and data analysis have been examined, and repeatability and reproducibility issues are discussed

  12. Calibration and consistency of results of an ionization-chamber secondary standard measuring system for activity

    International Nuclear Information System (INIS)

    Schrader, Heinrich

    2000-01-01

    Calibration in terms of activity of the ionization-chamber secondary standard measuring systems at the PTB is described. The measurement results of a Centronic IG12/A20, a Vinten ISOCAL IV and a radionuclide calibrator chamber for nuclear medicine applications are discussed, their energy-dependent efficiency curves established and the consistency checked using recently evaluated radionuclide decay data. Criteria for evaluating and transferring calibration factors (or efficiencies) are given

  13. Actinide half-lives as standards for nuclear data measurements: current status

    International Nuclear Information System (INIS)

    Reich, C.W.

    1984-01-01

    The present status of knowledge of the half-life data for a number of long-lived actinide nuclides useful as standards for nuclear data measurements and reactor research and technology is summarized. The half-life information given draws heavily from the file of recommended decay data, generated over a number of years through the activities of an IAEA coordinated research program on the measurement and evaluation of transactinium-isotope nuclear decay data, whose work has recently been completed

  14. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato.

    Science.gov (United States)

    Péréfarres, Frédéric; Hoareau, Murielle; Chiroleu, Frédéric; Reynaud, Bernard; Dintinger, Jacques; Lett, Jean-Michel

    2011-08-05

    Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10⁹ to 2 × 10³ copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10⁸ to 2 × 10³ copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. To detect and quantify a wide range of begomoviruses, five duplex

  15. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    Directory of Open Access Journals (Sweden)

    Lett Jean-Michel

    2011-08-01

    Full Text Available Abstract Background Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Results Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10⁹ to 2 × 10³ copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10⁸ to 2 × 10³ copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. Conclusions To detect and
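    Absolute quantification of the kind described in records 14 and 15 rests on a standard curve relating the qPCR threshold cycle (Ct) to known template copy numbers. A minimal sketch of that generic calculation follows; the dilution series, slope and Ct values are invented for illustration and are not taken from the study.

    ```python
    import math

    def fit_standard_curve(copies, ct):
        """Least-squares fit of Ct against log10(copies) for a dilution series."""
        x = [math.log10(c) for c in copies]
        n = len(x)
        mx, my = sum(x) / n, sum(ct) / n
        slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, ct))
                 / sum((xi - mx) ** 2 for xi in x))
        return slope, my - slope * mx

    def quantify(ct_value, slope, intercept):
        """Invert the curve: estimated copies/uL for an observed Ct."""
        return 10 ** ((ct_value - intercept) / slope)

    # Hypothetical 10-fold dilution series spanning 2e3 to 2e9 copies/uL,
    # with synthetic Ct values for a perfectly efficient reaction (slope -3.32)
    copies = [2e3, 2e4, 2e5, 2e6, 2e7, 2e8, 2e9]
    ct = [38.0 - 3.32 * (math.log10(c) - math.log10(2e3)) for c in copies]

    slope, intercept = fit_standard_curve(copies, ct)
    print(round(slope, 2))                    # recovers the assumed slope
    print(quantify(31.36, slope, intercept))  # copies/uL for an unknown sample
    ```

    A slope near -3.32 corresponds to 100% PCR efficiency; real assays report efficiency from the fitted slope in exactly this way.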

  16. Can consistent benchmarking within a standardized pain management concept decrease postoperative pain after total hip arthroplasty? A prospective cohort study including 367 patients

    Directory of Open Access Journals (Sweden)

    Benditz A

    2016-12-01

    Full Text Available Achim Benditz,1 Felix Greimel,1 Patrick Auer,2 Florian Zeman,3 Antje Göttermann,4 Joachim Grifka,1 Winfried Meissner,4 Frederik von Kunow1 1Department of Orthopedics, University Medical Center Regensburg, 2Clinic for Anesthesia, Asklepios Klinikum Bad Abbach, Bad Abbach, 3Centre for Clinical Studies, University Medical Center Regensburg, Regensburg, 4Department of Anesthesiology and Intensive Care, Jena University Hospital, Jena, Germany Background: The number of total hip replacement surgeries has steadily increased over recent years. Reduction in postoperative pain increases patient satisfaction and enables better mobilization. Thus, pain management needs to be continuously improved. Problems are often caused not only by medical issues but also by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. Methods: All patients included in the study had undergone total hip arthroplasty (THA). Outcome parameters were analyzed 24 hours after surgery by means of the questionnaires from the German-wide project “Quality Improvement in Postoperative Pain Management” (QUIPS). A pain nurse interviewed patients and continuously assessed outcome quality parameters. A multidisciplinary team of anesthetists, orthopedic surgeons, and nurses implemented a regular procedure of data analysis and internal benchmarking. The health care team was informed of all results and of suggested improvements. Every staff member involved in pain management participated in educational lessons, and a special pain nurse was trained in each ward. Results: From 2014 to 2015, 367 patients were included. The mean maximal pain score 24 hours after surgery was 4.0 (±3.0) on an 11-point numeric rating scale, and patient satisfaction was 9.0 (±1.2). Over time, the maximum pain score decreased (mean 3.0, ±2.0), whereas patient satisfaction

  17. Revisions to the 2009 American Society of Clinical Oncology/Oncology Nursing Society chemotherapy administration safety standards: expanding the scope to include inpatient settings.

    Science.gov (United States)

    Jacobson, Joseph O; Polovich, Martha; Gilmore, Terry R; Schulmeister, Lisa; Esper, Peg; Lefebvre, Kristine B; Neuss, Michael N

    2012-01-01

    In November 2009, the American Society of Clinical Oncology (ASCO) and the Oncology Nursing Society (ONS) jointly published a set of 31 voluntary chemotherapy safety standards for adult patients with cancer, as the end result of a highly structured, multistakeholder process. The standards were explicitly created to address patient safety in the administration of parenteral and oral chemotherapeutic agents in outpatient oncology settings. In January 2011, a workgroup consisting of ASCO and ONS members was convened to review feedback received since publication of the standards, to address interim changes in practice, and to modify the standards as needed. The most significant change to the standards is to extend their scope to the inpatient setting. This change reflects the conviction that the same standards for chemotherapy administration safety should apply in all settings. The proposed set of standards has been approved by the Board of Directors for both ASCO and ONS and has been posted for public comment. Comments were used as the basis for final editing of the revised standards. The workgroup recognizes that the safety of oral chemotherapy usage, nononcology medication reconciliation, and home chemotherapy administration are not adequately addressed in the original or revised standards. A separate process, cosponsored by ASCO and ONS, will address the development of safety standards for these areas.

  18. An Evaluation of Standardized Tests as Tools for the Measurement of Language Development.

    Science.gov (United States)

    Roberts, Elsa

    Four tests--PPVT, ITPA, MRT, WPPSI--commonly used to measure language development in young children are evaluated against four criteria: (1) which developmental aspects they claim to tap; (2) what they actually tap; (3) what linguistic knowledge is presupposed; and (4) what special problems face a non-standard English speaker. These tests are considered…

  19. Development and Standardization of Inventory for Measuring Students' Integration into University Academic Culture

    Science.gov (United States)

    Esomonu, Nkechi Patricia-Mary; Okeaba, James Uzoma

    2016-01-01

    The study developed and standardized an Inventory for measuring Students' Integration into University Academic Culture named Inventory for Students' Integration into University Academic Culture (ISIUAC). The increase in dropout rates, substance use, cultism and other deviant behaviours in Nigerian universities makes it necessary for one to ask the…

  20. Measuring Course Competencies in a School of Business: The Use of Standardized Curriculum and Rubrics

    Science.gov (United States)

    Gibson, Jane Whitney

    2011-01-01

    This paper examines the growing emphasis on measurement of course competencies by individual college students through two course examples, an undergraduate course in managing change and conflict and a graduate course in human resource management. The author explains how standardized curriculum and assignment rubrics are being used to measure…

  1. Magnet measurement interfacing to the G-64 Euro standard bus and testing G-64 modules

    International Nuclear Information System (INIS)

    Hogrefe, R.L.

    1995-01-01

    The Magnet Measurement system utilizes various modules with a G-64 Euro (Gespac) Standard Interface. All modules are designed to be software controlled, normally under the constraints of the OS-9 operating system with all data transfers to a host computer accomplished by a serial link

  2. 30 CFR 202.353 - Measurement standards for reporting and paying royalties and direct use fees.

    Science.gov (United States)

    2010-07-01

    ... gallons to the nearest hundred gallons. (e) You need not report the quality of geothermal resources... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Measurement standards for reporting and paying royalties and direct use fees. 202.353 Section 202.353 Mineral Resources MINERALS MANAGEMENT SERVICE...

  3. The Relationship between Mean Square Differences and Standard Error of Measurement: Comment on Barchard (2012)

    Science.gov (United States)

    Pan, Tianshu; Yin, Yue

    2012-01-01

    In the discussion of mean square difference (MSD) and standard error of measurement (SEM), Barchard (2012) concluded that the MSD between 2 sets of test scores is greater than 2(SEM)² and SEM underestimates the score difference between 2 tests when the 2 tests are not parallel. This conclusion has limitations for 2 reasons. First,…
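    The baseline relation in this discussion is easy to verify numerically: for two strictly parallel forms (identical true scores, independent errors with standard deviation equal to the SEM), the expected MSD is exactly 2(SEM)², and the inequality arises when the forms are not parallel. A quick simulation sketch of the parallel case, with all numbers invented:

    ```python
    import random

    random.seed(42)
    n = 200_000
    sem = 3.0  # illustrative standard error of measurement

    # Strictly parallel forms: identical true scores, independent normal errors
    true_scores = [random.gauss(50, 10) for _ in range(n)]
    form_a = [t + random.gauss(0, sem) for t in true_scores]
    form_b = [t + random.gauss(0, sem) for t in true_scores]

    # Mean square difference between the two sets of scores
    msd = sum((a - b) ** 2 for a, b in zip(form_a, form_b)) / n
    print(msd)  # close to 2 * sem**2 = 18.0 for parallel forms
    ```

    Adding a constant bias or unequal error variances to one form makes the simulated MSD exceed 2(SEM)², which is the non-parallel case the comment addresses.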

  4. An assessment of cellulose filters as a standardized material for measuring litter breakdown in headwater streams

    Science.gov (United States)

    The decay rate of cellulose filters and associated chemical and biological characteristics were compared to those of white oak (Quercus alba) leaves to determine if cellulose filters could be a suitable standardized material for measuring deciduous leaf breakdown in headwater str...

  5. Recent updates on the Standard Model Higgs boson measurements from the ATLAS and CMS experiments

    CERN Document Server

    Wang, Song-Ming

    2017-01-01

    This report presents the latest results from the ATLAS and CMS experiments on the measurements of the Standard Model Higgs boson by using the proton-proton collisions produced by the Large Hadron Collider during the first two years of Run 2 data taking.

  6. Internationally Comparable Measures of Occupational Status for the 1988 International Standard Classification of Occupations

    NARCIS (Netherlands)

    Ganzeboom, H.B.G.; Treiman, D.J.

    1996-01-01

    This paper provides operational procedures for coding internationally comparable measures of occupational status from the recently published International Standard Classification of Occupation 1988 (ISCO88) of the International Labor Office (ILO, 1990). We first discuss the nature of the ISCO88

  7. An International Standard Set of Patient-Centered Outcome Measures After Stroke

    NARCIS (Netherlands)

    Salinas, J. (Joel); Sprinkhuizen, S.M. (Sara M.); Ackerson, T. (Teri); Bernhardt, J. (Julie); Davie, C. (Charlie); George, M.G. (Mary G.); Gething, S. (Stephanie); Kelly, A.G. (Adam G.); Lindsay, P. (Patrice); Liu, L. (Liping); Martins, S.C.O. (Sheila C.O.); Morgan, L. (Louise); B. Norrving (Bo); Ribbers, G.M. (Gerard M.); Silver, F.L. (Frank L.); Smith, E.E. (Eric E.); Williams, L.S. (Linda S.); Schwamm, L.H. (Lee H.)

    2015-01-01

    markdownabstract__BACKGROUND AND PURPOSE:__ Value-based health care aims to bring together patients and health systems to maximize the ratio of quality over cost. To enable assessment of healthcare value in stroke management, an international standard set of patient-centered stroke outcome measures

  8. "First among Others? Cohen's ""d"" vs. Alternative Standardized Mean Group Difference Measures"

    Directory of Open Access Journals (Sweden)

    Sorel Cahan

    2011-06-01

    Full Text Available Standardized effect size measures typically employed in behavioral and social sciences research in the multi-group case (e.g., η², f²) evaluate between-group variability in terms of either total or within-group variability, such as variance or standard deviation, that is, measures of dispersion about the mean. In contrast, the definition of Cohen's d, the effect size measure typically computed in the two-group case, is incongruent due to a conceptual difference between the numerator, which measures between-group variability by the intuitive and straightforward raw difference between the two group means, and the denominator, which measures within-group variability in terms of the difference between all observations and the group mean (i.e., the pooled within-groups standard deviation, SW). Two congruent alternatives to d, in which the root mean square or mean absolute difference between all observation pairs is substituted for SW as the variability measure in the denominator of d, are suggested, and their conceptual and statistical advantages and disadvantages are discussed.
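    The contrast the abstract draws can be sketched in a few lines: the classic d uses the pooled within-group standard deviation, while the congruent alternative replaces it with the mean absolute difference between all within-group observation pairs. The exact definitions and pooling in the paper may differ from this reading, and the data below are invented:

    ```python
    import itertools
    import math

    def cohens_d(g1, g2):
        """Classic d: raw mean difference over the pooled within-group SD."""
        n1, n2 = len(g1), len(g2)
        m1, m2 = sum(g1) / n1, sum(g2) / n2
        ss = sum((x - m1) ** 2 for x in g1) + sum((x - m2) ** 2 for x in g2)
        return (m1 - m2) / math.sqrt(ss / (n1 + n2 - 2))

    def d_mean_abs(g1, g2):
        """Alternative denominator: mean absolute difference between all
        within-group observation pairs (pooling by group size is an
        assumption of this sketch)."""
        def mad(g):
            pairs = list(itertools.combinations(g, 2))
            return sum(abs(a - b) for a, b in pairs) / len(pairs)
        m1, m2 = sum(g1) / len(g1), sum(g2) / len(g2)
        pooled = (len(g1) * mad(g1) + len(g2) * mad(g2)) / (len(g1) + len(g2))
        return (m1 - m2) / pooled

    a = [23, 25, 28, 30, 32]   # invented scores, group 1
    b = [20, 22, 24, 26, 27]   # invented scores, group 2
    print(cohens_d(a, b), d_mean_abs(a, b))
    ```

    Both versions keep the intuitive raw mean difference in the numerator; they differ only in which dispersion measure standardizes it, which is precisely the congruence issue the paper examines.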

  9. Standard Test Methods for Measurement of Electrical Performance and Spectral Response of Nonconcentrator Multijunction Photovoltaic Cells and Modules

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 These test methods provide special techniques needed to determine the electrical performance and spectral response of two-terminal, multijunction photovoltaic (PV) devices, both cell and modules. 1.2 These test methods are modifications and extensions of the procedures for single-junction devices defined by Test Methods E948, E1021, and E1036. 1.3 These test methods do not include temperature and irradiance corrections for spectral response and current-voltage (I-V) measurements. Procedures for such corrections are available in Test Methods E948, E1021, and E1036. 1.4 These test methods may be applied to cells and modules intended for concentrator applications. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and ...

  10. National Energy Efficiency Evaluation, Measurement and Verification (EM&V) Standard: Scoping Study of Issues and Implementation Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Schiller Consulting, Inc.; Schiller, Steven R.; Goldman, Charles A.; Galawish, Elsia

    2011-02-04

    This report is a scoping study that identifies issues associated with developing a national evaluation, measurement and verification (EM&V) standard for end-use, non-transportation, energy efficiency activities. The objectives of this study are to identify the scope of such a standard and define EM&V requirements and issues that will need to be addressed in a standard. To explore these issues, we provide and discuss: (1) a set of definitions applicable to an EM&V standard; (2) a literature review of existing guidelines, standards, and 'initiatives' relating to EM&V standards as well as a review of 'bottom-up' versus 'top-down' evaluation approaches; (3) a summary of EM&V-related provisions of two recent federal legislative proposals (Congressmen Waxman and Markey's American Clean Energy and Security Act of 2009 and Senator Bingaman's American Clean Energy Leadership Act of 2009) that include national efficiency resource requirements; (4) an annotated list of issues that are likely to be central to, and need to be considered when, developing a national EM&V standard; and (5) a discussion of the implications of such issues. There are three primary reasons for developing a national efficiency EM&V standard. First, some policy makers, regulators and practitioners believe that a national standard would streamline EM&V implementation, reduce costs and complexity, and improve comparability of results across jurisdictions, although there are benefits associated with each jurisdiction setting its own EM&V requirements based on its specific portfolio, evaluation budgets, and objectives. Second, if energy efficiency is determined by the US Environmental Protection Agency to be a Best Available Control Technology (BACT) for avoiding criteria pollutant and/or greenhouse gas emissions, then a standard can be required for documenting the emission reductions resulting from efficiency actions. The third reason for a national

  11. Measurement of the Standard Model W+W- production cross-section using the ATLAS experiment on the LHC

    International Nuclear Information System (INIS)

    Zeman, Martin

    2014-01-01

    Measurements of di-boson production cross-sections are an important part of the physics programme at the CERN Large Hadron Collider. These physics analyses provide the opportunity to probe the electroweak sector of the Standard Model at the TeV scale and could also indicate the existence of new particles or probe beyond-the-Standard-Model physics. The excellent performance of the LHC through years 2011 and 2012 allowed for very competitive measurements. This thesis provides a comprehensive overview of the experimental considerations and methods used in the measurement of the W⁺W⁻ production cross-section in proton-proton collisions at √s = 7 TeV and 8 TeV. The treatise covers the material in great detail, starting with the introduction of the theoretical framework of the Standard Model, and follows with an extensive discussion of the methods implemented in recording and reconstructing physics events in an experiment of this magnitude. The associated online and offline software tools are included in the discussion. The relevant experiments are covered, including a very detailed section about the ATLAS detector. The final chapter of this thesis contains a detailed description of the analysis of W-pair production in the leptonic decay channels using the datasets recorded by the ATLAS experiment during 2011 and 2012 (Run I). The analyses use 4.60 fb⁻¹ recorded at √s = 7 TeV and 20.28 fb⁻¹ recorded at 8 TeV. The experimentally measured cross-section for the production of W-boson pairs at the ATLAS experiment is consistently enhanced compared to the predictions of the Standard Model at centre-of-mass energies of 7 TeV and 8 TeV. The thesis concludes with the presentation of differential cross-section measurement results. (author)
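    At its core, any such cross-section measurement is a counting experiment: σ = (N_obs − N_bkg) / (ε · ∫L dt), with selection efficiency ε and integrated luminosity ∫L dt. A generic sketch of that arithmetic, with invented order-of-magnitude numbers that are not the values from the thesis:

    ```python
    def cross_section_fb(n_obs, n_bkg, efficiency, int_lumi_fb):
        """Counting-experiment estimate: sigma = (N_obs - N_bkg) / (eff * L).
        With integrated luminosity in fb^-1, the result is in femtobarns."""
        return (n_obs - n_bkg) / (efficiency * int_lumi_fb)

    # Invented numbers for illustration only (not from the thesis)
    sigma_fb = cross_section_fb(n_obs=1200, n_bkg=350, efficiency=0.0015,
                                int_lumi_fb=10.0)
    print(sigma_fb / 1000)  # convert fb to pb
    ```

    Real analyses propagate statistical and systematic uncertainties through this same formula and unfold to fiducial or total phase space, which this sketch omits.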

  12. Subjective and objective measurement of the intelligibility of synthesized speech impaired by the very low bit rate STANAG 4591 codec including packet loss

    NARCIS (Netherlands)

    Počta, P.; Beerends, J.G.

    2017-01-01

    This paper deals with the intelligibility of speech coded by the STANAG 4591 standard codec, including packet loss, using synthesized speech input. Both subjective and objective assessments are used. It is shown that this codec significantly degrades intelligibility when compared to a standard

  13. A survey on development of neutron standards and neutron measurement technique corresponding to various fields

    International Nuclear Information System (INIS)

    Matsumoto, Tetsuro

    2007-01-01

    Various uses of neutrons are attracting attention in many fields such as industry, medical technology and radiation protection. High-energy neutrons above 15 MeV are especially important for radiation exposure of aircraft crews and for soft errors in semiconductors. Neutron fluence standards for the high-energy region are therefore very important; however, such standards are scarcely available anywhere in the world. Three main reasons are considered: (1) poor measurement techniques for high-energy neutrons, (2) a small number of high-energy neutron facilities, and (3) a lack of nuclear data for high-energy particle reactions. In this paper, the present status of the measurement techniques, the facilities and the nuclear data is investigated and discussed. At NMIJ/AIST, the 19.0-MeV neutron fluence standard will be established by 2010, and development of high-energy neutron standards above 20 MeV is also being examined. An outline of the development of the high-energy neutron standards is also given in this paper. (author)

  14. Standard test method for calibration of surface/stress measuring devices

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This test method covers calibration or verification of calibration, or both, of surface-stress measuring devices used to measure stress in annealed and heat-strengthened or tempered glass using polariscopic or refractometry based principles. 1.2 This test method is nondestructive. 1.3 This test method uses transmitted light, and therefore, is applicable to light-transmitting glasses. 1.4 This test method is not applicable to chemically tempered glass. 1.5 Using the procedure described, surface stresses can be measured only on the “tin” side of float glass. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  15. On the measurement of the neutrino velocity applying the standard time of the Global Positioning System

    Science.gov (United States)

    Skeivalas, J.; Parseliunas, E.

    2013-01-01

    The measurement of the neutrino velocity applying the standard time of the Global Positioning System (GPS) is presented in the paper. The practical data were taken from the OPERA experiment, in which neutrino emission from the CERN LHC accelerator to Gran Sasso detector was investigated. The distance between accelerator and detector is about 730 km. The time interval was measured by benchmark clocks, which were calibrated by the standard GPS time signals received from GPS satellites. The calculation of the accuracy of the GPS time signals with respect to changes of the signals' frequencies due to the Doppler effect is presented. It is shown that a maximum error of about 200 ns could occur when GPS time signals are applied for the calibration of the clocks for the neutrino velocity measurements.

  16. Alcohol Consumption in Vietnam, and the Use of 'Standard Drinks' to Measure Alcohol Intake.

    Science.gov (United States)

    Van Bui, Tan; Blizzard, C Leigh; Luong, Khue Ngoc; Van Truong, Ngoc Le; Tran, Bao Quoc; Otahal, Petr; Srikanth, Velandai; Nelson, Mark R; Au, Thuy Bich; Ha, Son Thai; Phung, Hai Ngoc; Tran, Mai Hoang; Callisaya, Michele; Gall, Seana

    2016-03-01

    To provide nationally representative data on alcohol consumption in Vietnam and to assess whether reported numbers of 'standard drinks' consumed have evidence of validity (particularly in rural areas, where home-made alcohol is consumed from cups of varying size). A nationally representative population-based survey of 14,706 participants (46.5% males, response proportion 64.1%) aged 25-64 years in Vietnam. Measurements were made in accordance with WHO STEPS protocols. Data were analysed using complex survey methods. Among men, 80% reported drinking alcohol during the last year, and 40% were hazardous/harmful drinkers. Approximately 60% of men had consumed alcohol during the last week, with one-in-four of the men reporting having consumed at least five standard drinks on at least one occasion. Numbers of standard drinks reported by men were associated with blood pressure/hypertension, particularly in rural areas. Information on alcohol consumption was also provided by binary responses to questions on whether or not alcohol had been consumed during the reference period. Alcohol use and harmful consumption were common among Vietnamese men but less pronounced than in Western nations. Self-reports of quantity of alcohol consumed in terms of standard drinks had predictive validity for blood pressure and hypertension even in rural areas. However, using detailed measures of consumption resulted in only minor improvements in prediction compared to simple measures. © The Author 2015. Medical Council on Alcohol and Oxford University Press. All rights reserved.
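    The 'standard drink' unit itself is just a fixed mass of pure ethanol; the definition varies by country, and roughly 10 g per drink is a common convention, which is the assumption in this sketch (the study's own conversion factors are not given in the record):

    ```python
    def grams_ethanol(volume_ml, abv_fraction):
        """Mass of pure ethanol: volume x alcohol fraction x density (0.789 g/mL)."""
        return volume_ml * abv_fraction * 0.789

    def standard_drinks(volume_ml, abv_fraction, grams_per_drink=10.0):
        """Convert a beverage to 'standard drinks'; 10 g of ethanol per drink
        is an assumed convention, not a universal definition."""
        return grams_ethanol(volume_ml, abv_fraction) / grams_per_drink

    print(standard_drinks(500, 0.05))  # a 500 mL beer at 5% ABV: about 2 drinks
    ```

    The measurement problem the study highlights is visible here: when home-made alcohol is served in cups of unknown volume and unknown strength, both inputs to this conversion are uncertain.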

  17. Consensus standards for acquisition, measurement, and reporting of intravascular optical coherence tomography studies

    DEFF Research Database (Denmark)

    Tearney, Guillermo J; Regar, Evelyn; Akasaka, Takashi

    2012-01-01

    The purpose of this document is to make the output of the International Working Group for Intravascular Optical Coherence Tomography (IWG-IVOCT) Standardization and Validation available to medical and scientific communities, through a peer-reviewed publication, in the interest of improving the diagnosis and treatment of patients with atherosclerosis, including coronary artery disease.

  18. Standard test method for measuring pH of soil for use in corrosion testing

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1995-01-01

    1.1 This test method covers a procedure for determining the pH of a soil in corrosion testing. The principal use of the test is to supplement soil resistivity measurements and thereby identify conditions under which the corrosion of metals in soil may be accentuated (see G 57 - 78 (1984)). 1.2 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  19. A Comparison of Three Methods for Computing Scale Score Conditional Standard Errors of Measurement. ACT Research Report Series, 2013 (7)

    Science.gov (United States)

    Woodruff, David; Traynor, Anne; Cui, Zhongmin; Fang, Yu

    2013-01-01

    Professional standards for educational testing recommend that both the overall standard error of measurement and the conditional standard error of measurement (CSEM) be computed on the score scale used to report scores to examinees. Several methods have been developed to compute scale score CSEMs. This paper compares three methods, based on…
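    The record does not name the three methods compared. As background only, one classical raw-score CSEM is Lord's binomial error model, CSEM(x) = sqrt(x(n − x)/(n − 1)) for raw score x on an n-item test; the sketch below illustrates that concept and is not necessarily one of the methods in the report:

    ```python
    import math

    def binomial_csem(raw_score, n_items):
        """Lord's binomial-error CSEM for raw score x on an n-item test:
        sqrt(x * (n - x) / (n - 1))."""
        return math.sqrt(raw_score * (n_items - raw_score) / (n_items - 1))

    # CSEM peaks for mid-range scores and vanishes at the extremes
    print([round(binomial_csem(x, 40), 2) for x in (0, 10, 20, 30, 40)])
    ```

    The paper's concern is the further step of expressing such raw-score CSEMs on the reported score scale, where the raw-to-scale transformation changes the conditional error.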

  20. Preface of "The Second Symposium on Border Zones Between Experimental and Numerical Application Including Solution Approaches By Extensions of Standard Numerical Methods"

    Science.gov (United States)

    Ortleb, Sigrun; Seidel, Christian

    2017-07-01

    In this second symposium at the limits of experimental and numerical methods, recent research is presented on practically relevant problems. Presentations discuss experimental investigation as well as numerical methods with a strong focus on application. In addition, problems are identified which require a hybrid experimental-numerical approach. Topics include fast explicit diffusion applied to a geothermal energy storage tank, noise in experimental measurements of electrical quantities, thermal fluid structure interaction, tensegrity structures, experimental and numerical methods for Chladni figures, optimized construction of hydroelectric power stations, experimental and numerical limits in the investigation of rain-wind induced vibrations as well as the application of exponential integrators in a domain-based IMEX setting.

  1. Use of existing standards to measure sound power levels of powered hand tools-necessary revisions

    Science.gov (United States)

    Hayden, Charles S.; Zechmann, Edward

    2005-09-01

    At recent NOISE-CON and Acoustical Society of America meetings, noise rating labeling was discussed as a way for manufacturers to provide full-disclosure information for their noise-emitting products. The first step is to gather sound power level data for these products. Sound power level data should be gathered in accordance with existing ANSI and/or ISO standards. Some standards, such as ANSI 12.15, may not define true operational noise emissions and thus may provide inaccurate information when that information is used to choose a hearing protection device or to make a purchasing decision. A number of standards were systematically combined by NIOSH researchers to provide the most accurate information on sound power levels of powered hand tools used in the construction industry. This presentation will detail some of the challenges of using existing ANSI 12.15 (and draft ANSI 12.41) to measure sound power levels of electric (and pneumatic) powered hand tools.
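    In the pressure-based methods these standards describe, the sound power level follows from energy-averaging the sound pressure levels over a measurement surface and adding the surface-area term, Lw = L̄p + 10·log10(S/S0) with S0 = 1 m². A simplified sketch with invented readings; the background-noise and environmental corrections the standards require are omitted:

    ```python
    import math

    def mean_spl(levels_db):
        """Energy-average sound pressure level over microphone positions."""
        mean_p2 = sum(10 ** (l / 10) for l in levels_db) / len(levels_db)
        return 10 * math.log10(mean_p2)

    def sound_power_level(mean_lp_db, surface_m2):
        """Lw = surface-averaged Lp + 10*log10(S / 1 m^2)."""
        return mean_lp_db + 10 * math.log10(surface_m2)

    levels = [82.0, 84.0, 83.0, 85.0]              # invented mic readings, dB
    lw = sound_power_level(mean_spl(levels), 6.0)  # assumed 6 m^2 surface
    print(round(lw, 1))
    ```

    Note that the averaging is over squared pressures (the 10^(L/10) terms), not over the decibel values themselves; averaging decibels directly would understate the result.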

  2. Standard Practice for QCM Measurement of Spacecraft Molecular Contamination in Space

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2004-01-01

    1.1 This practice provides guidance for making decisions concerning the use of a quartz crystal microbalance (QCM) and a thermoelectrically cooled quartz crystal microbalance (TQCM) in space where contamination problems on spacecraft are likely to exist. Careful adherence to this document should ensure adequate measurement of condensation of molecular constituents that are commonly termed “contamination” on spacecraft surfaces. 1.2 A corollary purpose is to provide choices among the flight-qualified QCMs now existing to meet specific needs. 1.3 The values stated in SI units are to be regarded as the standard. The values given in parentheses are for information only. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  3. Standard practice for examination of welds using the alternating current field measurement technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This practice describes procedures to be followed during alternating current field measurement examination of welds for baseline and service-induced surface breaking discontinuities. 1.2 This practice is intended for use on welds in any metallic material. 1.3 This practice does not establish weld acceptance criteria. 1.4 The values stated in either inch-pound units or SI units are to be regarded separately as standard. The values stated in each system might not be exact equivalents; therefore, each system shall be used independently of the other. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  4. Current issues with standards in the measurement and documentation of human skeletal anatomy.

    Science.gov (United States)

    Magee, Justin; McClelland, Brian; Winder, John

    2012-09-01

    Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18-65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. Published quantitative data does not follow existing

  5. The standard centrifuge method accurately measures vulnerability curves of long-vesselled olive stems.

    Science.gov (United States)

    Hacke, Uwe G; Venturas, Martin D; MacKinnon, Evan D; Jacobsen, Anna L; Sperry, John S; Pratt, R Brandon

    2015-01-01

    The standard centrifuge method has been frequently used to measure vulnerability to xylem cavitation. This method has recently been questioned. It was hypothesized that open vessels lead to exponential vulnerability curves, which were thought to be indicative of measurement artifact. We tested this hypothesis in stems of olive (Olea europaea) because its long vessels were recently claimed to produce a centrifuge artifact. We evaluated three predictions that followed from the open vessel artifact hypothesis: shorter stems, with more open vessels, would be more vulnerable than longer stems; standard centrifuge-based curves would be more vulnerable than dehydration-based curves; and open vessels would cause an exponential shape of centrifuge-based curves. Experimental evidence did not support these predictions. Centrifuge curves did not vary when the proportion of open vessels was altered. Centrifuge and dehydration curves were similar. At highly negative xylem pressure, centrifuge-based curves slightly overestimated vulnerability compared to the dehydration curve. This divergence was eliminated by centrifuging each stem only once. The standard centrifuge method produced accurate curves of samples containing open vessels, supporting the validity of this technique and confirming its utility in understanding plant hydraulics. Seven recommendations for avoiding artifacts and standardizing vulnerability curve methodology are provided. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.

  6. Measuring synovial fluid procalcitonin levels in distinguishing cases of septic arthritis, including prosthetic joints, from other causes of arthritis and aseptic loosening.

    Science.gov (United States)

    Saeed, K; Dryden, M; Sitjar, A; White, G

    2013-08-01

    Differentiating septic arthritis from non-septic arthritis can be challenging as the clinical pictures are similar and an efficacious diagnostic test is not yet available. Our objectives in this study were to establish if procalcitonin (PCT) could be reproducibly measured from synovial fluid, if there is a difference in synovial procalcitonin values between patients with septic and non-septic arthritis, respectively, including those with implants and to determine cut-off levels that could be used as a practical tool in the management of these conditions. Using a standard serum assay, synovial fluid PCT levels were measured retrospectively in 26 septic and 50 non-septic predefined arthritis cases. The reproducibility of synovial PCT was also assessed at various concentrations. Synovial PCT can be measured and is reproducible. In this cohort, statistically significant higher synovial PCT levels were found in cases of septic arthritis than in non-septic arthritis. Sensitivities, specificities and positive and negative predictive values varied at different cut-off levels. The test could be added to other microbiological and biochemical tests and may be used to supplement other clinical, radiological and laboratory findings in the assessment of patients with acute painful joints. In our cohort, findings of very high synovial PCT levels supported an infection process, including in prosthesis-related infections. The high negative predictive value of low synovial PCT levels could exclude infection in both native and prosthetic joints. Larger prospective studies are needed to further validate these results and to examine the cost effectiveness of synovial PCT.

  7. Retrospective Analysis of NIST Standard Reference Material 1450, Fibrous Glass Board, for Thermal Insulation Measurements

    Science.gov (United States)

    Zarr, Robert R; Heckert, N Alan; Leigh, Stefan D

    2014-01-01

    Thermal conductivity data acquired previously for the establishment of Standard Reference Material (SRM) 1450, Fibrous Glass Board, as well as subsequent renewals 1450a, 1450b, 1450c, and 1450d, are re-analyzed collectively and as individual data sets. Additional data sets for proto-1450 material lots are also included in the analysis. The data cover 36 years of activity by the National Institute of Standards and Technology (NIST) in developing and providing thermal insulation SRMs, specifically high-density molded fibrous-glass board, to the public. Collectively, the data sets cover two nominal thicknesses of 13 mm and 25 mm, bulk densities from 60 kg·m⁻³ to 180 kg·m⁻³, and mean temperatures from 100 K to 340 K. The analysis repetitively fits six models to the individual data sets. The most general form of the nested set of multilinear models used is λ(ρ,T) = a₀ + a₁ρ + a₂T + a₃T³ + a₄·exp(−((T − a₅)/a₆)²), where λ(ρ,T) is the predicted thermal conductivity (W·m⁻¹·K⁻¹), ρ is the bulk density (kg·m⁻³), T is the mean temperature (K), and aᵢ (for i = 0, 1, …, 6) are the regression coefficients. The least squares fit results for each model across all data sets are analyzed using both graphical and analytic techniques. The prevailing generic model for the majority of data sets is the bilinear model in ρ and T: λ(ρ,T) = a₀ + a₁ρ + a₂T. One data set supports the inclusion of a cubic temperature term, and two data sets with low-temperature data support the inclusion of an exponential term in T to improve the model predictions. Physical interpretations of the model function terms are described. Recommendations for future renewals of SRM 1450 are provided. An Addendum provides historical background on the origin of this SRM and the influence of the SRM on external measurement programs. PMID:26601034
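    The bilinear model named in the abstract is an ordinary least-squares problem with design matrix [1, ρ, T]. A minimal sketch on synthetic data (the coefficient values and sample are invented for the demo; this is not the NIST data set):

```python
import numpy as np

# Fit lambda(rho, T) = a0 + a1*rho + a2*T by ordinary least squares
# on synthetic data drawn over the SRM 1450 density/temperature ranges.
rng = np.random.default_rng(0)
rho = rng.uniform(60, 180, 50)        # bulk density, kg/m^3
T = rng.uniform(100, 340, 50)         # mean temperature, K
true = np.array([0.005, 1e-5, 1e-4])  # assumed coefficients for the demo
lam = true[0] + true[1] * rho + true[2] * T + rng.normal(0, 1e-5, 50)

X = np.column_stack([np.ones_like(rho), rho, T])  # design matrix [1, rho, T]
coef, *_ = np.linalg.lstsq(X, lam, rcond=None)
print(coef)  # close to the assumed coefficients
```

    The cubic and exponential terms mentioned in the abstract simply add columns (T³ and the Gaussian term) to the same design matrix, though the a₅, a₆ parameters inside the exponential make that variant a nonlinear fit.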

  8. New Primary Standards for Establishing SI Traceability for Moisture Measurements in Solid Materials

    Science.gov (United States)

    Heinonen, M.; Bell, S.; Choi, B. Il; Cortellessa, G.; Fernicola, V.; Georgin, E.; Hudoklin, D.; Ionescu, G. V.; Ismail, N.; Keawprasert, T.; Krasheninina, M.; Aro, R.; Nielsen, J.; Oğuz Aytekin, S.; Österberg, P.; Skabar, J.; Strnad, R.

    2018-01-01

    A European research project METefnet addresses a fundamental obstacle to improving energy-intensive drying process control: due to ambiguous reference analysis methods and insufficient methods for estimating uncertainty in moisture measurements, the achievable accuracy in the past was limited and measurement uncertainties were largely unknown. This paper reports the developments in METefnet that provide a sound basis for the SI traceability: four new primary standards for realizing the water mass fraction were set up, analyzed and compared to each other. The operation of these standards is based on combining sample weighing with different water vapor detection techniques: cold trap, chilled mirror, electrolytic and coulometric Karl Fischer titration. The results show that an equivalence of 0.2 % has been achieved between the water mass fraction realizations and that the developed methods are applicable to a wide range of materials.

  9. THE NEW BASEL CAPITAL ACCORD - AN INTERNATIONAL CONVERGENCE OF CAPITAL MEASUREMENTS AND CAPITAL STANDARDS IN BANKING

    Directory of Open Access Journals (Sweden)

    IMOLA DRIGĂ

    2007-01-01

    The International Convergence of Capital Measurements and Capital Standards was finally published on June 26, 2004 by the Basel Committee on Banking Supervision. This framework is known in the market as Basel II, and it replaces the current framework (Basel I) governing how banks calculate their capital requirements. Basel II describes a more comprehensive measure and minimum standard for capital adequacy that national supervisory authorities are implementing through domestic rule-making and adoption procedures. It seeks to improve on the existing rules by aligning regulatory capital requirements more closely to the underlying risks that banks face. In addition, Basel II is intended to promote a more forward-looking approach to capital supervision.

  10. Standard Test Method for Measuring Fast-Neutron Reaction Rates by Radioactivation of Aluminum

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This test method covers procedures for measuring reaction rates by the activation reaction ²⁷Al(n,α)²⁴Na. 1.2 This activation reaction is useful for measuring neutrons with energies above approximately 6.5 MeV and for irradiation times up to about 2 days (for longer irradiations, see Practice E261). 1.3 With suitable techniques, fission-neutron fluence rates above 10⁶ cm⁻²·s⁻¹ can be determined. 1.4 Detailed procedures for other fast neutron detectors are referenced in Practice E261. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.
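    The test method infers a reaction rate from the induced ²⁴Na activity. A hedged sketch of the standard activation buildup relation, A = R·N·(1 − e^(−λ·t_irr)), with assumed example numbers (this is the generic equation, not the ASTM procedure's full correction chain):

```python
import math

HALF_LIFE_NA24_S = 14.96 * 3600            # 24Na half-life, ~14.96 h, in seconds
DECAY_CONST = math.log(2) / HALF_LIFE_NA24_S

def reaction_rate(activity_bq, n_al_atoms, t_irr_s):
    """Reaction rate per 27Al atom (s^-1) from end-of-irradiation 24Na activity.

    Inverts A = R * N * (1 - exp(-lambda * t_irr)); valid for constant flux.
    """
    buildup = 1.0 - math.exp(-DECAY_CONST * t_irr_s)
    return activity_bq / (n_al_atoms * buildup)

# Example with assumed numbers: ~1 g Al foil (~2.23e22 atoms), 2 h irradiation
r = reaction_rate(activity_bq=5.0e3, n_al_atoms=2.23e22, t_irr_s=2 * 3600)
print(f"{r:.3e} reactions per atom per second")
```

    For irradiations short compared with the half-life, the buildup factor is small, which is why the method is limited to irradiation times up to about 2 days.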

  11. Performance of the RAD-57 pulse CO-oximeter compared with standard laboratory carboxyhemoglobin measurement.

    Science.gov (United States)

    Touger, Michael; Birnbaum, Adrienne; Wang, Jessica; Chou, Katherine; Pearson, Darion; Bijur, Polly

    2010-10-01

    We assess agreement between carboxyhemoglobin levels measured by the Rad-57 signal extraction pulse CO-oximeter (RAD), a Food and Drug Administration-approved device for noninvasive bedside measurement, and standard laboratory arterial or venous measurement in a sample of emergency department (ED) patients with suspected carbon monoxide poisoning. The study was a cross-sectional cohort design using a convenience sample of adult and pediatric ED patients in a Level I trauma, burn, and hyperbaric oxygen referral center. Measurement of RAD carboxyhemoglobin was performed simultaneously with blood sampling for laboratory determination of carboxyhemoglobin level. The difference between the measures for each patient was calculated as laboratory carboxyhemoglobin minus carboxyhemoglobin from the carbon monoxide oximeter. The limits of agreement from a Bland-Altman analysis are calculated as the mean of the differences between methods ±1.96 SDs above and below the mean. Median laboratory percentage carboxyhemoglobin level was 2.3% (interquartile range 1 to 8.5; range 0% to 38%). The mean difference between laboratory carboxyhemoglobin values and RAD values was 1.4% carboxyhemoglobin (95% confidence interval [CI] 0.2% to 2.6%). The limits of agreement of differences of measurement made with the 2 devices were -11.6% and 14.4% carboxyhemoglobin. This range exceeded the value of ±5% carboxyhemoglobin defined a priori as clinically acceptable. RAD correctly identified 11 of 23 patients with laboratory values greater than 15% carboxyhemoglobin (sensitivity 48%; 95% CI 27% to 69%). There was one case of a laboratory carboxyhemoglobin level less than 15%, in which the RAD device gave a result greater than 15% (specificity of RAD 96/97=99%; 95% CI 94% to 100%). In the range of carboxyhemoglobin values measured in this sample, the level of agreement observed suggests RAD measurement may not be used interchangeably with standard laboratory measurement. Copyright © 2010 American
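    The limits-of-agreement statistic the study reports comes from a standard Bland-Altman analysis. A minimal sketch with hypothetical %COHb pairs (not the study's data):

```python
import numpy as np

def bland_altman_limits(lab, device):
    """Mean difference (bias) and 95% limits of agreement, per Bland-Altman.

    Differences are lab minus device, matching the study's convention.
    """
    d = np.asarray(lab, dtype=float) - np.asarray(device, dtype=float)
    bias = d.mean()
    sd = d.std(ddof=1)                      # sample standard deviation
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Toy example with hypothetical paired measurements (%COHb):
lab = [2.0, 5.5, 10.0, 20.0, 30.0]
rad = [3.0, 4.0, 12.0, 17.0, 28.0]
bias, lo, hi = bland_altman_limits(lab, rad)
print(f"bias={bias:.2f}, LoA=({lo:.2f}, {hi:.2f})")
```

    Agreement is judged by whether the interval (lo, hi) falls inside a clinically acceptable band, here ±5% COHb defined a priori; the study's observed interval of (-11.6%, 14.4%) exceeded it.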

  12. Is the green key standard the golden key for sustainability measurement in the hospitality sector?

    OpenAIRE

    Rietbergen, Martijn; Rheede, van, Arjan

    2014-01-01

    The Green Key is an eco-rating program that aims to promote sustainable business practices in the hospitality sector. The Green Key assesses, among other things, the sustainable management of energy, water and waste within hotels and other hospitality firms. The Green Key standard awards points if specific sustainable practices or environmental measures have been implemented, but it does not assess the actual environmental performance of hospitality firms. Therefore, the interesting question...

  13. Standardization of the method for measurement of plasma estrone by radioimmunoassay

    International Nuclear Information System (INIS)

    Vilanova, M.S.V.; Moreira, A.C.; Sala, M.M. de; Sa, M.F.S. de

    1994-01-01

    The objective of the present paper is to standardize a radioimmunoassay method for the measurement of plasma estrone. Ethyl ether was used for plasma extraction. The sensitivity (minimal detectable dose) was 3.7 pg/tube; the reproducibility (inter-assay error) was 8.6%; the precision (intra-assay error) was 4.1%. As a biological control, plasma estrone was measured (…) and in 24 patients with polycystic ovarian syndrome (median = 77.9 pg/ml). (author). 6 refs, 2 figs, 2 tabs

  14. Transmission of Airborne Bacteria across Built Environments and Its Measurement Standards: A Review

    Directory of Open Access Journals (Sweden)

    So Fujiyoshi

    2017-11-01

    Human health is influenced by various factors, including the microorganisms present in built environments, where people spend most of their lives (approximately 90%). It is therefore necessary to monitor and control indoor airborne microbes for occupational safety and public health. Most studies concerning airborne microorganisms have focused on fungi, with scant data available concerning bacteria. The present review considers papers published from approximately 2010 to 2017 and the factors affecting the properties of indoor airborne bacteria (communities and concentration), both from a temporal perspective and from a multiscale-interaction viewpoint. From a temporal perspective, bacterial concentrations in built environments change depending on the number of human occupants, while the properties of bacterial communities tend to remain stable. Similarly, the bacteria found in social and community spaces such as offices, classrooms and hospitals are mainly associated with human occupancy. Other major sources of indoor airborne bacteria are (i) outdoor environments and (ii) the building materials themselves. Indoor bacterial communities and concentrations vary with varying interference from the outdoor environment. Airborne bacteria from the outdoor environment enter an indoor space through open doors and windows, while indoor bacteria are simultaneously released to the outer environment. Outdoor bacterial communities and their concentrations are also affected by geographical factors such as types of land use and their spatial distribution. The bacteria found in built environments therefore originate from any of the natural and man-made surroundings around humans. Therefore, to better understand the factors influencing bacterial concentrations and communities in built environments, we should study all the environments that humans contact as a single ecosystem.
In this review, we propose the establishment of a standard procedure for assessing properties of indoor airborne

  15. Testing the Standard Model by precision measurement of the weak charges of quarks

    Energy Technology Data Exchange (ETDEWEB)

    Ross Young; Roger Carlini; Anthony Thomas; Julie Roche

    2007-05-01

    In a global analysis of the latest parity-violating electron scattering measurements on nuclear targets, we demonstrate a significant improvement in the experimental knowledge of the weak neutral-current lepton-quark interactions at low energy. The precision of this new result, combined with earlier atomic parity-violation measurements, limits the magnitude of possible contributions from physics beyond the Standard Model, setting a model-independent lower bound on the scale of new physics at ~1 TeV.

  16. Standard practices for verification of displacement measuring systems and devices used in material testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 These practices cover procedures and requirements for the calibration and verification of displacement measuring systems by means of standard calibration devices for static and quasi-static testing machines. These practices are not intended to be complete purchase specifications for testing machines or displacement measuring systems. Displacement measuring systems are not intended to be used for the determination of strain. See Practice E83. 1.2 These procedures apply to the verification of the displacement measuring systems associated with the testing machine, such as a scale, dial, marked or unmarked recorder chart, digital display, etc. In all cases the buyer/owner/user must designate the displacement-measuring system(s) to be verified. 1.3 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems m...

  17. A Combination of Preliminary Electroweak Measurements and Constraints on the Standard Model, 1999

    CERN Document Server

    Acciarri, M

    1999-01-01

    This note presents a combination of published and preliminary electroweak results from the four LEP collaborations and the SLD collaboration which were prepared for the 1998 summer conferences. Averages are derived for hadronic and leptonic cross-sections, the leptonic forward-backward asymmetries, the τ polarisation asymmetries, the bb̄ and cc̄ partial widths and forward-backward asymmetries and the qq̄ charge asymmetry. The major changes with respect to results presented in summer 1997 are updates to the measurements of the Z lineshape, τ polarisation, W mass and triple-gauge-boson couplings from LEP, and A_LR from SLD. The results are compared with precise electroweak measurements from other experiments. A significant update here is a new measurement of the weak mixing angle from the NuTeV Collaboration. The parameters of the Standard Model are evaluated, first using the combined LEP electroweak measurements, and then using the full set of electroweak results.

  18. A base to standardize data processing of cadmium ratio RCd and thermal neutron flux measurements on reactor

    International Nuclear Information System (INIS)

    Li Zhaohuan

    1993-08-01

    The cadmium ratio R_Cd and the thermal neutron flux are usually measured in a reactor, but the associated data processing is rather complex: results derived from the same measured data differ among the existing processing methods. The purpose of this work is to standardize the data processing in R_Cd and thermal neutron flux measurements. A natural choice for this purpose is to derive an R_Cd formula based on the standard average thermal activation cross section and resonance integral, and to define the related parameters or factors that provide a unique basis for comparison between measurements made in different laboratories. The parameters or factors include E_c, F_m, F_m′ and G_th′ in the thermal energy region, due to the upper-truncated Maxwellian distribution, and E_Cd, F_Cd, G_r and S_r in the intermediate energy region. They are functions of multiple variables. The Au foil is used as an example to demonstrate their behavior, with figures and tables chosen to support practical data processing by hand. The work also discusses the limitations of R_Cd measurement in terms of a so-called available and optimum region, and notes that, among the commonly used detector foils (Au, In, Mn, W and Co), the Co and Mn foils have a much wider available region.
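    The cadmium-difference principle behind R_Cd can be sketched as follows (an illustrative simplification with assumed example rates, not the standardized formulas derived in the report): the bare foil responds to both thermal and epithermal neutrons, while the cadmium-covered foil responds, ideally, only to epithermal neutrons.

```python
def cadmium_ratio(rate_bare, rate_cd):
    """R_Cd = bare-foil reaction rate / Cd-covered-foil reaction rate."""
    return rate_bare / rate_cd

def thermal_fraction(r_cd):
    """Fraction of the bare-foil response due to thermal neutrons: 1 - 1/R_Cd.

    Idealized: assumes a perfect cadmium cutoff, i.e. none of the factors
    (F_Cd, G_r, ...) that the standardized treatment introduces.
    """
    return 1.0 - 1.0 / r_cd

r = cadmium_ratio(rate_bare=12.0, rate_cd=2.0)  # assumed example rates
print(r, thermal_fraction(r))                    # 6.0, ~0.833
```

    The correction factors defined in the report exist precisely because the cadmium cutoff is not a sharp step, which is what makes naive use of this ratio method-dependent.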

  19. A gravitational-wave standard siren measurement of the Hubble constant

    Science.gov (United States)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Afrough, M.; Agarwal, B.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allen, G.; Allocca, A.; Altin, P. A.; Amato, A.; Ananyeva, A.; Anderson, S. B.; Anderson, W. G.; Angelova, S. V.; Antier, S.; Appert, S.; Arai, K.; Araya, M. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Atallah, D. V.; Aufmuth, P.; Aulbert, C.; Aultoneal, K.; Austin, C.; Avila-Alvarez, A.; Babak, S.; Bacon, P.; Bader, M. K. M.; Bae, S.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Banagiri, S.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barkett, K.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Bawaj, M.; Bayley, J. C.; Bazzan, M.; Bécsy, B.; Beer, C.; Bejger, M.; Belahcene, I.; Bell, A. S.; Berger, B. K.; Bergmann, G.; Bero, J. J.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Billman, C. R.; Birch, J.; Birney, R.; Birnholtz, O.; Biscans, S.; Biscoveanu, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackman, J.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bode, N.; Boer, M.; Bogaert, G.; Bohe, A.; Bondu, F.; Bonilla, E.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bossie, K.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Broida, J. E.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brunett, S.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cabero, M.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Bustillo, J. 
Calderón; Callister, T. A.; Calloni, E.; Camp, J. B.; Canepa, M.; Canizares, P.; Cannon, K. C.; Cao, H.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Carney, M. F.; Diaz, J. Casanueva; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerdá-Durán, P.; Cerretani, G.; Cesarini, E.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chase, E.; Chassande-Mottin, E.; Chatterjee, D.; Chatziioannou, K.; Cheeseboro, B. D.; Chen, H. Y.; Chen, X.; Chen, Y.; Cheng, H.-P.; Chia, H.; Chincarini, A.; Chiummo, A.; Chmiel, T.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, A. J. K.; Chua, S.; Chung, A. K. W.; Chung, S.; Ciani, G.; Ciolfi, R.; Cirelli, C. E.; Cirone, A.; Clara, F.; Clark, J. A.; Clearwater, P.; Cleva, F.; Cocchieri, C.; Coccia, E.; Cohadon, P.-F.; Cohen, D.; Colla, A.; Collette, C. G.; Cominsky, L. R.; Constancio, M.; Conti, L.; Cooper, S. J.; Corban, P.; Corbitt, T. R.; Cordero-Carrión, I.; Corley, K. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Covas, P. B.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Creighton, J. D. E.; Creighton, T. D.; Cripe, J.; Crowder, S. G.; Cullen, T. J.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Dálya, G.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Dasgupta, A.; da Silva Costa, C. F.; Datrier, L. E. H.; Dattilo, V.; Dave, I.; Davier, M.; Davis, D.; Daw, E. J.; Day, B.; de, S.; Debra, D.; Degallaix, J.; de Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Demos, N.; Denker, T.; Dent, T.; de Pietri, R.; Dergachev, V.; De Rosa, R.; Derosa, R. T.; de Rossi, C.; Desalvo, R.; de Varona, O.; Devenson, J.; Dhurandhar, S.; Díaz, M. C.; di Fiore, L.; di Giovanni, M.; di Girolamo, T.; di Lieto, A.; di Pace, S.; di Palma, I.; di Renzo, F.; Doctor, Z.; Dolique, V.; Donovan, F.; Dooley, K. 
L.; Doravari, S.; Dorrington, I.; Douglas, R.; Dovale Álvarez, M.; Downes, T. P.; Drago, M.; Dreissigacker, C.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dupej, P.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Eisenstein, R. A.; Essick, R. C.; Estevez, D.; Etienne, Z. B.; Etzel, T.; Evans, M.; Evans, T. M.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Farinon, S.; Farr, B.; Farr, W. M.; Fauchon-Jones, E. J.; Favata, M.; Fays, M.; Fee, C.; Fehrmann, H.; Feicht, J.; Fejer, M. M.; Fernandez-Galiana, A.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Finstad, D.; Fiori, I.; Fiorucci, D.; Fishbach, M.; Fisher, R. P.; Fitz-Axen, M.; Flaminio, R.; Fletcher, M.; Fong, H.; Font, J. A.; Forsyth, P. W. F.; Forsyth, S. S.; Fournier, J.-D.; Frasca, S.

    2017-11-01

    On 17 August 2017, the Advanced LIGO and Virgo detectors observed the gravitational-wave event GW170817—a strong signal from the merger of a binary neutron-star system. Less than two seconds after the merger, a γ-ray burst (GRB 170817A) was detected within a region of the sky consistent with the LIGO-Virgo-derived location of the gravitational-wave source. This sky region was subsequently observed by optical astronomy facilities, resulting in the identification of an optical transient signal within about ten arcseconds of the galaxy NGC 4993. This detection of GW170817 in both gravitational waves and electromagnetic waves represents the first ‘multi-messenger’ astronomical observation. Such observations enable GW170817 to be used as a ‘standard siren’ (meaning that the absolute distance to the source can be determined directly from the gravitational-wave measurements) to measure the Hubble constant. This quantity represents the local expansion rate of the Universe, sets the overall scale of the Universe and is of fundamental importance to cosmology. Here we report a measurement of the Hubble constant that combines the distance to the source inferred purely from the gravitational-wave signal with the recession velocity inferred from measurements of the redshift using the electromagnetic data. In contrast to previous measurements, ours does not require the use of a cosmic ‘distance ladder’: the gravitational-wave analysis can be used to estimate the luminosity distance out to cosmological scales directly, without the use of intermediate astronomical distance measurements. We determine the Hubble constant to be about 70 kilometres per second per megaparsec. This value is consistent with existing measurements, while being completely independent of them. Additional standard siren measurements from future gravitational-wave sources will enable the Hubble constant to be constrained to high precision.
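    The core of the standard-siren idea reduces to H₀ ≈ v_H / d at low redshift. A back-of-envelope sketch using round values consistent with the published GW170817 analysis (the exact numbers and their uncertainties are in the paper; these are illustrative point estimates only):

```python
# Standard-siren estimate at low redshift: H0 ~ v_H / d, where the luminosity
# distance d comes from the gravitational-wave amplitude and the Hubble-flow
# velocity v_H from the host galaxy's electromagnetic redshift.
v_hubble_km_s = 3017.0   # Hubble-flow velocity of NGC 4993, km/s (round value)
distance_mpc = 43.8      # luminosity distance from the GW signal, Mpc (round value)

h0 = v_hubble_km_s / distance_mpc
print(f"H0 ~ {h0:.0f} km/s/Mpc")
```

    Because d is read off the waveform directly, no rung of the cosmic distance ladder enters this estimate, which is the sense in which the measurement is independent of previous ones.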

  20. A gravitational-wave standard siren measurement of the Hubble constant.

    Science.gov (United States)

    2017-11-02

    On 17 August 2017, the Advanced LIGO and Virgo detectors observed the gravitational-wave event GW170817-a strong signal from the merger of a binary neutron-star system. Less than two seconds after the merger, a γ-ray burst (GRB 170817A) was detected within a region of the sky consistent with the LIGO-Virgo-derived location of the gravitational-wave source. This sky region was subsequently observed by optical astronomy facilities, resulting in the identification of an optical transient signal within about ten arcseconds of the galaxy NGC 4993. This detection of GW170817 in both gravitational waves and electromagnetic waves represents the first 'multi-messenger' astronomical observation. Such observations enable GW170817 to be used as a 'standard siren' (meaning that the absolute distance to the source can be determined directly from the gravitational-wave measurements) to measure the Hubble constant. This quantity represents the local expansion rate of the Universe, sets the overall scale of the Universe and is of fundamental importance to cosmology. Here we report a measurement of the Hubble constant that combines the distance to the source inferred purely from the gravitational-wave signal with the recession velocity inferred from measurements of the redshift using the electromagnetic data. In contrast to previous measurements, ours does not require the use of a cosmic 'distance ladder': the gravitational-wave analysis can be used to estimate the luminosity distance out to cosmological scales directly, without the use of intermediate astronomical distance measurements. We determine the Hubble constant to be about 70 kilometres per second per megaparsec. This value is consistent with existing measurements, while being completely independent of them. Additional standard siren measurements from future gravitational-wave sources will enable the Hubble constant to be constrained to high precision.

  1. Study on AC loss measurements of HTS power cable for standardizing

    Science.gov (United States)

    Mukoyama, Shinichi; Amemiya, Naoyuki; Watanabe, Kazuo; Iijima, Yasuhiro; Mido, Nobuhiro; Masuda, Takao; Morimura, Toshiya; Oya, Masayoshi; Nakano, Tetsutaro; Yamamoto, Kiyoshi

    2017-09-01

    High-temperature superconducting power cables (HTS cables) have been under development for more than 20 years. Alongside the cable developments, test methods for HTS cables have been discussed and proposed in many laboratories and companies. Recently, there has been a push to standardize the test methods for HTS cables and make them common worldwide. CIGRE formed a working group (B1-31) to discuss test methods for HTS cables as power cables and published a recommendation on test methods. Based on that CIGRE recommendation, IEC TC20 submitted a New Work Item Proposal (NP) this year, and IEC TC20 and IEC TC90 started standardization work on the testing of HTS AC cables. However, the individual test methods used to measure the performance of HTS cables have not yet been established as common worldwide methods. AC loss is one of the most important properties for disseminating low-loss, economically efficient HTS cables around the world, and we aim to establish a rational and highly accurate AC loss measurement method. Japan is in a leading position in AC loss studies, because Japanese researchers have studied AC loss both technically and scientifically and have developed effective technologies for AC loss reduction. The Japanese national committee of TC90 formed a working team to discuss AC loss measurement methods, aiming ultimately at an international standard. This paper reports on AC loss measurements of two types of HTS conductors, one without an HTS shield and one with an HTS shield; the AC loss is measured by the electrical method.

  2. Measuring evapotranspiration: comparison of in situ micrometeorological methods including eddy covariance, scintillometer, Bowen ratio, and surface renewal method

    Science.gov (United States)

    Poznikova, G.; Fischer, M.; Orsag, M.; Trnka, M.

    2016-12-01

    Quantifying evapotranspiration (ET) is a challenging task, as different methods can yield large discrepancies. Comparisons of various techniques are not rare; however, it is demanding to maintain several in situ measurements over longer periods. In our study, we aimed to compare four micrometeorological methods measuring ET over a relatively large homogeneous area. The study took place on a winter wheat field in Polkovice, the Czech Republic (49°23'42.8"N 17°14'47.3"E), from 1 July 2015 until 15 September 2015. In the centre of the 26-ha experimental field we deployed an eddy covariance (EC) system, a Bowen ratio energy balance (BREB) system, thermocouples for the surface renewal technique, and a surface layer scintillometer with a 106 m path length. Additionally, we installed a large aperture scintillometer with a 617 m path length across the field. Our results showed good agreement among the compared methods during the wetter periods of the measurements, with slight overestimation by the scintillometry. The BREB method agreed best with EC. Both scintillometers gave very consistent results throughout the whole measurement period. The EC method tended to give lower estimates than the other methods; one potential reason is the lack of energy balance closure, which reached 27.4% for the measured period. The surface renewal method showed good potential but needs to be further tested in our conditions. Our experimental locality is one of several we are running as part of a ground-based measurement network for ET estimation. The results obtained helped us to enhance and optimise our network to ensure effective and reliable data acquisition for future validation of airborne images (satellite-based drought monitoring).

  3. Validation of the Crime and Violence Scale (CVS) against the Rasch Measurement Model Including Differences by Gender, Race, and Age

    Science.gov (United States)

    Conrad, Kendon J.; Riley, Barth B.; Conrad, Karen M.; Chan, Ya-Fen; Dennis, Michael L.

    2010-01-01

    In assessing criminality, researchers have used counts of crimes, arrests, and so on, because interval measures were not available. Additionally, crime seriousness varies depending on demographic factors. This study examined the Crime and Violence Scale (CVS) regarding psychometric quality using item response theory (IRT) and invariance of the…

  4. Diagnostic value of different adherence measures using electronic monitoring and virologic failure as reference standards.

    Science.gov (United States)

    Deschamps, Ann E; De Geest, Sabina; Vandamme, Anne-Mieke; Bobbaers, Herman; Peetermans, Willy E; Van Wijngaerden, Eric

    2008-09-01

    Nonadherence to antiretroviral therapy is a substantial problem in HIV infection and jeopardizes the success of treatment. Accurate measurement of nonadherence is therefore imperative for good clinical management, but no gold standard has been agreed on yet. In a single-center prospective study, nonadherence was assessed by electronic monitoring (percentage of doses missed and drug holidays) and by three self-reports: (1) a visual analogue scale (VAS): percentage of overall doses taken; (2) the Swiss HIV Cohort Study Adherence Questionnaire (SHCS-AQ): percentage of overall doses missed and drug holidays; and (3) the European HIV Treatment Questionnaire (EHTQ): percentage of doses missed and drug holidays for each antiretroviral drug separately. Virologic failure, assessed prospectively during 1 year, and electronic monitoring were used as reference standards. Using virologic failure as the reference standard, the best results were for (1) the SHCS-AQ after electronic monitoring (sensitivity, 87.5%; specificity, 78.6%); (2) electronic monitoring (sensitivity, 75%; specificity, 85.6%); and (3) the VAS combined with the SHCS-AQ before electronic monitoring (sensitivity, 87.5%; specificity, 58.6%). The sensitivity of the complex EHTQ was less than 50%. Asking simple questions about doses taken or missed is more sensitive than complex questioning about each drug separately. Combining the VAS with the SHCS-AQ seems a feasible nonadherence measure for daily clinical practice. Self-reports perform better after electronic monitoring: their diagnostic value could be lower when given independently.
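The sensitivity and specificity figures quoted above come from 2×2 classification tables against each reference standard. A small sketch of that computation (the counts below are hypothetical, not taken from the study):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: of 8 patients with virologic failure, 7 were
# flagged nonadherent; of 10 without failure, 8 were classified adherent.
sens, spec = sensitivity_specificity(tp=7, fn=1, tn=8, fp=2)
print(sens, spec)  # 0.875 0.8
```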

  5. Can baroreflex measurements with spontaneous sequence analysis be improved by also measuring breathing and by standardization of filtering strategies?

    International Nuclear Information System (INIS)

    Hollow, M R; Parkes, M J; Clutton-Brock, T H

    2011-01-01

    Baroreflex sensitivity (BRS) is known to be attenuated by inspiration, and all the original BRS methodologies took this into account by measuring only in expiration. Spontaneous sequence analysis (SSA) is a non-invasive clinical tool widely used to estimate BRS in man, but does not take breathing into account. We have therefore modified it to test whether it too can detect inspiratory attenuation. Traditional SSA is also entangled with issues of distinguishing causal from random relationships between blood pressure and heart period, and of the optimum choice of data filter settings. We have also tested whether the sequences our modified SSA rejects do behave as random relationships, and show the limitations of the absence of filter standardization. SSA was performed on eupneic data from 1 h periods in 20 healthy subjects. Applying SSA traditionally produced a mean BRS of 23 ± 3 ms mmHg⁻¹. After modification to measure breathing, SSA detected significant inspiratory attenuation (11 ± 1 ms mmHg⁻¹), and the mean expiratory BRS was significantly higher (26 ± 5 ms mmHg⁻¹). Traditional SSA therefore underestimates BRS by an amount (3 ms mmHg⁻¹) as big as the major physiological and clinical factors known to alter BRS. We show that the sequences rejected by SSA do behave like random associations between pressure and period. We also show the minimal effect of the r² filter and the biases that some pressure and heart period filters can introduce. We discuss whether SSA might be improved by standardization of filter settings and by also measuring breathing.
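The core of spontaneous sequence analysis can be sketched in a few lines: scan the beat series for runs of at least three beats in which systolic pressure and heart period change in the same direction, then take the regression slope of period against pressure as the BRS estimate. The sketch below omits details the paper discusses (the beat lag, the r² filter, and thresholds on pressure and period changes), so it is illustrative only:

```python
def ssa_sequences(sbp, rr, min_len=3):
    """Find baroreflex sequences: runs of >= min_len beats in which systolic
    blood pressure and heart period both rise or both fall beat-to-beat.
    Returns (start, end) index pairs, inclusive. No lag or change-threshold
    filters are applied here, unlike a full SSA implementation."""
    seqs = []
    i, n = 0, len(sbp)
    while i < n - 1:
        direction, j = None, i
        while j < n - 1:
            dp, dr = sbp[j + 1] - sbp[j], rr[j + 1] - rr[j]
            step = 1 if (dp > 0 and dr > 0) else (-1 if (dp < 0 and dr < 0) else 0)
            if step == 0 or (direction is not None and step != direction):
                break
            direction = step
            j += 1
        if j - i + 1 >= min_len:
            seqs.append((i, j))
        i = max(j, i + 1)
    return seqs

def brs_slope(sbp, rr):
    """Least-squares slope of heart period (ms) versus pressure (mmHg)."""
    n = len(sbp)
    mp, mr = sum(sbp) / n, sum(rr) / n
    num = sum((p - mp) * (r - mr) for p, r in zip(sbp, rr))
    return num / sum((p - mp) ** 2 for p in sbp)

sbp = [100, 102, 104, 106, 103, 100]   # systolic pressure, mmHg (made up)
rr = [800, 820, 840, 860, 830, 800]    # heart period, ms (made up)
print(ssa_sequences(sbp, rr))  # [(0, 3), (3, 5)]
```

On the first detected sequence, `brs_slope` returns 10 ms mmHg⁻¹, i.e. 10 ms of heart-period lengthening per mmHg of pressure rise.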

  6. Unreliability and error in the military's "gold standard" measure of sexual harassment by education and gender.

    Science.gov (United States)

    Murdoch, Maureen; Pryor, John B; Griffin, Joan M; Ripley, Diane Cowper; Gackstetter, Gary D; Polusny, Melissa A; Hodges, James S

    2011-01-01

    The Department of Defense's "gold standard" sexual harassment measure, the Sexual Harassment Core Measure (SHCore), is based on an earlier measure that was developed primarily in college women. Furthermore, the SHCore requires a reading grade level of 9.1. This may be higher than some troops' reading abilities and could generate unreliable estimates of their sexual harassment experiences. Results from 108 male and 96 female soldiers showed that the SHCore's temporal stability and alternate-forms reliability were significantly worse (a) in soldiers without college experience compared to soldiers with college experience and (b) in men compared to women. For men without college experience, almost 80% of the temporal variance in SHCore scores was attributable to error. A plain-language version of the SHCore had mixed effects on temporal stability depending on education and gender. The SHCore may be particularly ill suited for evaluating population trends of sexual harassment in military men without college experience.

  7. A Combination of Preliminary Electroweak Measurements and Constraints on the Standard Model, 1997

    CERN Document Server

    Abbaneo, D; Antilogus, P; Behnke, T; Bertucci, B; Blondel, A; Burgard, C; Clare, R; Clarke, P E L; Dutta, S; Elsing, M; Faccini, R; Fassouliotis, D; Grünewald, M W; Gurtu, A; Hamacher, K; Hansen, J B; Jones, R W L; de Jong, P; Kawamoto, T; Kobel, M; Lançon, E; Lohmann, W; Mariotti, C; Martínez, M; Matteuzzi, C; Minard, M N; Mönig, K; Molnár, P; Nippe, A; Olshevskii, A G; Paus, C; Pepé-Altarelli, M; Petzold, S; Pietrzyk, B; Quast, G; Reid, D; Renton, P B; Roney, J M; Sekulin, R L; Tenchini, Roberto; Teubert, F; Thomson, M A; Timmermans, J; Turner-Watson, M F; Wahlen, H; Ward, C P; Ward, D R; Watson, N K; Weber, A

    1997-01-01

    This note presents a combination of published and preliminary electroweak results from the four LEP collaborations and the SLD collaboration which were prepared for the 1997 summer conferences. Averages are derived for hadronic and leptonic cross-sections, the leptonic forward-backward asymmetries, the τ polarisation asymmetries, the bb̄ and cc̄ partial widths and forward-backward asymmetries, and the qq̄ charge asymmetry. The major changes with respect to results presented last year are updated results for A_LR from SLD, and the inclusion of the first direct measurements of the W mass and triple-gauge-boson couplings performed at LEP. The results are compared with precise electroweak measurements from other experiments. The parameters of the Standard Model are evaluated, first using the combined LEP electroweak measurements, and then using the full set of electroweak results.

  8. Quantitative angle-insensitive flow measurement using relative standard deviation OCT.

    Science.gov (United States)

    Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping

    2017-10-30

    Incorporating different data processing methods, optical coherence tomography (OCT) has the ability to perform high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information on flow velocities, and velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information on flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify blood flow velocities as well as map the vascular network in vivo.
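The quantity behind RSD-OCT is the relative standard deviation (coefficient of variation) of the OCT intensity measured repeatedly at each voxel: flow increases the temporal fluctuation relative to the mean. A minimal sketch with made-up intensity values:

```python
import math

def relative_std(samples):
    """Relative standard deviation of repeated intensity samples at one
    voxel: population standard deviation divided by the mean."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return math.sqrt(var) / mean

static = [10.0, 10.1, 9.9, 10.0]   # stable tissue: weak fluctuation
flowing = [6.0, 14.0, 9.0, 11.0]   # moving scatterers: strong fluctuation
print(relative_std(static) < relative_std(flowing))  # True
```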

  9. A Combination of Preliminary Electroweak Measurements and Constraints on the Standard Model, 2000

    CERN Document Server

    CERN. Geneva

    2000-01-01

    This note presents a combination of published and preliminary electroweak results from the four LEP collaborations and the SLD collaboration which were prepared for the 1999 summer conferences. Averages are derived for hadronic and leptonic cross sections, the leptonic forward-backward asymmetries, the τ polarisation asymmetries, the bb̄ and cc̄ partial widths and forward-backward asymmetries, and the qq̄ charge asymmetry. The major changes with respect to results presented in summer 1998 are updates to the lineshape, W mass and triple-gauge-boson couplings from LEP, and A_b and A_c from SLD. The results are compared with precise electroweak measurements from other experiments. A significant update here is the new W mass measurements from CDF and DØ. The parameters of the Standard Model are evaluated, first using the combined LEP electroweak measurements, and then using the full set of electroweak results.

  10. Predicting a single HIV drug resistance measure from three international interpretation gold standards.

    Science.gov (United States)

    Yashik, Singh; Maurice, Mars

    2012-07-01

    To investigate the possibility of combining the interpretations of three gold standard interpretation algorithms using weighted heuristics in order to produce a single resistance measure. The outputs of HIVdb, Rega and ANRS were combined to obtain a single resistance profile using the equally weighted voting algorithm, the accuracy-based weighted voting algorithm and the Bayesian-based weighted voting algorithm. The Bayesian-based voting combination increased the accuracy of the resistance profile prediction compared to phenotype from 58% to 69%. The equally weighted voting algorithm and the accuracy-based algorithm both increased the prediction accuracy to 60%. From the results obtained it is evident that combining the gold standard interpretation algorithms may increase the predictive ability of the individual interpretation algorithms. Copyright © 2012 Hainan Medical College. Published by Elsevier B.V. All rights reserved.
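The three voting schemes differ only in the per-algorithm weights. A minimal sketch of a weighted vote over categorical resistance calls (the calls and weights below are hypothetical, not the study's data):

```python
def weighted_vote(predictions, weights):
    """Combine per-algorithm resistance calls ('R' or 'S') by weighted vote.
    The weights could be equal, accuracy-based, or Bayesian-derived."""
    score = {"R": 0.0, "S": 0.0}
    for call, w in zip(predictions, weights):
        score[call] += w
    return max(score, key=score.get)

# Hypothetical calls from HIVdb, Rega, ANRS with accuracy-based weights
calls = ["R", "S", "R"]
weights = [0.69, 0.60, 0.60]
print(weighted_vote(calls, weights))  # R
```

With equal weights the scheme reduces to a simple majority vote.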

  11. Veterinary antimicrobial-usage statistics based on standardized measures of dosage

    DEFF Research Database (Denmark)

    Jensen, Vibeke Frøkjær; Jacobsen, Erik; Bager, Flemming

    2004-01-01

    . A national system of animal defined daily doses (ADD) for each age-group and species has been defined in VetStat (the Danish national system monitoring veterinary therapeutic drug use). The usage is further standardized according to the number of animals in the target population, acquired from production...... data on the national level or on herd size by species and age in the Danish central husbandry register (CHR). Statistics based on standardized measures of VetStat data can be used for comparison of drug usage between different herds, veterinary practices, or geographic regions (allowing subdivision...... by animal species and animal production class, route of administration, disease categories, season and geographic location). Individual statistics are available as interactive reports to the control authorities, farmers and veterinary practitioners by a secure access to the database. The ADD also is used...

  12. Advances in absorbed dose measurement standards at the australian radiation laboratory

    International Nuclear Information System (INIS)

    Boas, J.F.; Hargrave, N.J.; Huntley, R.B.; Kotler, L.H.; Webb, D.V.; Wise, K.N.

    1996-01-01

    The applications of ionising radiation in the medical and industrial fields require both an accurate knowledge of the amount of ionising radiation absorbed by the medium in question and the capability of relating this to national and international standards. The most useful measure of the amount of radiation is the absorbed dose, which is defined as the energy absorbed per unit mass. For radiotherapy, the reference medium is water, even though the measurement of the absorbed dose to water is not straightforward. Two methods are commonly used to provide calibrations in absorbed dose to water. The first is the calibration of the chamber in terms of exposure in a Cobalt-60 beam, followed by conversion by a protocol into dose to water in this and higher-energy beams. The other route is via the use of a graphite calorimeter as a primary standard device, where the conversion from absorbed dose to graphite to absorbed dose to water is performed either by theoretical means making use of cavity ionisation theory, or by experiment, where the graphite calorimeter and secondary standard ionisation chamber are placed at scaled distances from the source of the radiation beam (known as the dose-ratio method). Extensive measurements have been made with Cobalt-60 at ARL using both the exposure and absorbed-dose-to-graphite routes. Agreement between the ARL measurements and those based on standards maintained by ANSTO and NPL is within ±0.3%. Absorbed dose measurements have also been performed at ARL with photon beams of nominal energy 16 and 19 MeV obtained from the ARL linac. The validity of the protocols at high photon energies, the validity of the methods used to convert from absorbed dose in graphite to absorbed dose in water, and the validity of the indices used to specify the beams are discussed. Brief mention is also made of the establishment of a calibration facility for neutron monitors at ARL and of progress in the development of EPR dosimetry.

  13. Measurement and standardization of eye safety for optical radiation of LED products

    Science.gov (United States)

    Mou, Tongsheng; Peng, Zhenjian

    2013-06-01

    The blue light hazard (BLH) to the human eye's retina is now a new issue emerging in applications of artificial light sources. Especially for solid-state lighting sources based on blue-chip LEDs (GaN), photons with energy of more than 2.4 eV have significant photochemical effects on the retina, causing damage both in the photoreceptors and in the retinal pigment epithelium. The photobiological safety of artificial light sources emitting optical radiation has gained more and more attention worldwide and is addressed by the international standard IEC 62471-2006 (CIE S009/E:2002). Meanwhile, it is involved in IEC safety specifications of LED lighting products and covered by European Directive 2006/25/EC on the minimum health and safety requirements regarding the exposure of workers to artificial optical radiation. In practical applications of the safety standards, the methods for measuring the optical radiation reaching the eye from LED products are important for establishing executable methods in the industry. In 2011, a new project to develop the international standard IEC TR 62471-4, "Measuring methods of optical radiation related to photobiological safety", was approved and is now under way. This paper presents the relevant methods for the assessment of optical radiation hazards in the standards. Furthermore, a retina radiance meter simulating the eye's optical geometry is also described, which is a potential tool for blue light hazard assessment of retinal exposure to optical radiation. A spectroradiometric method integrated with a charge-coupled device (CCD) imaging system is introduced to provide more reliable results.

  14. Climatologies from satellite measurements: the impact of orbital sampling on the standard error of the mean

    Directory of Open Access Journals (Sweden)

    M. Toohey

    2013-04-01

    Climatologies of atmospheric observations are often produced by binning measurements according to latitude and calculating zonal means. The uncertainty in these climatological means is characterised by the standard error of the mean (SEM). However, the usual estimator of the SEM, i.e., the sample standard deviation divided by the square root of the sample size, holds only for uncorrelated, randomly sampled measurements. Measurements of the atmospheric state along a satellite orbit cannot always be considered independent because (a) the time-space interval between two nearest observations is often smaller than the typical scale of variations in the atmospheric state, and (b) the regular time-space sampling pattern of a satellite instrument strongly deviates from random sampling. We have developed a numerical experiment in which global chemical fields from a chemistry climate model are sampled according to real sampling patterns of satellite-borne instruments. As case studies, the model fields are sampled using the sampling patterns of the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) and the Atmospheric Chemistry Experiment Fourier-Transform Spectrometer (ACE-FTS) satellite instruments. Through an iterative subsampling technique, and by incorporating information on the random errors of the MIPAS and ACE-FTS measurements, we produce empirical estimates of the standard error of monthly mean zonal mean model O3 in 5° latitude bins. We find that generally the classic SEM estimator is a conservative estimate of the SEM, i.e., the empirical SEM is often less than or approximately equal to the classic estimate. Exceptions occur only when natural variability is larger than the random measurement error, and specifically in instances where the zonal sampling distribution shows non-uniformity with a similar zonal structure as variations in the sampled field, leading to maximum sensitivity to arbitrary phase shifts between the sample distribution and
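The classic estimator discussed above is straightforward to compute; the paper's point is that it is only valid for uncorrelated random samples, which orbital sampling violates. A minimal sketch of the classic estimator with made-up values:

```python
import math

def classic_sem(samples):
    """Classic SEM estimator: sample standard deviation / sqrt(n).
    Valid only for uncorrelated, randomly sampled measurements."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return math.sqrt(var / n)

data = [300.0, 302.0, 298.0, 301.0, 299.0]  # e.g. O3 values in one latitude bin (DU)
print(round(classic_sem(data), 3))  # ≈ 0.707
```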

  15. The measures for achieving nZEB standard of retrofitted educational building for specific polish location - case study

    Science.gov (United States)

    Kwiatkowski, Jerzy; Mijakowski, Maciej; Trząski, Adrian

    2017-11-01

    Most of the EU member states have already set a definition of nZEB for new buildings, and some countries have also done so for existing buildings. As there is no definition of nZEB for existing buildings in Poland, the paper includes various considerations of such a standard. Next, a case study of retrofitting an educational building to a proposed nZEB standard is presented. The aim of the paper is to present what measures can be used to decrease energy consumption in an existing building. The measures are divided into three parts: architectural and construction, installations, and energy sources; thus the complexity of the solutions is presented. As the nZEB standard is related to available energy sources, the influence of local conditions is also considered. The building chosen for analysis is located in an area under historic protection, which makes the work even more difficult. It is shown that the solutions used were chosen not only to reduce energy demand or increase energy production from renewable energy sources, but also to improve the social and aesthetic features of the building.

  16. Measured Properties of Turbulent Premixed Flames for Model Assessment, Including Burning Velocities, Stretch Rates, and Surface Densities (Postprint)

    Science.gov (United States)

    2006-10-01

    A corrugated flame with well-defined boundary conditions was stabilized on a large two-dimensional slot Bunsen burner. It was found that the turbulent burning velocity of Bunsen flames depends on more than the usual parameters: existing correlations for the turbulent burning velocity of Bunsen flames are inadequate because they should include two additional parameters, the mean velocity Ū and the burner width W.

  17. Evaluation, including effects of storage and repeated freezing and thawing, of a method for measurement of urinary creatinine

    DEFF Research Database (Denmark)

    Garde, A H; Hansen, Åse Marie; Kristiansen, J

    2003-01-01

    The aims of this study were to elucidate to what extent storage and repeated freezing and thawing influenced the concentration of creatinine in urine samples and to evaluate the method for determination of creatinine in urine. The creatinine method was based on the well-known Jaffe's reaction and.......1 mmol/L), was 0.3 mmol/L, and the recovery of a certified reference material was 97%. The relative precision at 3.15 mmol/L was 2.3%. It was concluded that the method is appropriate for measurement of urinary creatinine....

  18. Quality assessment and consistency check of measured nuclear data up to 20 MeV including the range of resonances

    International Nuclear Information System (INIS)

    Boedy, Z.T.

    1984-09-01

    This is the final report of a research contract with the IAEA. The objective is the compilation and evaluation of all data on (n,t) and (n,3He) reaction cross-sections. The main results of the research are given (some discrepancies in the experimental data; analytic formulas for an empirical description of the data, separately for even and odd nuclei with Z > 20; methods to extrapolate to energies where measurements are missing; mass regions where data are needed), and publications by the authors with the detailed results are quoted.

  19. Japan Explosives Society Standard. 5. Measuring method for propellant; Kayaku gakkai kikaku. 5. Puroperanto keisoku hoho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-10-30

    This paper states a standard measuring method for propellants as a Standard of the Japan Explosives Society. A propellant is a high-energy substance used as a rocket motor propellant and as a gunpowder for guns. For its practical application, its physical and chemical properties must be identified, and ignition and combustion phenomena must be understood fully. The propellant must also maintain its specified performance even after having been stored for an extended period of time, and must not decompose, deteriorate, or ignite spontaneously. Also, when the propellant's energy is discharged by a rocket motor or a gun, the propellant must burn at a predetermined velocity and must exhibit no abnormal ignition, vibratory combustion or erosive combustion. Physical and chemical property values of propellants vary largely with the materials constituting the propellants and their mixing ratio. Property requirements also vary depending on practical use conditions. Therefore, this paper describes a measuring method required to derive the basic properties of propellants, so that data derived from the measurements may be used commonly by engineers.

  20. Standardization of the continuing care activity measure: a multicenter study to assess reliability, validity, and ability to measure change.

    Science.gov (United States)

    Huijbregts, Maria P J; Teare, Gary F; McCullough, Carolyn; Kay, Theresa M; Streiner, David; Wong, Steve K C; McEwen, Sara E; Otten, Ingrid

    2009-06-01

    There is a lack of standardized mobility measures specific to the long-term care (LTC) population. Therefore, the Continuing Care Activity Measure (CCAM) was developed. This study determined levels of reliability, validity for clinical utilization, and sensitivity to change of this measure. This was a prospective longitudinal cohort study among elderly people with primarily physical or medical impairments who were residing in LTC institutions that provide nursing home and more-complex care, with access to physical therapy services. The CCAM, the Clinical Outcome Variables Scale (COVS), the Social Engagement Scale (SES) of the Resident Assessment Instrument-Minimum Data Set (RAI-MDS) 2.0 instrument, and the Resource Utilization Groups, version 3, (RUG-III) were administered by clinical and research physical therapists, with timing dictated by the study purpose. The participants were 136 residents of LTC institutions and 21 physical therapists. The CCAM interrater reliability (intraclass correlation coefficient [ICC]) was .97 (95% confidence interval=.91-1.00), and test-retest reliability (ICC) over a period of 1 week was .99 (95% confidence interval=.93-1.00). Over 6 months, the absolute change in total score was 5.88 for the CCAM and 4.26 for the COVS; the CCAM was 28% more responsive across all participants (n=105) and 68% more responsive for those scoring in the lower half (n=49). The minimal detectable difference of the CCAM was 8.6 across all participants. The CCAM correlated with the COVS, nursing care hours inferred from the RUG-III, and the SES. Some participants were lost to follow-up. The CCAM is a reliable and valid tool to measure gross motor function and physical mobility for elderly people in LTC institutions. It discriminates among functional levels, measures individual functional change, and can contribute to clinical decision making.

  1. Standardization of spine and hip BMD measurements in different DXA devices

    Energy Technology Data Exchange (ETDEWEB)

    Ozdemir, Aysegul [Gazi University, Department of Radiology, Besevler, Ankara 06510 (Turkey)]. E-mail: aysozd@gazi.edu.tr; Ucar, Murat [Gazi University, Department of Radiology, Besevler, Ankara 06510 (Turkey)

    2007-06-15

    Aim: To compare BMD values of the lumbar and hip regions measured on two different DXA scanners in one laboratory, and to investigate the efficiency of implemented and specifically derived standardization formulas. Materials and methods: PA lumbar (L2-L4) and right femoral neck BMD values were obtained in 100 women (aged 26-75), consecutively on GE-Lunar DPX-NT and Hologic QDR 4500 C DXA scanners. Standardization of the BMD values obtained on the two DXA devices was done according to the method developed by the International DXA Standardization Committee (IDSC), using the European Spine Phantom (ESP) to obtain the specific constant value. Mean corrected standardized BMD (sBMD) values from the two scanners were compared with each other and with the mean reported sBMD values, respectively. Results: The mean lumbar BMD values were 0.950 ± 0.117 g/cm² for Hologic and 1.068 ± 0.135 g/cm² for GE-Lunar (p < 0.05); mean corrected sBMD values were 1.035 ± 0.128 g/cm² for Hologic and 1.035 ± 0.131 g/cm² for GE-Lunar (p > 0.05). The mean femoral neck BMD values were 0.798 ± 0.114 g/cm² for Hologic and 0.895 ± 0.111 g/cm² for GE-Lunar (p < 0.05); mean corrected sBMD values were 0.869 ± 0.124 g/cm² for Hologic and 0.867 ± 0.108 g/cm² for GE-Lunar (p > 0.05). The differences between the mean values of BMD and sBMD, both corrected and reported, were statistically significant for each scanner (p < 0.05). The mean values of corrected and reported sBMD were also statistically different for each scanner (p < 0.05; the mean standard error in the spine was 1.3 for the GE-Lunar and 1.8 for the Hologic device). Conclusion: The originally proposed standardization formulae may not optimally correct for manufacturer, model and device-specific differences. Therefore, use of sBMD is not recommended to compare results of individual patients obtained on scanners of different type and brand. The residual error of reported sBMD, however, is
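In its simplest form, the IDSC approach reduces to rescaling each device's reported BMD by a phantom-derived constant so that both devices read on a common scale. The sketch below uses hypothetical constants, back-calculated only to reproduce the mean lumbar values quoted above; real constants must be derived per device from ESP measurements:

```python
def standardized_bmd(measured_bmd, device_constant):
    """Scale a device-reported BMD (g/cm^2) to standardized BMD (sBMD)
    with a multiplicative constant derived from phantom measurements."""
    return measured_bmd * device_constant

# Hypothetical constants, back-calculated from the reported mean lumbar
# BMD (0.950 Hologic, 1.068 GE-Lunar) and corrected sBMD (1.035) values.
print(standardized_bmd(0.950, 1.089))  # ≈ 1.035 for the Hologic device
print(standardized_bmd(1.068, 0.969))  # ≈ 1.035 for the GE-Lunar device
```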

  2. Reference value standards and primary standards for pH measurements in D2O and aqueous-organic solvent mixtures: new accessions and assessments

    International Nuclear Information System (INIS)

    Mussini, P.R.; Mussini, T.; Rondinini, S.

    1997-01-01

    Recommended Reference Value Standards based on the potassium hydrogenphthalate buffer at various temperatures are reported for pH measurements in various binary solvent mixtures of water with eight organic solvents: methanol, ethanol, 2-propanol, 1,2-ethanediol, 2-methoxyethanol ('methylcellosolve'), acetonitrile, 1,4-dioxane, and dimethyl sulfoxide, together with a Reference Value Standard based on the potassium deuterium phthalate buffer for pD measurements in D2O. In addition, Primary Standards for pH are reported, based on numerous buffers in various binary solvent mixtures of water with methanol, ethanol, and dimethyl sulfoxide, together with Primary Standards for pD in D2O based on the citrate, phosphate and carbonate buffers. (author)

  3. Absolute np and pp cross section determinations aimed at improving the standard for cross section measurements

    International Nuclear Information System (INIS)

    Laptev, Alexander B.; Haight, Robert C.; Tovesson, Fredrik; Arndt, Richard A.; Briscoe, William J.; Paris, Mark W.; Strakovsky, Igor I.; Workman, Ron L.

    2010-01-01

    The purpose of the present research is the continued improvement of the standard for cross section measurements of neutron-induced reactions. The cross sections for np and pp scattering below 1000 MeV are determined based on partial-wave analyses (PWAs) of nucleon-nucleon scattering data. These cross sections are compared with the most recent ENDF/B-VII.0 and JENDL-4.0 data files, and with the Nijmegen PWA. A comparison of the evaluated data with recent experimental data was also made to check the quality of the evaluation. Excellent agreement was found between the new experimental data and our PWA predictions.

  4. Automation of the ANSTO working standard of measurement for the activity of radionuclides

    International Nuclear Information System (INIS)

    Buckman, S.M.

    1990-08-01

    The ANSTO working standard ion chamber is used routinely for the standardisation of a range of gamma emitting radio-isotopes. The ion chamber has recently been automated by replacing the AAEC type 292 Recycling Discriminator, timer module and model 43 teletype printer with the HP86B computer, HP-59501B voltage programmer and HP-6181C current source. The program 'MEASION', running on the Deltacom IBM AT clone, calculates the radioactivity, with full error statements, from the ion chamber measurements. Each of these programs is listed and discussed. 13 refs., 5 figs., tabs

  5. Standard test bench for calibrating instruments used to measure natural or artificial radioactive airborne particulates

    International Nuclear Information System (INIS)

    Charuau, J.; Grivaud, L.; Le Breton, M.

    1992-01-01

    An aerodynamic calibration device, known as ICARE, has been set up in France at the Saclay Research Centre to certify instruments used to measure natural or artificial airborne radioactive particulate contamination or radon. ICARE can calibrate passive detectors and monitors with sampling air flow-rates of less than 60 m³/h. The adjustment of parameters such as ²²²Rn daughter volume activity, attached fraction and equilibrium factor, and the volume activity and size of aerosols carrying α or β emitters, allows realistic conditions to be obtained. ICARE complies with the monitor test method standard currently under development by the International Electrotechnical Commission

  6. Validation of Measured Damping Trends for Flight-Like Vehicle Panel/Equipment including a Range of Cable Harness Assemblies

    Science.gov (United States)

    Smith, Andrew M.; Davis, R. Benjamin; LaVerde, Bruce T.; Fulcher, Clay W.; Jones, Douglas C.; Waldon, James M.; Craigmyle, Benjamin B.

    2012-01-01

    This validation study examines the effect on vibroacoustic response resulting from the installation of cable bundles on a curved orthogrid panel. Of interest is the level of damping provided by the installation of the cable bundles and whether this damping could be potentially leveraged in launch vehicle design. The results of this test are compared with baseline acoustic response tests without cables. Damping estimates from the measured response data are made using a new software tool that leverages a finite element model of the panel in conjunction with advanced optimization techniques. While the full test series is not yet complete, the first configuration of cable bundles that was assessed effectively increased the viscous critical damping fraction of the system by as much as 0.02 in certain frequency ranges.

  7. Distinguishing enhancing from nonenhancing renal masses with dual-source dual-energy CT: iodine quantification versus standard enhancement measurements.

    Science.gov (United States)

    Ascenti, Giorgio; Mileto, Achille; Krauss, Bernhard; Gaeta, Michele; Blandino, Alfredo; Scribano, Emanuele; Settineri, Nicola; Mazziotti, Silvio

    2013-08-01

    To compare the diagnostic accuracy of iodine quantification and standard enhancement measurements in distinguishing enhancing from nonenhancing renal masses. The Institutional Review Board approved this retrospective study conducted from data found in institutional patient databases and archives. Seventy-two renal masses were characterised as enhancing or nonenhancing using standard enhancement measurements (in HU) and iodine quantification (in mg/ml). Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of standard enhancement measurements and iodine quantification were calculated from χ² tests of contingency with histopathology or imaging follow-up as the reference standard. Difference in accuracy was assessed by means of McNemar analysis. Sensitivity, specificity, PPV, NPV and diagnostic accuracy for standard enhancement measurements and iodine quantification were 77.7 %, 100 %, 100 %, 81.8 %, 89 % and 100 %, 94.4 %, 94.7 %, 100 % and 97 %, respectively. The McNemar analysis showed that the accuracy of iodine quantification was significantly better (P < 0.001) than that of standard enhancement measurements. Compared with standard enhancement measurements, whole-tumour iodine quantification is more accurate in distinguishing enhancing from nonenhancing renal masses. • Enhancement of renal lesions is important when differentiating benign from malignant tumours. • Dual-energy CT offers measurement of iodine uptake rather than mere enhancement values. • Whole-tumour iodine quantification seems more accurate than standard CT enhancement measurements.
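    The reported sensitivity, specificity, PPV, NPV and accuracy all follow from a 2 × 2 contingency table. A short Python sketch, using illustrative counts (assumed, not the paper's raw data) that happen to reproduce the figures reported for standard enhancement measurements:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 contingency-table metrics against a reference standard."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Illustrative counts consistent with 72 masses: 28 true positives,
# 8 false negatives, 36 true negatives, no false positives.
m = diagnostic_metrics(tp=28, fp=0, fn=8, tn=36)
```

These counts yield sensitivity 77.8 %, specificity 100 %, PPV 100 %, NPV 81.8 % and accuracy 88.9 %, matching the abstract's values for standard enhancement measurements to rounding.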

  8. The Australian Commonwealth standard of measurement for absorbed radiation dose. Part 1

    International Nuclear Information System (INIS)

    Sherlock, S.L.

    1989-08-01

    As an agent for the Commonwealth Scientific and Industrial Research Organisation, the Australian Nuclear Science and Technology Organisation is responsible for maintenance of the Australian Commonwealth standard of absorbed dose. This standard of measurement has application in radiation therapy dosimetry, which is required for the treatment of cancer patients. This report is the first in a series documenting the absorbed dose standard for photon beams in the range from 1 to 25 MeV. The Urquhart graphite micro-calorimeter, which is used for the determination of absorbed dose in high energy photon beams, has now been placed under computer control. Accordingly, a complete upgrade of the calorimeter systems was performed to allow operation in the hospital. This report describes the control and monitoring techniques, together with an assessment of the performance achieved for 6 and 18 MeV bremsstrahlung beams. Random errors have been reduced to near negligible proportions, while systematic errors have been minimized by achieving true quasi-adiabatic operation. 16 refs., 9 tabs., 11 figs

  9. Uncertainty in Measurement: A Review of Monte Carlo Simulation Using Microsoft Excel for the Calculation of Uncertainties Through Functional Relationships, Including Uncertainties in Empirically Derived Constants

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-01-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional relationship.

  10. Uncertainty in measurement: a review of monte carlo simulation using microsoft excel for the calculation of uncertainties through functional relationships, including uncertainties in empirically derived constants.

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-02-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. 
Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship
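    The spreadsheet procedure the authors describe can be sketched in a few lines of Python: draw Gaussian inputs, push every draw through the functional relationship, and summarise the output distribution. The function and the input values below are illustrative, not taken from the article:

```python
import random
import statistics

def mcs_propagate(func, inputs, n=200_000, seed=42):
    """Propagate Gaussian input uncertainties through an arbitrary
    functional relationship by Monte Carlo simulation.
    `inputs` is a list of (mean, standard deviation) pairs."""
    rng = random.Random(seed)
    out = [func(*(rng.gauss(m, s) for m, s in inputs)) for _ in range(n)]
    return statistics.mean(out), statistics.stdev(out)

# Illustrative case: y = a * x, with an empirically derived 'constant'
# a = 2.00 +/- 0.05 and a measured input x = 10.0 +/- 0.2.
mean_y, u_y = mcs_propagate(lambda a, x: a * x, [(2.0, 0.05), (10.0, 0.2)])
```

For this product the GUM first-order formula gives u(y) = sqrt(x²·u(a)² + a²·u(x)²) ≈ 0.640, and the simulated standard deviation agrees closely, which is exactly the cross-check the article performs in Excel.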

  11. A risk score including microdeletions improves relapse prediction for standard and medium risk precursor B-cell acute lymphoblastic leukaemia in children.

    Science.gov (United States)

    Sutton, Rosemary; Venn, Nicola C; Law, Tamara; Boer, Judith M; Trahair, Toby N; Ng, Anthea; Den Boer, Monique L; Dissanayake, Anuruddhika; Giles, Jodie E; Dalzell, Pauline; Mayoh, Chelsea; Barbaric, Draga; Revesz, Tamas; Alvaro, Frank; Pieters, Rob; Haber, Michelle; Norris, Murray D; Schrappe, Martin; Dalla Pozza, Luciano; Marshall, Glenn M

    2018-02-01

    To prevent relapse, high risk paediatric acute lymphoblastic leukaemia (ALL) is treated very intensively. However, most patients who eventually relapse have standard or medium risk ALL with low minimal residual disease (MRD) levels. We analysed recurrent microdeletions and other clinical prognostic factors in a cohort of 475 uniformly treated non-high risk precursor B-cell ALL patients with the aim of better predicting relapse and refining risk stratification. Lower relapse-free survival at 7 years (RFS) was associated with IKZF1 intragenic deletions, MRD > 5 × 10⁻⁵ (P < 0·0001) and high National Cancer Institute (NCI) risk (P < 0·0001). We created a predictive model based on a risk score (RS) for deletions, MRD and NCI risk, extending from an RS of 0 (RS0) for patients with no unfavourable factors to RS2+ for patients with 2 or 3 high risk factors. The RS0, RS1 and RS2+ groups had RFS of 93%, 78% and 49%, respectively, and overall survival (OS) of 99%, 91% and 71%. The RS provided greater discrimination than MRD-based risk stratification into standard (89% RFS, 96% OS) and medium risk (79% RFS, 91% OS) groups. We conclude that this RS may enable better early therapeutic stratification and thus improve cure rates for childhood ALL. © 2017 John Wiley & Sons Ltd.
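    The RS grouping described above amounts to counting unfavourable factors and collapsing counts of 2 or 3 into one group. A minimal sketch (function and variable names are assumptions, not the authors' code):

```python
def risk_group(ikzf1_deletion, high_mrd, nci_high_risk):
    """Count the unfavourable factors present; 2 or 3 factors collapse
    into the single highest-risk group, per the RS0/RS1/RS2+ scheme."""
    n = sum([bool(ikzf1_deletion), bool(high_mrd), bool(nci_high_risk)])
    return {0: "RS0", 1: "RS1"}.get(n, "RS2+")
```

For example, a patient with only an IKZF1 deletion falls in RS1, while any two factors together place the patient in RS2+.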

  12. Accuracy of standard measures of family planning service quality: findings from the simulated client method.

    Science.gov (United States)

    Tumlinson, Katherine; Speizer, Ilene S; Curtis, Siân L; Pence, Brian W

    2014-12-01

    In the field of international family planning, quality of care as a reproductive right is widely endorsed, yet we lack validated data-collection instruments that can accurately assess quality in terms of its public health importance. This study, conducted within 19 public and private facilities in Kisumu, Kenya, used the simulated client method to test the validity of three standard data-collection instruments used in large-scale facility surveys: provider interviews, client interviews, and observation of client-provider interactions. Results found low specificity and low positive predictive values in each of the three instruments for a number of quality indicators, suggesting that the quality of care provided may be overestimated by traditional methods of measurement. Revised approaches to measuring family planning service quality may be needed to ensure accurate assessment of programs and to better inform quality-improvement interventions. © 2014 The Population Council, Inc.

  13. The measurement of magnetic properties of electrical sheet steel - survey on methods and situation of standards

    CERN Document Server

    Sievert, J

    2000-01-01

    A brief review of the different requirements for magnetic measurement techniques for material research, modelling of material properties and grading of the electrical sheet steel for trade purposes is presented. In relation to the main application of laminated electrical steel, this paper deals with AC measurement techniques. Two standard methods, Epstein frame and Single Sheet Tester (SST), producing different results, are used in parallel. This dilemma was analysed in detail. The study leads to a possible solution of the problem, i.e. the possibility of converting the results of one of the two methods into the results of the other in order to satisfy the users of the Epstein method and, at the same time, to improve the acceptance of the more economical SST method.

  14. Software measurement standards for areal surface texture parameters: part 2—comparison of software

    International Nuclear Information System (INIS)

    Harris, P M; Smith, I M; Giusca, C; Leach, R K; Wang, C

    2012-01-01

    A companion paper in this issue describes reference software for the evaluation of areal surface texture parameters, focusing on the definitions of the parameters and giving details of the numerical algorithms employed in the software to implement those definitions. The reference software is used as a benchmark against which software in a measuring instrument can be compared. A data set is used as input to both the software under test and the reference software, and the results delivered by the software under test are compared with those provided by the reference software. This paper presents a comparison of the results returned by the reference software with those reported by proprietary software for surface texture measurement. Differences between the results can be used to identify where algorithms and software for evaluating the parameters differ. They might also be helpful in identifying where parameters are not sufficiently well-defined in standards. (paper)
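    As an illustration of the kind of parameter definition such reference software encodes, the areal parameter Sa (ISO 25178-2) is the arithmetic mean of the absolute height deviations from the mean plane. A toy sketch that omits the levelling and filtration steps real reference software performs:

```python
def sa(height_map):
    """Sa: arithmetic mean of absolute height deviations from the mean
    plane, over a discretised surface given as rows of height values.
    Simplified: assumes the surface is already levelled and filtered."""
    zs = [z for row in height_map for z in row]
    mean_z = sum(zs) / len(zs)
    return sum(abs(z - mean_z) for z in zs) / len(zs)
```

Comparing a proprietary implementation's Sa against such a reference value on the same data set is precisely the benchmark procedure the paper describes; discrepancies point at algorithmic differences or under-specified definitions.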

  15. Balance Assessment Practices and Use of Standardized Balance Measures Among Ontario Physical Therapists

    Science.gov (United States)

    Sibley, Kathryn M.; Straus, Sharon E.; Inness, Elizabeth L.; Salbach, Nancy M.

    2011-01-01

    Background Balance impairment is a significant problem for older adults, as it can influence daily functioning. Treating balance impairment in this population is a major focus of physical therapist practice. Objective The purpose of this study was to document current practices in clinical balance assessment and compare components of balance assessed and measures used across practice areas among physical therapists. Design This was a cross-sectional study. Methods A survey questionnaire was mailed to 1,000 practicing physical therapists in Ontario, Canada. Results Three hundred sixty-nine individuals completed the survey questionnaire. More than 80% of respondents reported that they regularly (more than 60% of the time) assessed postural alignment, static and dynamic stability, functional balance, and underlying motor systems. Underlying sensory systems, cognitive contributions to balance, and reactive control were regularly assessed by 59.6%, 55.0%, and 41.2% of the respondents, respectively. The standardized measures regularly used by the most respondents were the single-leg stance test (79.1%), the Berg Balance Scale (45.0%), and the Timed “Up & Go” Test (27.6%). There was considerable variation in the components of balance assessed and measures used by respondents treating individuals in the orthopedic, neurologic, geriatric, and general rehabilitation populations. Limitations The survey provides quantitative data about what is done to assess balance, but does not explain the factors influencing current practice. Conclusions Many important components of balance and standardized measures are regularly used by physical therapists to assess balance. Further research, however, is needed to understand the factors contributing to the relatively lower rates of assessing reactive control, the component of balance most directly responsible for avoiding a fall. PMID:21868613

  16. BER-3.2 report: Methodology for justification and optimization of protective measures including a case study

    International Nuclear Information System (INIS)

    Hedemann Jensen, P.; Sinkko, K.; Walmod-Larsen, O.; Gjoerup, H.L.; Salo, A.

    1992-07-01

    This report is part of the Nordic BER-3 project's work to propose and harmonize Nordic intervention levels for countermeasures in case of nuclear accidents. It focuses on the methodology for justification and optimization of protective measures in a reactor accident situation with a large release of fission products to the environment. The down-wind situation is very complicated, and the dose to the exposed population is almost unpredictable. The task of the radiation protection experts, to advise the decision makers on the doses averted by the different actions at hand, is complicated; that of the decision makers is certainly more so. On behalf of the society they represent, they must decide whether to follow the advice of their radiation protection experts or to add further arguments, economical or political (or personal), to their considerations before taking their decisions. Two analysis methods available for handling such situations, cost-benefit analysis and multi-attribute utility analysis, are described in principle and applied in a case study: the impacts of a Chernobyl-like accident on the Swedish island of Gotland in the Baltic Sea are analyzed with regard to the acute consequences. The use of the intervention principles found in international guidance (IAEA 91, ICRP 91), which can be summarized as the principles of justification, optimization and avoidance of unacceptable doses, is described, and the handling of more intangible factors of a psychological or political character is indicated. (au) (6 tabs., 3 ills., 17 refs.)
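    A cost-benefit analysis of protective measures, in its simplest form, monetises the averted collective dose with a value α per man-sievert and compares it with the cost of each action. The sketch below, with invented option names and figures, shows only the justification and optimization steps; real analyses, as the report stresses, weigh many more attributes:

```python
def net_benefit(averted_dose_man_sv, alpha, cost):
    """Justification: an action is justified when the monetised averted
    collective dose (alpha, currency units per man-sievert) exceeds its cost."""
    return averted_dose_man_sv * alpha - cost

def optimise(options, alpha):
    """Optimization: among justified actions, choose the largest net benefit.
    `options` is a list of (name, averted dose in man-Sv, cost) tuples."""
    scored = [(net_benefit(d, alpha, c), name) for name, d, c in options]
    justified = [s for s in scored if s[0] > 0]
    return max(justified)[1] if justified else "no action"

# Invented example: two candidate countermeasures under alpha = 1e5 per man-Sv.
options = [("evacuation", 100.0, 5e6), ("sheltering", 40.0, 1e5)]
choice = optimise(options, 1e5)
```

With these illustrative figures both actions are justified and evacuation has the larger net benefit; lowering α far enough leaves no action justified.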

  17. Comparison of methods for estimating Wet-Bulb Globe Temperature index from standard meteorological measurements.

    Science.gov (United States)

    Patel, Tejash; Mullen, Stephen P; Santee, William R

    2013-08-01

    Environmental heat illness and injuries are a serious concern for the Army and Marines. Currently, the Wet-Bulb Globe Temperature (WBGT) index is used to evaluate heat injury risk. The index is a weighted average of dry-bulb temperature (Tdb), black globe temperature (Tbg), and natural wet-bulb temperature (Tnwb). The WBGT index would be more widely used if it could be determined using standard weather instruments. This study compares models developed by Liljegren at Argonne National Laboratory and by Matthew at the U.S. Army Institute of Environmental Medicine that calculate WBGT using standard meteorological measurements. Both models use air temperature (Ta), relative humidity, wind speed, and global solar radiation (RG) to calculate Tnwb and Tbg. The WBGT and meteorological data used for model validation were collected at Griffin, Georgia and Yuma Proving Ground (YPG), Arizona. Liljegren (YPG: R(2) = 0.709, p < 0.01; Griffin: R(2) = 0.854, p < 0.01) showed closer agreement between calculated and actual WBGT than Matthew (YPG: R(2) = 0.630, p < 0.01; Griffin: R(2) = 0.677, p < 0.01). Compared to actual WBGT heat categorization, the Matthew model tended to underpredict compared to Liljegren's classification. Results indicate Liljegren is an acceptable alternative to direct WBGT measurement, but verification under other environmental conditions is needed. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
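    The weighted average mentioned above is, for outdoor conditions with a solar load, the standard 0.7/0.2/0.1 combination of the three temperatures:

```python
def wbgt_outdoor(t_nwb, t_bg, t_db):
    """Outdoor WBGT index: 0.7*Tnwb + 0.2*Tbg + 0.1*Tdb (degrees Celsius).
    The Liljegren and Matthew models estimate Tnwb and Tbg from standard
    meteorological inputs before applying this weighting."""
    return 0.7 * t_nwb + 0.2 * t_bg + 0.1 * t_db
```

For example, Tnwb = 25 °C, Tbg = 40 °C and Tdb = 30 °C give a WBGT of 28.5 °C.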

  18. Measurements of secondary emissions from plasma arc and laser cutting in standard experiments

    International Nuclear Information System (INIS)

    Pilot, G.; Noel, J.P.; Leautier, R.; Steiner, H.; Tarroni, G.; Waldie, B.

    1992-01-01

    As part of an inter-facility comparison of secondary emissions from plasma arc and laser-cutting techniques, standard cutting tests have been done by plasma arc underwater and in air, and by laser beam in air. The same team was commissioned to measure the secondary emissions (solid and gaseous) in each contractor's facility with the same measuring rig. 20 mm and 40 mm thick, grade 304 stainless-steel plates were cut by plasma-torch in three different facilities: Heriot Watt University of Edinburgh, Institut fuer Werkstoffkunde of Universitaet Hannover and CEA/CEN Cadarache. 10 mm and in some cases 20 mm thick, grade 304, stainless-steel plates were cut by laser beam in five different facilities: CEA-CEN Fontenay, CEA-CEN Saclay, Institut fuer Werkstoffkunde of Universitaet Hannover and ENEA/Frascati. The results obtained in the standard experiments are rather similar, and the differences that appear can be explained by the various scales of the involved facilities (semi-industrial and laboratory) and by some particularities in the cutting parameters (an additional secondary gas flow of oxygen in plasma cutting at Universitaet Hannover, for example)

  19. Standard 12-lead electrocardiography measures predictive of increased appropriate therapy in implantable cardioverter defibrillator recipients.

    Science.gov (United States)

    Shi, Bijia; Harding, Scott A; Jimenez, Alejandro; Larsen, Peter D

    2013-06-01

    Identification of patients most likely to benefit from implantable cardioverter defibrillator (ICD) implantation remains a complex challenge. This study aimed to investigate the utility of measures derived from standard 10 s 12-lead electrocardiography (ECG), without complex signal processing, in predicting appropriate therapy in an ICD population. We examined 108 ICD patients implanted for primary (n = 32) and secondary prevention (n = 76). Baseline clinical data and characteristics of the QRS complex, T-wave, and heart rate from standard 12-lead ECG were examined and related to the occurrence of subsequent appropriate therapy. Over a mean follow-up of 29 ± 11 months, 44% of patients received appropriate therapy. Patients with depressed heart rate variability (HRV) (≤6.5%) were 2.68 [95% confidence interval (CI) 1.21-5.90, P = 0.015] times more likely to receive appropriate therapy than patients with HRV >6.5%. In patients with bundle branch block (BBB), large QRS dispersion of >39 ms was associated with 2.88 times the risk (95% CI 1.24-6.71, P = 0.014) of experiencing appropriate therapy compared with those with QRS dispersion ≤39 ms. History of atrial arrhythmia [hazard ratio (HR) = 2.30, 95% CI 1.29-4.12, P = 0.005] and secondary prevention (HR = 2.55, 95% CI 1.14-5.71, P = 0.022) were also predictive of device therapy. Measurements from standard 12-lead ECG were predictive of appropriate therapy in a heterogeneous ICD population. Incorporation of such 12-lead ECG parameters into risk stratification models may improve our ability to select patients for ICD implantation.
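    QRS dispersion, one of the ECG measures used here, is simply the range of QRS durations across the 12 leads. A minimal sketch with illustrative durations:

```python
def qrs_dispersion(qrs_durations_ms):
    """QRS dispersion: difference between the longest and shortest QRS
    duration (in ms) measured across the 12 ECG leads."""
    return max(qrs_durations_ms) - min(qrs_durations_ms)

# Illustrative per-lead durations; 42 ms exceeds the study's 39 ms cut-off.
dispersion = qrs_dispersion([82, 88, 95, 102, 110, 124])
```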

  20. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument

    Directory of Open Access Journals (Sweden)

    Lidwine B. Mokkink

    2016-01-01

    Background: COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments, both in research and in clinical practice, by developing tools for selecting the most appropriate available instrument. Method: In this paper these tools are described, i.e. the COSMIN taxonomy and definition of measurement properties; the COSMIN checklist to evaluate the methodological quality of studies on measurement properties; a search filter for finding studies on measurement properties; a protocol for systematic reviews of outcome measurement instruments; a database of systematic reviews of outcome measurement instruments; and a guideline for selecting outcome measurement instruments for Core Outcome Sets in clinical trials. Currently, we are updating the COSMIN checklist, particularly the standards for content validity studies. New standards for studies using Item Response Theory methods will also be developed. Additionally, in the future we want to develop standards for studies on the quality of non-patient-reported outcome measures, such as clinician-reported outcomes and performance-based outcomes. Conclusions: In summary, we plead for more standardization in the use of outcome measurement instruments, for conducting high quality systematic reviews on measurement instruments in which the best available outcome measurement instrument is recommended, and for stopping the use of poor outcome measurement instruments.

  1. International Comparison of FTIR measurements traceable to HITRAN and Gravimetrically prepared Gas Standards

    Science.gov (United States)

    Edgar, F.; Joële, V.; Philippe, M.; Faraz, I.; Robert, W.

    2009-12-01

    An international comparison is being organised to compare measurement results of gas concentration that are traceable to the HITRAN database with SI traceable values based on gravimetrically prepared gas standards. The results of the comparison will allow potential biases for specific gas species in the HITRAN database to be identified. The comparison is being performed on nitrogen dioxide (NO2), but the developed methodology is applicable to the majority of gas species in the HITRAN database. There is a high international priority attached to activities which reduce NOx in the atmosphere. The current level of permitted emissions is typically between 50 μmol/mol and 100 μmol/mol, but lower values are expected in the future. Currently, ambient air quality monitoring regulations also require the measurement of NOx mole fractions of 0.2 μmol/mol. To determine accurately the NO2 mole fraction at ambient levels, high quality primary gas references are desirable. However, since NO2 is a reactive gas, stability problems in gas standards below mole fractions of 100 μmol/mol, and particularly below 10 μmol/mol, have been observed and studied at the Bureau International des Poids et Mesures in recent years. An international comparison of NO2 measurement capabilities between National Metrology Institutes, coordinated by the Bureau International des Poids et Mesures, was initiated in June 2009. The comparison consists of two main protocols, CCQM-K74 and CCQM-P110. The CCQM-K74 protocol is designed to evaluate the level of comparability of the laboratories’ measurement capabilities for NO2 at a nominal mole fraction of 10 μmol/mol based on their own measurement capabilities. The CCQM-P110 protocol, open only to laboratories maintaining Fourier Transform Infrared Spectroscopy facilities, aims to evaluate the level of comparability of laboratories’ NO2 measurement capabilities based on FTIR spectroscopy traceable to primary

  2. The iMTA Productivity Cost Questionnaire: A Standardized Instrument for Measuring and Valuing Health-Related Productivity Losses.

    Science.gov (United States)

    Bouwmans, Clazien; Krol, Marieke; Severens, Hans; Koopmanschap, Marc; Brouwer, Werner; Hakkaart-van Roijen, Leona

    2015-09-01

    Productivity losses often contribute significantly to the total costs in economic evaluations adopting a societal perspective. Currently, no consensus exists on the measurement and valuation of productivity losses. We aimed to develop a standardized instrument for measuring and valuing productivity losses. A group of researchers with extensive experience in measuring and valuing productivity losses designed an instrument suitable for self-completion, building on preknowledge and evidence on validity. The instrument was designed to cover all domains of productivity losses, thus allowing quantification and valuation of all productivity losses. A feasibility study was performed to check the questionnaire's consistency and intelligibility. The iMTA Productivity Cost Questionnaire (iPCQ) includes three modules measuring productivity losses of paid work due to 1) absenteeism and 2) presenteeism and productivity losses related to 3) unpaid work. Questions for measuring absenteeism and presenteeism were derived from existing validated questionnaires. Because validated measures of losses of unpaid work are scarce, the questions of this module were newly developed. To enhance the instrument's feasibility, simple language was used. The feasibility study included 195 respondents (response rate 80%) older than 18 years. Seven percent (n = 13) identified problems while filling in the iPCQ, including problems with the questionnaire's instructions and routing (n = 6) and wording (n = 2). Five respondents experienced difficulties in estimating the time that would be needed for other people to make up for lost unpaid work. Most modules of the iPCQ are based on validated questions derived from previously available instruments. The instrument is understandable for most of the general public. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
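    The three iPCQ modules translate into a simple costing rule: absenteeism is valued at full lost hours, presenteeism at the efficiency shortfall while at work, and unpaid work at a replacement wage. The sketch below uses assumed parameter names and illustrative figures, not the iPCQ's actual scoring algorithm:

```python
def productivity_cost(absent_days, presenteeism_days, efficiency,
                      hours_per_day, wage, unpaid_hours, replacement_wage):
    """Hypothetical valuation mirroring the three iPCQ modules:
    absenteeism, presenteeism (efficiency in [0, 1] while ill),
    and unpaid work replaced at a market wage."""
    absenteeism = absent_days * hours_per_day * wage
    presenteeism = presenteeism_days * hours_per_day * (1.0 - efficiency) * wage
    unpaid = unpaid_hours * replacement_wage
    return absenteeism + presenteeism + unpaid

# Illustrative: 5 days absent, 4 days at half efficiency, 10 unpaid hours lost.
cost = productivity_cost(5, 4, 0.5, 8, 30.0, 10, 15.0)
```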

  3. Development of the Exam of GeoloGy Standards, EGGS, to Measure Students' Conceptual Understanding of Geology Concepts

    Science.gov (United States)

    Guffey, S. K.; Slater, T. F.; Slater, S. J.

    2017-12-01

    Discipline-based geoscience education researchers have considerable need for criterion-referenced, easy-to-administer and easy-to-score, conceptual diagnostic surveys for undergraduates taking introductory science survey courses in order for faculty to better be able to monitor the learning impacts of various interactive teaching approaches. To support ongoing discipline-based science education research to improve teaching and learning across the geosciences, this study establishes the reliability and validity of a 28-item, multiple-choice, pre- and post- Exam of GeoloGy Standards, hereafter simply called EGGS. The content knowledge EGGS addresses is based on 11 consensus concepts derived from a systematic, thematic analysis of the overlapping ideas presented in national science education reform documents including the Next Generation Science Standards, the AAAS Benchmarks for Science Literacy, the Earth Science Literacy Principles, and the NRC National Science Education Standards. Using community agreed upon best-practices for creating, field-testing, and iteratively revising modern multiple-choice test items using classical item analysis techniques, EGGS emphasizes natural student language over technical scientific vocabulary, leverages illustrations over students' reading ability, specifically targets students' misconceptions identified in the scholarly literature, and covers the range of topics most geology educators expect general education students to know at the end of their formal science learning experiences. The current version of EGGS is judged to be valid and reliable with college-level, introductory science survey students based on both standard quantitative and qualitative measures, including extensive clinical interviews with targeted students and systematic expert review.

  4. Synchronous Surface Pressure and Velocity Measurements of standard model in hypersonic flow

    Directory of Open Access Journals (Sweden)

    Zhijun Sun

    2018-01-01

    Full Text Available Experiments in the Hypersonic Wind Tunnel of NUAA (NHW) present synchronous measurements of the bow shock wave and surface pressure of a standard blunt rotary model (AGARD HB-2), carried out in order to measure the Mach-5 flow above a blunt body by PIV (Particle Image Velocimetry) as well as the unsteady pressure around the rotary body. Titanium dioxide (Al2O3) nanoparticles were seeded into the flow by a tailor-made container. The laser was guided into the vacuum experimental section along a carefully designed optical path. The transient pressure around the model was obtained using fast-responding pressure-sensitive paint (PSP) sprayed on the model. All the experimental facilities were controlled by a series pulse generator to ensure that the data were time-correlated. The PIV measurements of velocities in front of the detached bow shock agreed very well with the calculated value, differing by less than 3% from Pitot-pressure recordings. The velocity-gradient contour agreed with the detached bow shock shape seen in the schlieren images. The PSP results presented good agreement with the reference data from previous studies. Our work involving synchronous shock-wave and pressure measurements proved to be encouraging.

  5. Standardized Method for Measuring Collection Efficiency from Wipe-sampling of Trace Explosives.

    Science.gov (United States)

    Verkouteren, Jennifer R; Lawrence, Jeffrey A; Staymates, Matthew E; Sisco, Edward

    2017-04-10

    One of the limiting steps to detecting traces of explosives at screening venues is effective collection of the sample. Wipe-sampling is the most common procedure for collecting traces of explosives, and standardized measurements of collection efficiency are needed to evaluate and optimize sampling protocols. The approach described here is designed to provide this measurement infrastructure, and controls most of the factors known to be relevant to wipe-sampling. Three critical factors (the applied force, travel distance, and travel speed) are controlled using an automated device. Test surfaces are chosen based on similarity to the screening environment, and the wipes can be made from any material considered for use in wipe-sampling. Particle samples of the explosive 1,3,5-trinitroperhydro-1,3,5-triazine (RDX) are applied in a fixed location on the surface using a dry-transfer technique. The particle samples, recently developed to simulate residues made after handling explosives, are produced by inkjet printing of RDX solutions onto polytetrafluoroethylene (PTFE) substrates. Collection efficiency is measured by extracting collected explosive from the wipe, and then related to critical sampling factors and the selection of wipe material and test surface. These measurements are meant to guide the development of sampling protocols at screening venues, where speed and throughput are primary considerations.
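Collection efficiency as described reduces to the mass of explosive recovered from the wipe relative to the known mass dry-transferred onto the surface. A minimal sketch, with hypothetical masses (the specific values are illustrative, not from the study):

```python
# Collection efficiency for wipe-sampling: explosive extracted from the
# wipe relative to the known RDX mass deposited by dry transfer.
def collection_efficiency(mass_recovered_ng, mass_deposited_ng):
    """Return collection efficiency in percent; masses in nanograms."""
    if mass_deposited_ng <= 0:
        raise ValueError("deposited mass must be positive")
    return 100.0 * mass_recovered_ng / mass_deposited_ng

# e.g. 680 ng recovered from a 1000 ng dry-transfer deposit
print(collection_efficiency(680, 1000))  # percent
```

In a study like this, the efficiency would then be tabulated against the controlled factors (applied force, travel distance, travel speed) and the wipe/surface pairing.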

  6. Development, improvement and calibration of neutronic reaction rates measurements: elaboration of a standard techniques basis

    International Nuclear Information System (INIS)

    Hudelot, J.P.

    1998-06-01

    In order to improve and validate neutronics calculation schemes, refined integral measurements of neutronics parameters are necessary. This thesis focuses on the conception, improvement, and development of neutronics reaction rate measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first deals with direct measurements by fission chambers. A short presentation of the usual techniques is given; these are then illustrated through the example of doubling-time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices for fission chambers were developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high-flux neutron generator, based on discriminating neutron energy with a time-of-flight method. This second device will soon allow the mass of fission chambers to be measured with a precision of about 1%. Finally, the necessity of these calibrations is shown through spectral index measurements in the MISTRAL 1 (UO2) and MISTRAL 2 (MOX) cores of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, are validated. The goal of the second subject is to develop a method for measuring the modified conversion ratio of 238U (defined as the ratio of the 238U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results, and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1%. The extension of this technique to future modified conversion ratio measurements for 242Pu (on MOX rods) and 232Th (on

  7. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    OpenAIRE

    Péréfarres, Frédéric; Hoareau, Murielle; Chiroleu, Frédéric; Reynaud, Bernard; Dintinger, Jacques; Lett, Jean-Michel

    2011-01-01

    Abstract Background Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, inclu...

  8. Development of a standardized measure to assess food quality: a proof of concept.

    Science.gov (United States)

    Jomaa, L H; Hwalla, N C; Zidek, J M

    2016-11-09

    Food-based dietary guidelines are promoted to improve diet quality. In applying dietary recommendations, such as MyPlate, the number of servings in a food group is the unit of measure used to make food selections. However, within each food group, different foods can vary greatly in their nutritional quality despite often having similar energy (caloric) values. This study aimed to develop a novel unit of measure that accounts for both the quantity of energy and the quality of nutrients, as defined by caloric and micronutrient density, respectively, in foods, and to demonstrate its usability in identifying high-quality foods within a food group. A standardized unit of measure reflecting the quality of kilocalories for nutrition (qCaln) was developed through a mathematical function dependent on the energy content (kilocalories per 100 g) and micronutrient density of food items within a food group. The nutrient composition of 1806 food items was extracted from the USDA nutrient database. For each food item analyzed, a qCaln ratio was calculated to compare its qCaln to its caloric content. Finally, a case example was developed comparing two plates adapted from MyPlate. Examples of food items with the highest and lowest qCaln ratios were displayed for five food groups: vegetables, fruits/fruit juices, milk/dairy products, meats/meat alternatives, and breads/cereals. Additionally, the applicability of the qCaln was presented through a comparison of two plates adapted from the USDA MyPlate, to show differences in food quality. The newly developed qCaln measure can be used to rank foods in terms of their nutrient density while accounting for their energy content. The proposed metric can provide consumers, public health professionals, researchers, and policy makers with an easy-to-understand measure of food quality and a practical tool to assess diet quality among individuals and population groups.
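The abstract describes qCaln only as a function of energy content and micronutrient density, without giving the exact formula. A minimal sketch of one plausible quality-adjusted measure, assuming nutrient density is the mean capped ratio of nutrient content per 100 g to a daily value (the daily values and food data below are illustrative, not the authors' function):

```python
# Hypothetical sketch of a quality-of-kilocalories score: the exact qCaln
# function is not given in the abstract, so this assumes a simple
# mean-ratio nutrient density (each nutrient capped at 100% of its DV).
def nutrient_density(nutrients_per_100g, daily_values):
    """Mean of per-nutrient content / daily-value ratios, each capped at 1."""
    ratios = [min(nutrients_per_100g[k] / daily_values[k], 1.0)
              for k in daily_values]
    return sum(ratios) / len(ratios)

def qcaln_ratio(kcal_per_100g, nutrients_per_100g, daily_values):
    """Ratio of quality-adjusted calories to raw calories (here, the density)."""
    density = nutrient_density(nutrients_per_100g, daily_values)
    qcaln = kcal_per_100g * density  # calories scaled by nutrient density
    return qcaln / kcal_per_100g

dv = {"vit_c_mg": 90, "iron_mg": 18}        # illustrative daily values
spinach = {"vit_c_mg": 28, "iron_mg": 2.7}  # illustrative content per 100 g
print(round(qcaln_ratio(23, spinach, dv), 3))
```

Under any such formulation, ranking items within a food group by their qCaln ratio surfaces the nutrient-dense choices among foods of similar caloric value.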

  9. Measuring Cytotoxicity by Bioluminescence Imaging Outperforms the Standard Chromium-51 Release Assay

    Science.gov (United States)

    Karimi, Mobin A.; Lee, Eric; Bachmann, Michael H.; Salicioni, Ana Maria; Behrens, Edward M.; Kambayashi, Taku; Baldwin, Cynthia L.

    2014-01-01

    The chromium-release assay developed in 1968 is still the most commonly used method to measure cytotoxicity by T cells and by natural killer cells. Target cells are loaded in vitro with radioactive chromium, and lysis is determined by measuring the chromium released into the supernatant by dying cells. Since then, alternative methods have been developed using different markers of target cell viability that do not involve radioactivity. Here, we compared and contrasted a bioluminescence imaging (BLI)-based cytotoxicity assay to the standard radioactive chromium-release assay using an identical set of effector cells and tumor target cells. For this, we stably transduced several human and murine tumor cell lines to express luciferase. When co-cultured with cytotoxic effector cells, highly reproducible decreases in BLI were seen in an effector-to-target-cell dose-dependent manner. When compared to results obtained from the chromium-release assay, the performance of the BLI-based assay was superior because of its robustness, increased signal-to-noise ratio, and faster kinetics. The reduced/delayed detection of cytotoxicity by the chromium-release method was attributable to the association of chromium with structural components of the cell, which are released quickly by detergent solubilization but not by hypotonic lysis. We conclude that BLI-based measurement of cytotoxicity offers a superior non-radioactive alternative to the chromium-release assay that is more robust and quicker to perform. PMID:24586714
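Killing in a viability-signal assay such as BLI is commonly expressed as percent specific lysis relative to target cells cultured alone; the authors' exact normalization may differ, so this is a sketch of the usual form with hypothetical photon counts:

```python
# Percent specific lysis from a viability signal (e.g. bioluminescence):
# the drop in signal when effectors are added, relative to targets alone.
def percent_specific_lysis(signal_with_effectors, signal_targets_alone):
    """100 * (1 - treated/untreated); 0% = no killing, 100% = complete lysis."""
    return 100.0 * (1.0 - signal_with_effectors / signal_targets_alone)

# Hypothetical photon counts at increasing effector:target (E:T) ratios
targets_alone = 1.0e6
for et_ratio, signal in [(1, 9.0e5), (5, 6.0e5), (20, 2.5e5)]:
    print(et_ratio, round(percent_specific_lysis(signal, targets_alone), 1))
```

The dose dependence on E:T ratio described in the abstract corresponds to this lysis percentage rising as the ratio increases.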

  10. Blood pressure measurement in children: which method? which is the gold standard.

    Science.gov (United States)

    Vidal, Enrico; Murer, Luisa; Matteucci, Maria Chiara

    2013-01-01

    The burden of hypertension has become increasingly prevalent in children. Hypertension that begins in childhood can carry on into adulthood; therefore, early detection, accurate diagnosis, and effective therapy of high blood pressure may improve the long-term outcomes of children and adolescents. As far as pediatric hypertension is concerned, doubts still persist about the right instruments, modalities, and standards of reference that should be used in routine practice. Due to the dynamic process of growth and development, many physiological parameters undergo intensive change with age. Therefore, in children, the definition of hypertension cannot rely on a single blood pressure level but should be based on age- and height-specific percentiles. In this review, we introduce the nephrologist to the correct definition of high blood pressure in children. Moreover, we specifically address the main characteristics of different modalities for blood pressure measurement in children, focusing on practical aspects. The latest international guidelines and appropriate standards of reference for office, ambulatory, and home blood pressure data collection are presented. As clinicians are being faced with a greater number of children with hypertension, they should be aware of these peculiarities.

  11. Standard gasoline and ACPM - First year of the measure of prices liberation

    International Nuclear Information System (INIS)

    Unidad de Planeacion Minero Energetica, UPME

    2000-01-01

    At the beginning of 1999, amid a lingering period of low petroleum prices and relative stability in the exchange market, owing to high interest rate levels and an ample supply of foreign currency, the national government, through the Ministry of Mines and Energy, decided to liberalize prices for standard gasoline and ACPM. At that time, the economic policy of the state aimed to maintain the flow of revenues to the nation, create incentives for investment of national and international private capital, eliminate public-sector subsidies, allocate public expenditure more effectively, build a competitive environment, and protect the real wages of Colombians. Specifically, the price-liberalization measure for standard gasoline and ACPM modified two components of the price structure, the producer's income (EP) and the retailer margin, in order to rationalize the finances of Ecopetrol; to achieve a better allocation of public resources by eliminating a scheme of subsidies to high-income strata; to promote the specialization of the state company in exploration; to eliminate the effect that expectations of hydrocarbon price increases had on the consumer price index (CPI); to foster free competition along the production and distribution chain; and to improve the quality and coverage of the service by encouraging investment.

  12. Development of a portable transfer standard for the neutron source yield measurement using Fluka simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Meghnath, E-mail: meghms@barc.gov.in; Sathian, V.; Shobha, G.; Sujatha, P.N.; Yashoda, S.; Kulkarni, M.S.; Babu, D.A.R.

    2015-10-01

    A novel lightweight portable transfer standard has been developed for the standardization of laboratory neutron sources, suited to in-situ measurements. Fluka simulation was used to optimize the various design parameters such that the system has an almost constant efficiency over the neutron energy range from a few keV to 10 MeV, and also for the commonly available laboratory neutron sources. The optimized total length and radius of the system are 39.4 cm and 13.5 cm, respectively, and the weight of the system is about 22 kg. The efficiency of the system obtained from experiment for four laboratory neutron sources ({sup 241}Am–Be, {sup 241}Am–B, {sup 252}Cf and {sup 241}Am–F) is constant to within ±8%, with an average value of 2.85 counts/n.cm{sup 2}. The efficiency of the system was also calculated from Fluka simulation: for mono-energetic neutrons from 1 MeV to 10 MeV the efficiency was found to be constant to within ±10%, with an average value of 2.9 counts/n.cm{sup 2}, and for the four neutron sources mentioned above it is likewise constant to within ±4% with the same average value.
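An efficiency quoted in counts per unit fluence (counts per n/cm²) links a measured count rate back to the source yield. A minimal sketch assuming point-source geometry without room scatter (the count rate and distance below are illustrative, not from the study):

```python
import math

# Relating a measured count rate to source yield through an efficiency in
# counts per (n/cm^2). Point-source, scatter-free geometry is assumed.
def fluence_rate(yield_n_per_s, distance_cm):
    """Free-field fluence rate (n/cm^2/s) at distance d from a point source."""
    return yield_n_per_s / (4.0 * math.pi * distance_cm ** 2)

def source_yield(count_rate_cps, efficiency_counts_per_fluence, distance_cm):
    """Invert the calibration: yield = 4*pi*d^2 * (count rate / efficiency)."""
    phi = count_rate_cps / efficiency_counts_per_fluence  # fluence rate
    return phi * 4.0 * math.pi * distance_cm ** 2

# e.g. the reported average efficiency of 2.85 counts per n/cm^2,
# a detector at 100 cm, and a hypothetical count rate of 500 cps
print(f"{source_yield(500, 2.85, 100):.3e}")  # neutrons per second
```

Because the efficiency is nearly constant across source spectra, a single calibration factor serves for all four laboratory sources, which is what makes the device usable as a transfer standard.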

  13. 48 CFR 9904.412 - Cost accounting standard for composition and measurement of pension cost.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Cost accounting standard... Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.412 Cost...

  14. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    International Nuclear Information System (INIS)

    Cuerva, A.

    1997-01-01

    Several groups are working on improving the existing standard and recommendations on WECS power performance measurement and analysis. One of them, besides the group working on this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 [9], to current requirements on technical quality and accuracy. Within this group and the MEASNET one, many deficiencies have been identified in the procedure followed to date. Several concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones relate to the inherent characteristics of complex terrain, specifically the issue of site calibration and the uncertainties it introduces. (Author)

  15. Including indigestible carbohydrates in the evening meal of healthy subjects improves glucose tolerance, lowers inflammatory markers, and increases satiety after a subsequent standardized breakfast

    DEFF Research Database (Denmark)

    Nilsson, Anne C; Ostman, Elin M; Holst, Jens Juul

    2008-01-01

    Low-glycemic index (GI) foods and foods rich in whole grain are associated with reduced risk of type 2 diabetes and cardiovascular disease. We studied the effect of cereal-based bread evening meals (50 g available starch), varying in GI and content of indigestible carbohydrates, on glucose...... (GLP-1), serum interleukin (IL)-6, serum IL-8, and plasma adiponectin. Satiety was subjectively rated after breakfast and the gastric emptying rate (GER) was determined using paracetamol as a marker. Breath hydrogen was measured as an indicator of colonic fermentation. Evening meals with barley kernel......-kernel bread compared with WWB. Breath hydrogen correlated positively with satiety (r = 0.27; P carbohydrates of the evening meal may affect glycemic excursions and related metabolic risk variables at breakfast...

  16. Towards standardized measurement of adverse events in spine surgery: conceptual model and pilot evaluation

    Directory of Open Access Journals (Sweden)

    Deyo Richard A

    2006-06-01

    Full Text Available Abstract Background Independent of efficacy, information on safety of surgical procedures is essential for informed choices. We seek to develop standardized methodology for describing the safety of spinal operations and apply these methods to study lumbar surgery. We present a conceptual model for evaluating the safety of spine surgery and describe development of tools to measure principal components of this model: (1) specifying outcome by explicit criteria for adverse event definition, mode of ascertainment, cause, severity, or preventability, and (2) quantitatively measuring predictors such as patient factors, comorbidity, severity of degenerative spine disease, and invasiveness of spine surgery. Methods We created operational definitions for 176 adverse occurrences and established multiple mechanisms for reporting them. We developed new methods to quantify the severity of adverse occurrences, degeneration of lumbar spine, and invasiveness of spinal procedures. Using kappa statistics and intra-class correlation coefficients, we assessed agreement for the following: four reviewers independently coding etiology, preventability, and severity for 141 adverse occurrences, two observers coding lumbar spine degenerative changes in 10 selected cases, and two researchers coding invasiveness of surgery for 50 initial cases. Results During the first six months of prospective surveillance, rigorous daily medical record reviews identified 92.6% of the adverse occurrences we recorded, and voluntary reports by providers identified 38.5% (surgeons reported 18.3%, inpatient rounding team reported 23.1%, and conferences discussed 6.1%). Trained observers had fair agreement in classifying etiology of 141 adverse occurrences into 18 categories (kappa = 0.35), but agreement was substantial (kappa ≥ 0.61) for 4 specific categories: technical error, failure in communication, systems failure, and no error. 
Preventability assessment had moderate agreement (mean weighted
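The agreement figures reported above are Cohen's kappa values, which correct observed agreement for agreement expected by chance. A minimal sketch of the computation, using hypothetical reviewer codings (not the study's data):

```python
from collections import Counter

# Cohen's kappa: chance-corrected agreement between two raters.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in ca) / n ** 2            # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical codings of six adverse occurrences by two reviewers
a = ["technical", "communication", "systems", "none", "technical", "systems"]
b = ["technical", "communication", "none",    "none", "technical", "systems"]
print(round(cohens_kappa(a, b), 3))
```

By the usual Landis-Koch benchmarks, kappa between 0.21 and 0.40 is "fair" and 0.61 or above is "substantial", which is how the abstract characterizes its results.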

  17. GUIDELINES FOR THE DESIGN OF INSTRUMENTS OF COMPETENCE ASSESSMENT FROM MEASURABLE LEARNING STANDARDS

    Directory of Open Access Journals (Sweden)

    Ignacio Polo Martínez

    2015-06-01

    Full Text Available Evaluation by itself is not enough to improve education, although it is necessary. There are no magic solutions to educational problems through assessment. Nevertheless, evaluation, together with methodology, should be treated as a determinant of educational quality and improvement. Surely many teachers agree that it is not easy to design assessment instruments that are consistent with competence-based teaching and learning and, in turn, with the rigorous curriculum and assessment standards of the Autonomous Community. However, this difficulty should not be taken as an impossibility. This article aims to contribute to an improvement plan for the design of evaluation instruments across different areas or subjects. Such a design should address at least the following: what I want to evaluate (referring to the assessment criteria and measurable learning standards included in the instrument), how I will evaluate (referring to assessment situations linked to a context that is close to and known by the student), how I will grade (preserving the students' right to an objective evaluation), and what consequences the results will have for the different agents that make up the educational community.

  18. Evaluation of template-based models in CASP8 with standard measures

    KAUST Repository

    Cozzetto, Domenico

    2009-01-01

    The strategy for evaluating template-based models submitted to CASP has continuously evolved from CASP1 to CASP5, leading to a standard procedure that has been used in all subsequent editions. The established approach includes methods for calculating the quality of each individual model, for assigning scores based on the distribution of the results for each target and for computing the statistical significance of the differences in scores between prediction methods. These data are made available to the assessor of the template-based modeling category, who uses them as a starting point for further evaluations and analyses. This article describes the detailed workflow of the procedure, provides justifications for a number of choices that are customarily made for CASP data evaluation, and reports the results of the analysis of template-based predictions at CASP8.
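The abstract mentions "assigning scores based on the distribution of the results for each target". One common realization of this idea (not necessarily the assessors' exact formula) is a per-target z-score of each group's model quality, so that easy and hard targets contribute comparably. A sketch with hypothetical GDT_TS values:

```python
import statistics

# Per-target standardization of model-quality scores: each group's score
# is expressed relative to the distribution of all groups on that target.
# The GDT_TS values below are hypothetical.
def per_target_z(scores):
    """Sample z-scores of one target's scores across groups."""
    mu = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [(s - mu) / sd for s in scores]

gdt_ts = [72.4, 65.0, 80.1, 58.3]  # four groups, one target
z = per_target_z(gdt_ts)
print([round(v, 2) for v in z])
```

Summing such per-target z-scores across all targets gives a ranking in which no single easy or hard target dominates; CASP assessments additionally apply floors to strongly negative z-scores, a refinement omitted here.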

  19. Alcoholism in the Families of Origin of MSW Students: Estimating the Prevalence of Mental Health Problems Using Standardized Measures.

    Science.gov (United States)

    Hawkins, Catherine A.; Hawkins, Raymond C., II

    1996-01-01

    A 1991 study of 136 graduate social work students determined students' status as adult children of alcoholics (ACAs) by self-report and standardized screening test scores, and evaluated mental health functioning with four standardized measures. Results found that 47% of the social work students were ACAs, and not all (or only) ACAs were vulnerable…

  20. Validation of ultrasonographic muscle thickness measurements as compared to the gold standard of computed tomography in dogs

    Directory of Open Access Journals (Sweden)

    Lindsey E. Bullen

    2017-01-01

    Full Text Available Objective The objective was to quantitatively evaluate the validity of ultrasonographic (US) muscle measurements as compared to the gold standard of computed tomography (CT) in the canine. Design This was a prospective study. Population Twenty-five client-owned dogs scheduled for CT as part of a diagnostic work-up for the management of their primary disease process were included. Materials and Methods Specific appendicular (cubital flexors and extensors, coxofemoral flexors and extensors) and axial (temporalis, supraspinatus, infraspinatus, lumbar epaxials) muscle groups were selected for quantitative measurement based on CT planning and patient position. Prior to the CT scan, the skin over the muscle sites was shaved and marked with a permanent marker. Patient body position was determined based on the patient's CT plan; positioning was consistent between CT and US imaging. To ensure identical imaging position for both CT and US measurements, radio-opaque fiducial markers were placed directly over the skin marks once the dog was positioned. Quantitative measurements (cm) for both lean muscle mass (LMM) and subcutaneous adipose (SQA) were recorded. Statistical comparisons between CT and US values were done separately for each site and type. Results Muscle groups and associated SQA measured by US and CT were not statistically different based on an adjusted p-value using Bonferroni's correction (p > 0.897 for LMM and > 0.8289 for SQA). Conclusions Ultrasound imaging of selected appendicular and axial muscle groups in dogs can provide comparable assessment of muscle thickness to the current gold standard, CT. In consideration of both statistical reliability relative to CT and cage-side accessibility, the temporalis, supraspinatus, infraspinatus, and lumbar epaxial LMM sites are considered the most useful targets for US LMM assessment in the canine. Our findings support the potential utility of US as a clinical tool in veterinary medicine to assess LMM status in patients

  1. Technology development for evaluation of operational quantities and measurement standard in radiation protection

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Si Young; Lee, T. Y.; Kim, J. L.; Kim, B. H.; Chung, K. K.; Lee, J. I.; Park, T. S.; Ha, S. H.; Oh, P. J.; Jun, K. J

    1999-03-01

    A study on the fabrication of a new personal thermoluminescence dosimeter, which can evaluate the personal dose equivalent H{sub p}(d), has been performed. Optimum conditions for fabrication of a LiF:Mg,Cu,Na,Si TL phosphor powder were determined, and a disc-type TL pellet was fabricated from this powder. Another TL material, of the CaSO{sub 4}:Dy,Mo type, was also fabricated. These two TL materials showed greater TL sensitivity than foreign-made commercial TL materials. Mono-energetic fluorescence X-rays from 8.6 response have been constructed and evaluated for purity, air kerma, beam uniformity and distribution, and the scattered fraction of X-rays. A free-air ionization chamber for the absolute measurement of air kerma in the medium X-ray range was designed and constructed. Experimental results showed that the homemade chamber leaves nothing to be desired compared with the national standard chambers of other advanced countries. A gas proportional counting system was designed and constructed for absolute activity measurements of gaseous radionuclides. Unattached fractions of radon progeny were evaluated in a characterization study on the detection of radon progeny.

  2. A framework for the definition of standardized protocols for measuring upper-extremity kinematics.

    Science.gov (United States)

    Kontaxis, A; Cutti, A G; Johnson, G R; Veeger, H E J

    2009-03-01

    Increasing interest in upper extremity biomechanics has led to closer investigations of both segment movements and detailed joint motion. Unfortunately, conceptual and practical differences among the motion analysis protocols used to date reduce compatibility for post-hoc and cross-validation analyses and so weaken the body of knowledge. This difficulty highlights the need for standardised protocols, each addressing a set of questions of comparable content. The aim of this work is therefore to open a discussion and propose a flexible framework to support: (1) the definition of standardised protocols, (2) a standardised description of these protocols, and (3) the formulation of general recommendations. The proposed framework is composed of two nested flowcharts. The first defines what a motion analysis protocol is by pointing out its role in a motion analysis study. The second describes the steps to build a protocol, which requires decisions on the joints or segments to be investigated and the description of their mechanical equivalent model, the definition of the anatomical or functional coordinate frames, the choice of marker or sensor configuration and the validity of their use, the definition of the activities to be measured, and the refinements that can be applied to the final measurements. Finally, general recommendations are proposed for each of the steps based on the current literature, and open issues are highlighted for future investigation and standardisation. Standardisation of motion analysis protocols is urgent; the proposed framework can guide this process through rationalisation of the approach.

  3. Measurement uncertainty of lesion and reference mediastinum standardized uptake value in lung cancer.

    Science.gov (United States)

    Laffon, Eric; Milpied, Noel; Marthan, Roger

    2017-06-01

    To assess the standardized uptake value (SUV) measurement uncertainty (MU) of lung cancer lesions with uptake greater than the mediastinum but less than or equal to the liver, and that of the mediastinum blood pool, and to compare lesion SUV with mediastinum SUV by assessing the MU of their ratio. Dynamic PET data involving 10 frames were retrospectively analyzed in 10 patients, yielding the maximal SUV of 25 lesions (Lesion-SUVmax), 10 mediastinum SUVs, either maximal or mean (Med-SUVmax, Med-SUVmean), 25 Rmax ratios (= Lesion-SUVmax/Med-SUVmax), and 25 Rmean ratios (= Lesion-SUVmax/Med-SUVmean). A mean coefficient of variation was calculated for each parameter, leading to a relative measurement uncertainty (MUr) for each. The MU of Rmax was found to involve the MU of both Lesion-SUVmax and Med-SUVmax: MUr = 33.3%, 23.3%, and 21.9%, respectively (95% confidence level). No significant difference in MUr was found between Med-SUVmax and Med-SUVmean, or between Rmax and Rmean. Comparison between target lesion SUV and reference mediastinum SUV must take into account the SUV MU of both. Therefore, no MU reduction can be expected from using the lesion/mediastinum SUVmax ratio instead of Lesion-SUVmax. Moreover, no MU reduction can be expected from using the mean mediastinum SUV instead of the maximal one.
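The MUr values above derive from coefficients of variation across repeated frames, scaled to a 95% confidence level. A minimal sketch with hypothetical SUV samples, using the common independence assumption of quadrature propagation for the ratio (the paper's own Rmax MU is computed from the ratio samples directly, so it need not follow simple quadrature):

```python
import math
import statistics

# Relative measurement uncertainty (percent, 95% level) from repeated
# measurements, and quadrature propagation for a ratio of two SUVs.
# Quadrature assumes the numerator and denominator are independent.
# The SUV samples below are hypothetical.
def relative_mu(samples, k=1.96):
    """k * coefficient of variation, in percent."""
    cv = statistics.stdev(samples) / statistics.mean(samples)
    return k * cv * 100.0

def ratio_mu(mu_num_pct, mu_den_pct):
    """Relative MU of a ratio, combining components in quadrature."""
    return math.hypot(mu_num_pct, mu_den_pct)

lesion_suv = [3.1, 3.4, 3.0, 3.3, 3.2]  # Lesion-SUVmax across frames
med_suv = [1.6, 1.7, 1.5, 1.6, 1.6]     # Med-SUVmax across frames
mu_l, mu_m = relative_mu(lesion_suv), relative_mu(med_suv)
print(round(ratio_mu(mu_l, mu_m), 1))  # percent MU of Lesion/Mediastinum
```

Because the ratio's MU combines both component uncertainties, it is always at least as large as either alone, which is the abstract's point: normalizing to the mediastinum does not reduce uncertainty.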

  4. Towards a Quantitative Performance Measurement Framework to Assess the Impact of Geographic Information Standards

    Science.gov (United States)

    Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.

    2012-12-01

    Over the last decennia, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even though much spatial data and related information exist, data sets are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain, and prepare them for use in applications. Spatial Data Infrastructures (SDI) have therefore been developed to enhance the access, use, and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the nineties, many SDI initiatives have seen the light. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes require technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. The objective of the research is therefore to develop a quantitative framework to assess the impact of GI standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research builds upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2005). 
Key to reach the research objectives

  5. Computerized tomography magnified bone windows are superior to standard soft tissue windows for accurate measurement of stone size: an in vitro and clinical study.

    Science.gov (United States)

    Eisner, Brian H; Kambadakone, Avinash; Monga, Manoj; Anderson, James K; Thoreson, Andrew A; Lee, Hang; Dretler, Stephen P; Sahani, Dushyant V

    2009-04-01

    We determined the most accurate method of measuring urinary stones on computerized tomography. For the in vitro portion of the study 24 calculi, including 12 calcium oxalate monohydrate and 12 uric acid stones, that had been previously collected at our clinic were measured manually with hand calipers as the gold standard measurement. The calculi were then embedded into human kidney-sized potatoes and scanned using 64-slice multidetector computerized tomography. Computerized tomography measurements were performed at 4 window settings, including standard soft tissue windows (window width-320 and window length-50), standard bone windows (window width-1120 and window length-300), 5.13x magnified soft tissue windows and 5.13x magnified bone windows. Maximum stone dimensions were recorded. For the in vivo portion of the study 41 patients with distal ureteral stones who underwent noncontrast computerized tomography and subsequently spontaneously passed the stones were analyzed. All analyzed stones were 100% calcium oxalate monohydrate or mixed, calcium based stones. Stones were prospectively collected at the clinic and the largest diameter was measured with digital calipers as the gold standard. This was compared to computerized tomography measurements using 4.0x magnified soft tissue windows and 4.0x magnified bone windows. Statistical comparisons were performed using Pearson's correlation and paired t test. In the in vitro portion of the study the most accurate measurements were obtained using 5.13x magnified bone windows with a mean 0.13 mm difference from caliper measurement (p = 0.6). Measurements performed in the soft tissue window with and without magnification, and in the bone window without magnification were significantly different from hand caliper measurements (mean difference 1.2, 1.9 and 1.4 mm, p = 0.003, window settings with magnification. For uric acid calculi the measurement error was observed only in standard soft tissue window settings. In vivo 4.0x

  6. Measuring the Hubble constant with Type Ia supernovae as near-infrared standard candles

    Science.gov (United States)

    Dhawan, Suhail; Jha, Saurabh W.; Leibundgut, Bruno

    2018-01-01

    The most precise local measurements of H0 rely on observations of Type Ia supernovae (SNe Ia) coupled with Cepheid distances to SN Ia host galaxies. Recent results have shown tension comparing H0 to the value inferred from CMB observations assuming ΛCDM, making it important to check for potential systematic uncertainties in either approach. To date, precise local H0 measurements have used SN Ia distances based on optical photometry, with corrections for light curve shape and colour. Here, we analyse SNe Ia as standard candles in the near-infrared (NIR), where luminosity variations in the supernovae and extinction by dust are both reduced relative to the optical. From a combined fit to 9 nearby calibrator SNe with host Cepheid distances from Riess et al. (2016) and 27 SNe in the Hubble flow, we estimate the absolute peak J magnitude MJ = -18.524 ± 0.041 mag and H0 = 72.8 ± 1.6 (statistical) ±2.7 (systematic) km s-1 Mpc-1. The 2.2% statistical uncertainty demonstrates that the NIR provides a compelling avenue to measuring SN Ia distances, and for our sample the intrinsic (unmodeled) peak J magnitude scatter is just 0.10 mag, even without light curve shape or colour corrections. Our results do not vary significantly with different sample selection criteria, though photometric calibration in the NIR may be a dominant systematic uncertainty. Our findings suggest that tension in the competing H0 distance ladders is likely not a result of supernova systematics that could be expected to vary between optical and NIR wavelengths, like dust extinction. We anticipate further improvements in H0 with a larger calibrator sample of SNe Ia with Cepheid distances, more Hubble flow SNe Ia with NIR light curves, and better use of the full NIR photometric data set beyond simply the peak J-band magnitude.

  7. Customisation and Desirable Characteristics of a Standard Method of Measurement for Building Works in Ghana

    Directory of Open Access Journals (Sweden)

    Gabriel Nani

    2012-11-01

    Full Text Available This paper reports a study that identified and categorised the modifications to the 5th Edition of the British Standard Method of Measurement (SMM5) of building works in Ghana. Typical modifications involved 'cost insignificant items', 'minor labour items', 'custom units of measurement', 'method related items', 'combinable items', 'subordinate items', and 'items of minor informative impact'. It was also observed that the desirable characteristics/qualities of standard methods of measurement (SMM) of building work were noteworthy, since they provide insight into the nature of a SMM required for the construction industry in Ghana. The research reviewed available literature, various SMMs and bills of quantities (BQs). The relevance of the modifications and SMM characteristics identified was confirmed by a survey of the opinions of professional quantity surveyors conducted through a carefully designed questionnaire. Inferences from the opinion survey formed the basis for grouping both the SMM modifications found and the desired qualities of a SMM for Ghana. Survey respondents confirmed all the identified modifications to the British SMM, except for the elimination of items of minor informative impact: it was held that all information was relevant in measurement. Desirable characteristics of a SMM were rated in decreasing order of relevance as: easy location of items; cost significance; simplicity; thoroughness; ease of cost analysis; good practice; conciseness; adoptability; precision; industry practice; stakeholders' opinion; custom classification; regional relevance; and inclusion of jargon. It was noted that the relevance of these characteristics may vary from one region to another as a result of technological, cultural and legal differences. However, the desired SMM characteristics were recommended as fundamental in developing an appropriate SMM for Ghana.

  8. Improving health, safety and energy efficiency in New Zealand through measuring and applying basic housing standards.

    Science.gov (United States)

    Gillespie-Bennett, Julie; Keall, Michael; Howden-Chapman, Philippa; Baker, Michael G

    2013-08-02

    Substandard housing is a problem in New Zealand. Historically there has been little recognition of the important aspects of housing quality that affect people's health and safety. In this viewpoint article we outline the importance of assessing these factors as an essential step to improving the health and safety of New Zealanders and household energy efficiency. A practical risk assessment tool adapted to New Zealand conditions, the Healthy Housing Index (HHI), measures the physical characteristics of houses that affect the health and safety of the occupants. This instrument is also the only tool that has been validated against health and safety outcomes and reported in the international peer-reviewed literature. The HHI provides a framework on which a housing warrant of fitness (WOF) can be based. The HHI inspection takes about one hour to conduct and is performed by a trained building inspector. To maximise the effectiveness of this housing quality assessment we envisage the output having two parts. The first would be a pass/fail WOF assessment showing whether or not the house meets basic health, safety and energy efficiency standards. The second component would rate each main assessment area (health, safety and energy efficiency), potentially on a five-point scale. This WOF system would establish a good minimum standard for rental accommodation as well as encouraging improved housing performance over time. In this article we argue that the HHI is an important, validated housing assessment tool that will improve housing quality, leading to better health of the occupants, reduced home injuries, and greater energy efficiency. If required, this tool could be extended to also cover resilience to natural hazards, broader aspects of sustainability, and the suitability of the dwelling for occupants with particular needs.

  9. Comparison between predicted duct effectiveness from proposed ASHRAE Standard 152P and measured field data for residential forced air cooling systems; TOPICAL

    International Nuclear Information System (INIS)

    Siegel, Jeffrey A.; McWilliams, Jennifer A.; Walker, Iain S.

    2002-01-01

    The proposed ASHRAE Standard 152P ''Method of Test for Determining the Design and Seasonal Efficiencies of Residential Thermal Distribution Systems'' (ASHRAE 2002) has recently completed its second public review. As part of the standard development process, this study compares the forced air distribution system ratings provided by the public review draft of Standard 152P to measured field results. Fifty-eight field tests were performed on cooling systems in 11 homes in the summers of 1998 and 1999. Seven of these houses had standard attics with insulation on the attic floor and a well-vented attic space. The other four houses had unvented attics where the insulation is placed directly under the roof deck and the attic space is not deliberately vented. Each house was tested under a range of summer weather conditions at each particular site, and in some cases the amount of duct leakage was intentionally varied. The comparison between 152P predicted efficiencies and the measured results includes evaluation of the effects of weather, duct location, thermal conditions, duct leakage, and system capacity. The results showed that the difference between measured delivery effectiveness and that calculated using proposed Standard 152P is about 5 percentage points if weather data, duct leakage and air handler flow are well known. However, the accuracy of the standard is strongly dependent on having good measurements of duct leakage and system airflow. Given that the uncertainty in the measured delivery effectiveness is typically also about 5 percentage points, the Standard 152P results are acceptably close to the measured data.

  10. Standards for Measurements in the Field of High Frequency Electromagnetic Radiation for the Purpose of Protection Against Adverse Health Effects

    International Nuclear Information System (INIS)

    Tanatarec, B.; Nikolic, N.

    2011-01-01

    In this paper, standards for measurements in the field of high-frequency electromagnetic radiation are described with a view to protection against its hazardous effects. Besides the standards which directly deal with high-frequency electromagnetic radiation measurements, guidelines which describe the hazardous influence of high-frequency electromagnetic radiation on the human body in the form of the specific absorption rate (SAR) are given. Special attention is dedicated to standards and regulations dealing with social responsibility, in particular social responsibility in the field of high-frequency radiation. This area is new, insufficiently known, and rarely considered in everyday life. (author)

  11. Standardization of vitrinite reflectance measurements in shale petroleum systems: How accurate are my Ro data?

    Science.gov (United States)

    Hackley, Paul C.

    2014-01-01

    Vitrinite reflectance generally is considered the most robust thermal maturity parameter available for application to hydrocarbon exploration and petroleum system evaluation. However, until 2011 there was no standardized methodology available to provide guidelines for vitrinite reflectance measurements in shale. Efforts to correct this deficiency resulted in publication of ASTM D7708-11: Standard test method for microscopical determination of the reflectance of vitrinite dispersed in sedimentary rocks. In 2012-2013, an interlaboratory exercise was conducted to establish precision limits for the measurement technique. Six samples, representing a wide variety of shale, were tested in duplicate by 28 analysts in 22 laboratories from 14 countries. Samples ranged from immature to overmature (Ro 0.31-1.53%), from organic-rich to organic-lean (1-22 wt.% total organic carbon), and contained Type I (lacustrine), Type II (marine), and Type III (terrestrial) kerogens. Repeatability values (difference between repetitive results from same operator, same conditions) ranged from 0.03-0.11% absolute reflectance, whereas reproducibility values (difference between results obtained on same test material by different operators, different laboratories) ranged from 0.12-0.54% absolute reflectance. Repeatability and reproducibility degraded consistently with increasing maturity and decreasing organic content. However, samples with terrestrial kerogens (Type III) fell off this trend, showing improved levels of reproducibility due to higher vitrinite content and improved ease of identification. Operators did not consistently meet the reporting requirements of the test method, indicating that a common reporting template is required to improve data quality. The most difficult problem encountered was the petrographic distinction of solid bitumens and low-reflecting inert macerals from vitrinite when vitrinite occurred with reflectance ranges overlapping the other components. 
Discussion among

  12. Standardization of reflectance measurements in dispersed organic matter: results of an exercise to improve interlaboratory agreement

    Science.gov (United States)

    Hackley, Paul C.; Araujo, Carla Viviane; Borrego, Angeles G.; Bouzinos, Antonis; Cardott, Brian; Cook, Alan C.; Eble, Cortland; Flores, Deolinda; Gentzis, Thomas; Gonçalves, Paula Alexandra; Filho, João Graciano Mendonça; Hámor-Vidó, Mária; Jelonek, Iwona; Kommeren, Kees; Knowles, Wayne; Kus, Jolanta; Mastalerz, Maria; Menezes, Taíssa Rêgo; Newman, Jane; Pawlewicz, Mark; Pickel, Walter; Potter, Judith; Ranasinghe, Paddy; Read, Harold; Reyes, Julito; Rodriguez, Genaro De La Rosa; de Souza, Igor Viegas Alves Fernandes; Suarez-Ruiz, Isabel; Sýkorová, Ivana; Valentine, Brett J.

    2015-01-01

    Vitrinite reflectance generally is considered the most robust thermal maturity parameter available for application to hydrocarbon exploration and petroleum system evaluation. However, until 2011 there was no standardized methodology available to provide guidelines for vitrinite reflectance measurements in shale. Efforts to correct this deficiency resulted in publication of ASTM D7708: Standard test method for microscopical determination of the reflectance of vitrinite dispersed in sedimentary rocks. In 2012-2013, an interlaboratory exercise was conducted to establish precision limits for the D7708 measurement technique. Six samples, representing a wide variety of shale, were tested in duplicate by 28 analysts in 22 laboratories from 14 countries. Samples ranged from immature to overmature (0.31-1.53% Ro), from organic-lean to organic-rich (1-22 wt.% total organic carbon), and contained Type I (lacustrine), Type II (marine), and Type III (terrestrial) kerogens. Repeatability limits (maximum difference between valid repetitive results from same operator, same conditions) ranged from 0.03-0.11% absolute reflectance, whereas reproducibility limits (maximum difference between valid results obtained on same test material by different operators, different laboratories) ranged from 0.12-0.54% absolute reflectance. Repeatability and reproducibility limits degraded consistently with increasing maturity and decreasing organic content. However, samples with terrestrial kerogens (Type III) fell off this trend, showing improved levels of reproducibility due to higher vitrinite content and improved ease of identification. Operators did not consistently meet the reporting requirements of the test method, indicating that a common reporting template is required to improve data quality. 
The most difficult problem encountered was the petrographic distinction of solid bitumens and low-reflecting inert macerals from vitrinite when vitrinite occurred with reflectance ranges overlapping
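    The repeatability and reproducibility limits quoted above follow the usual interlaboratory-precision convention, in which a pooled standard deviation is multiplied by a factor of about 2.8 (≈1.96·√2) to give a 95% limit on the difference between two results. The following sketch illustrates that arithmetic on made-up duplicate Ro measurements; the data, variable names, and helper function are assumptions for illustration, not values from the exercise.

    ```python
    # E691-style precision limit: 2.8 * standard deviation gives a 95% bound
    # on the absolute difference between two valid results.
    def precision_limit(std_dev):
        return 2.8 * std_dev

    # Hypothetical duplicate Ro measurements (%), same operator, one sample.
    duplicates = [(0.52, 0.55), (0.50, 0.54), (0.53, 0.51)]

    # Pooled within-pair (repeatability) standard deviation: s_r = sqrt(sum(d^2) / (2n)).
    s_r = (sum((a - b) ** 2 for a, b in duplicates) / (2 * len(duplicates))) ** 0.5
    r_limit = precision_limit(s_r)  # repeatability limit in absolute % reflectance
    ```

    With real exercise data, the same pooling is done per sample and per laboratory to obtain the reproducibility limit as well.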

  13. Standard practice of calibration of force-measuring instruments for verifying the force indication of testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2006-01-01

    1.1 The purpose of this practice is to specify procedures for the calibration of force-measuring instruments. Procedures are included for the following types of instruments: 1.1.1 Elastic force-measuring instruments, and 1.1.2 Force-multiplying systems, such as balances and small platform scales. Note 1: Verification by deadweight loading is also an acceptable method of verifying the force indication of a testing machine. Tolerances for weights for this purpose are given in Practices E 4; methods for calibration of the weights are given in NIST Technical Note 577, Methods of Calibrating Weights for Piston Gages. 1.2 The values stated in SI units are to be regarded as the standard. Other metric and inch-pound values are regarded as equivalent when required. 1.3 This practice is intended for the calibration of static force measuring instruments. It is not applicable for dynamic or high speed force calibrations, nor can the results of calibrations performed in accordance with this practice be assumed valid for...

  14. Quantitative optical microscopy: measurement of cellular biophysical features with a standard optical microscope.

    Science.gov (United States)

    Phillips, Kevin G; Baker-Groberg, Sandra M; McCarty, Owen J T

    2014-04-07

    We describe the use of a standard optical microscope to perform quantitative measurements of mass, volume, and density on cellular specimens through a combination of bright field and differential interference contrast imagery. Two primary approaches are presented: noninterferometric quantitative phase microscopy (NIQPM), to perform measurements of total cell mass and subcellular density distribution, and Hilbert transform differential interference contrast microscopy (HTDIC) to determine volume. NIQPM is based on a simplified model of wave propagation, termed the paraxial approximation, with three underlying assumptions: low numerical aperture (NA) illumination, weak scattering, and weak absorption of light by the specimen. Fortunately, unstained cellular specimens satisfy these assumptions and low NA illumination is easily achieved on commercial microscopes. HTDIC is used to obtain volumetric information from through-focus DIC imagery under high NA illumination conditions. High NA illumination enables enhanced sectioning of the specimen along the optical axis. Hilbert transform processing on the DIC image stacks greatly enhances edge detection algorithms for localization of the specimen borders in three dimensions by separating the gray values of the specimen intensity from those of the background. The primary advantages of NIQPM and HTDIC lie in their technological accessibility using "off-the-shelf" microscopes. There are two basic limitations of these methods: slow z-stack acquisition time on commercial scopes currently precludes the investigation of phenomena faster than 1 frame/minute, and secondly, diffraction effects restrict the utility of NIQPM and HTDIC to objects from 0.2 up to 10 (NIQPM) and 20 (HTDIC) μm in diameter, respectively. Hence, the specimen and its associated time dynamics of interest must meet certain size and temporal constraints to enable the use of these methods. Excitingly, most fixed cellular specimens are readily investigated with

  15. Measuring the global information society - explaining digital inequality by economic level and education standard

    Science.gov (United States)

    Ünver, H.

    2017-02-01

    A main focus of this research paper is to explain the 'digital inequality' or 'digital divide' in terms of the economic level and education standard of about 150 countries worldwide. Inequality regarding GDP per capita, literacy and the so-called UN Education Index seems to be an important factor affecting ICT usage, in particular Internet penetration, mobile phone usage and also mobile Internet services. Empirical methods and (multivariate) regression analysis with linear and non-linear functions are useful methods for measuring some crucial factors in a country or culture becoming an information- and knowledge-based society. Overall, the study concludes that convergence regarding ICT usage proceeds worldwide faster than convergence in terms of economic wealth and education in general. The results, based on a large data analysis, show that the digital divide declined over more than a decade between 2000 and 2013, since more people worldwide use mobile phones and the Internet. But a high digital inequality, explained to a significant extent by the functional relation between technology penetration rates, education level and average income, still exists. Furthermore, the study supports the actions of countries at UN/G20/OECD level to provide ICT access to all people for a more balanced world in the context of sustainable development, postulating that policymakers need to promote comprehensive education worldwide by means of ICT.

  16. New telemetry device for the measurement of gastrointestinal motility in rats and comparison with standard equipment.

    Science.gov (United States)

    Meile, Tobias; Zieker, Derek; Königsrainer, Alfred; Glatzle, Jörg

    2015-04-01

    To perform stress-free recording of gastrointestinal motility in rats with strain gauge transducers, telemetry equipment had to be developed. We developed, programmed, and tested a new telemetry device that records gastrointestinal motility in freely moving rats using strain gauge transducers. The device can collect and transmit data in freely moving rats. Data are received and stored for later analysis with a regular PC. Linear calibration curves were obtained for the strain gauge transducers used. We compared data obtained with the new telemetry device with data gathered with standard equipment and could not find any statistically significant difference. Wired gastric and colonic contraction frequencies were 4.6 ± 0.3 per minute and 1.5 ± 0.3 per minute, whereas telemetric contraction frequencies were 4.4 ± 0.1 per minute and 1.25 ± 0.1 per minute. The new telemetry device is a very useful tool for the measurement of gastrointestinal motility in rats.

  17. Accurate weak lensing of standard candles. II. Measuring σ8 with supernovae

    Science.gov (United States)

    Quartin, Miguel; Marra, Valerio; Amendola, Luca

    2014-01-01

    Soon the number of type Ia supernova (SN) measurements should exceed 100 000. Understanding the effect of weak lensing by matter structures on the supernova brightness will then be more important than ever. Although SN lensing is usually seen as a source of systematic noise, we will show that it can be in fact turned into signal. More precisely, the non-Gaussianity introduced by lensing in the SN Hubble diagram dispersion depends rather sensitively on the amplitude σ8 of the matter power spectrum. By exploiting this relation, we are able to predict constraints on σ8 of 7% (3%) for a catalog of 100 000 (500 000) SNe of average magnitude error 0.12, without having to assume that such intrinsic dispersion and its redshift evolution are known a priori. The intrinsic dispersion has been assumed to be Gaussian; possible intrinsic non-Gaussianities in the data set (due to the SN themselves and/or to other transients) could be potentially dealt with by means of additional nuisance parameters describing higher moments of the intrinsic dispersion distribution function. This method is independent of and complementary to the standard methods based on cosmic microwave background, cosmic shear, or cluster abundance observables.

  18. Standard Test Method for Measuring Fast-Neutron Reaction Rates by Radioactivation of Copper

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This test method covers procedures for measuring reaction rates by the activation reaction 63Cu(n,α)60Co. The cross section for 60Co produced in this reaction increases rapidly with neutrons having energies greater than about 5 MeV. 60Co decays with a half-life of 1925.27 days (±0.29 days)(1) and emits two gamma rays having energies of 1.1732278 and 1.332492 MeV (1). The isotopic content of natural copper is 69.17 % 63Cu and 30.83 % 65Cu (2). The neutron reaction, 63Cu(n,γ)64Cu, produces a radioactive product that emits gamma rays which might interfere with the counting of the 60Co gamma rays. 1.2 With suitable techniques, fission-neutron fluence rates above 109 cm−2·s−1 can be determined. The 63Cu(n,α)60Co reaction can be used to determine fast-neutron fluences for irradiation times up to about 15 years (for longer irradiations, see Practice E261). 1.3 Detailed procedures for other fast-neutron detectors are referenced in Practice E261. 1.4 This standard does not purport to address all of the...
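    The reaction-rate determination described above involves standard activation bookkeeping: the counted 60Co activity is decay-corrected back to the end of irradiation, then divided by the number of target atoms and a saturation factor that accounts for 60Co decaying during the irradiation itself. A minimal sketch, assuming constant flux; the function name, arguments, and sample values are illustrative assumptions, not prescriptions from the standard.

    ```python
    import math

    HALF_LIFE_CO60_D = 1925.27                       # 60Co half-life in days, as quoted in the method
    LAMBDA_PER_DAY = math.log(2) / HALF_LIFE_CO60_D  # decay constant of 60Co

    def reaction_rate_per_atom(activity_bq, n_cu63, t_irr_days, t_wait_days):
        """Reactions per 63Cu atom per second, assuming constant neutron flux.

        activity_bq -- 60Co activity measured after t_wait_days of decay (Bq)
        n_cu63      -- number of 63Cu atoms in the monitor foil
        """
        # Correct the counted activity back to the end of irradiation.
        a_end = activity_bq * math.exp(LAMBDA_PER_DAY * t_wait_days)
        # Saturation factor: fraction of the equilibrium activity reached,
        # since 60Co produced early in the irradiation partly decays away.
        saturation = 1.0 - math.exp(-LAMBDA_PER_DAY * t_irr_days)
        return a_end / (n_cu63 * saturation)
    ```

    For a very long irradiation the saturation factor approaches 1 and the rate reduces to activity divided by the number of target atoms.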

  19. FOVEA: a new program to standardize the measurement of foveal pit morphology

    Directory of Open Access Journals (Sweden)

    Bret A. Moore

    2016-04-01

    Full Text Available The fovea is one of the most studied retinal specializations in vertebrates, which consists of an invagination of the retinal tissue with high packing of cone photoreceptors, leading to high visual resolution. Between species, foveae differ morphologically in the depth and width of the foveal pit and the steepness of the foveal walls, which could influence visual perception. However, there is no standardized methodology to measure the contour of the foveal pit across species. We present here FOVEA, a program for the quantification of foveal parameters (width, depth, slope of foveal pit) using images from histological cross-sections or optical coherence tomography (OCT). FOVEA is based on a new algorithm to detect the inner retina contour based on the color variation of the image. We evaluated FOVEA by comparing the fovea morphology of two Passerine birds based on histological cross-sections and its performance with data from previously published OCT images. FOVEA detected differences between species and its output was not significantly different from previous estimates using OCT software. FOVEA can be used for comparative studies to better understand the evolution of the fovea morphology in vertebrates as well as for diagnostic purposes in veterinary pathology. FOVEA is freely available for academic use and can be downloaded at: http://estebanfj.bio.purdue.edu/fovea.

  20. Standard Test Method for Measuring Fast-Neutron Reaction Rates by Radioactivation of Iron

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

    Refer to Guide E 844 for guidance on the selection, irradiation, and quality control of neutron dosimeters. Refer to Practice E 261 for a general discussion of the determination of fast-neutron fluence rate with threshold detectors. Pure iron in the form of foil or wire is readily available and easily handled. Fig. 1 shows a plot of cross section as a function of neutron energy for the fast-neutron reaction 54Fe(n,p)54Mn (1). This figure is for illustrative purposes only to indicate the range of response of the 54Fe(n,p)54Mn reaction. Refer to Guide E 1018 for descriptions of recommended tabulated dosimetry cross sections. 54Mn has a half-life of 312.13 days (2) and emits a gamma ray with an energy of 834.845 keV (5). Interfering activities generated by neutron activation arising from thermal or fast neutron interactions are 2.57878 (46)-h 56Mn, 44.95-d (8) 59Fe, and 5.27...

  1. Standard Test Method for Measuring Neutron Fluence Rate by Radioactivation of Cobalt and Silver

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers a suitable means of obtaining the thermal neutron fluence rate, or fluence, in well moderated nuclear reactor environments where the use of cadmium, as a thermal neutron shield as described in Method E262, is undesirable because of potential spectrum perturbations or of temperatures above the melting point of cadmium. 1.2 This test method describes a means of measuring a Westcott neutron fluence rate (Note 1) by activation of cobalt- and silver-foil monitors (see Terminology E170). The reaction 59Co(n,γ)60Co results in a well-defined gamma emitter having a half-life of 1925.28 days (1). The reaction 109Ag(n,γ)110mAg results in a nuclide with a complex decay scheme which is well known and having a half-life of 249.76 days (1). Both cobalt and silver are available either in very pure form or alloyed with other metals such as aluminum. A reference source of cobalt in aluminum alloy to serve as a neutron fluence rate monitor wire standard is available from the National Institute ...

  2. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    International Nuclear Information System (INIS)

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the "propagation of errors" approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the "counting uncertainty" can also be ascertained.
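    For independent uncertainty components, the propagation-of-errors approach mentioned above reduces to combining relative standard uncertainties in quadrature. A minimal sketch of that arithmetic for a single radiochemical determination; the component list, names, and values are illustrative assumptions, not the laboratory's actual worksheet.

    ```python
    import math

    def combined_standard_uncertainty(value, components):
        """Combine independent relative standard uncertainties in quadrature.

        value      -- the measured result (e.g., an activity concentration)
        components -- dict of {component name: relative standard uncertainty}
        """
        rel_csu = math.sqrt(sum(u ** 2 for u in components.values()))
        return value * rel_csu  # CSU in the same units as the result

    # Hypothetical uncertainty budget for one determination.
    components = {
        "counting": 0.030,        # Poisson counting uncertainty
        "chemical_yield": 0.020,  # tracer/carrier recovery
        "efficiency": 0.015,      # detector calibration
        "aliquot_mass": 0.002,    # balance
    }
    csu = combined_standard_uncertainty(12.4, components)  # e.g., result of 12.4 Bq/kg
    ```

    Because the components add in quadrature, the largest term (here the counting uncertainty) dominates the CSU, which is exactly the sensitivity analysis the spreadsheet approach makes convenient.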

  3. A review to determine the appropriateness and adequacy of existing provisions for the measurement of ionizing radiation in Australia: proceedings of a workshop held at the National Standards Commission on 13 May 1985

    International Nuclear Information System (INIS)

    1985-01-01

    Current arrangements for the physical measurement of ionizing radiation in Australia are reviewed. Topics covered include the roles of the National Standards Commission, the National Association of Testing Authorities, the Australian Atomic Energy Commission and the Australian Radiation Laboratory. Legal units for ionizing radiation and Australia's measurement standards are discussed. The roles of the radiologist and the industrial radiographer are examined. Existing codes, standards and legislation at both Federal and State government levels are briefly outlined

  4. Income or living standard and health in Germany: different ways of measurement of relative poverty with regard to self-rated health.

    Science.gov (United States)

    Pfoertner, Timo-Kolja; Andress, Hans-Juergen; Janssen, Christian

    2011-08-01

    The current study introduces the living standard concept as an alternative approach to measuring poverty and compares its explanatory power to an income-based poverty measure with regard to the subjective health status of the German population. Analyses are based on the German Socio-Economic Panel (2001, 2003 and 2005) and refer to binary logistic regressions of poor subjective health status with regard to each poverty condition, its duration and its causal influence from a previous time point. To calculate the discriminant power of both poverty indicators, the indicators were initially considered separately in regression models and subsequently both were included simultaneously. The analyses reveal a stronger poverty-health relationship for the living standard indicator. An inadequate living standard in 2005, longer spells of an inadequate living standard between 2001, 2003 and 2005, as well as an inadequate living standard at a previous time point are significantly more strongly associated with poor subjective health than income poverty. Our results challenge conventional measurements of the relationship between poverty and health, which has probably been underestimated by income measures so far.

  5. The COnsensus-based standards for the selection of health measurement INstruments (COSMIN) and how to select an outcome measurement instrument

    NARCIS (Netherlands)

    Mokkink, Lidwine B.; Prinsen, Cecilia A.C.; Bouter, Lex M.; de Vet, Henrica C.W.; Terwee, Caroline B.

    2016-01-01

    Background: COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments both in research and in clinical practice by developing

  6. Standard Test Method for Measuring Heat Transfer Rate Using a Thin-Skin Calorimeter

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This test method covers the design and use of a thin metallic calorimeter for measuring heat transfer rate (also called heat flux). Thermocouples are attached to the unexposed surface of the calorimeter. A one-dimensional heat flow analysis is used for calculating the heat transfer rate from the temperature measurements. Applications include aerodynamic heating, laser and radiation power measurements, and fire safety testing. 1.2 Advantages 1.2.1 Simplicity of Construction: The calorimeter may be constructed from a number of materials. The size and shape can often be made to match the actual application. Thermocouples may be attached to the metal by spot, electron beam, or laser welding. 1.2.2 Heat transfer rate distributions may be obtained if metals with low thermal conductivity, such as some stainless steels, are used. 1.2.3 The calorimeters can be fabricated with smooth surfaces, without insulators or plugs and the attendant temperature discontinuities, to provide more realistic flow conditions for ...
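For a thin metallic skin, the one-dimensional analysis referenced above reduces to q = ρ·c·δ·(dT/dt): density, specific heat, and thickness of the skin times the rate of back-face temperature rise. A minimal sketch of that relation, with illustrative stainless-steel property values that are assumptions of this example rather than figures from the standard:

```python
# Thin-skin calorimeter: heat flux from the back-face temperature history.
#   q = rho * c * delta * dT/dt   (one-dimensional, lumped thin skin)
# Material values are illustrative (roughly stainless steel), not from ASTM.

def heat_flux(times_s, temps_K, rho=7900.0, c=500.0, delta=0.001):
    """Heat transfer rate (W/m^2) by finite differences between samples."""
    q = []
    for i in range(1, len(times_s)):
        dT_dt = (temps_K[i] - temps_K[i - 1]) / (times_s[i] - times_s[i - 1])
        q.append(rho * c * delta * dT_dt)
    return q

# A 50 K rise over 1 s on a 1 mm skin:
print(round(heat_flux([0.0, 1.0], [300.0, 350.0])[0]))  # 197500 (W/m^2)
```

In practice the thermocouple trace would be sampled much more densely; the finite-difference step is the same.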

  7. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Science.gov (United States)

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by the incompatibility, among studies and across sites, of the analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure the chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  8. Career Oriented Mathematics, Student's Manual. [Includes Owning an Automobile and Driving as a Career; Retail Sales; Measurement; and Area-Perimeter.

    Science.gov (United States)

    Mahaffey, Michael L.; McKillip, William D.

    This volume includes student manuals for four units in the Career Oriented Mathematics Program, which was developed to improve computational abilities and attitudes of secondary students by presenting the material in a job-relevant context. The units are titled: (1) Owning an Automobile and Driving as a Career, (2) Retail Sales, (3) Measurement,…

  9. Methodological Choices in the Content Analysis of Textbooks for Measuring Alignment with Standards

    Science.gov (United States)

    Polikoff, Morgan S.; Zhou, Nan; Campbell, Shauna E.

    2015-01-01

    With the recent adoption of the Common Core standards in many states, there is a need for quality information about textbook alignment to standards. While there are many existing content analysis procedures, these generally have little, if any, validity or reliability evidence. One exception is the Surveys of Enacted Curriculum (SEC), which has…

  10. What You Measure Is What You Get. The Effects of Accounting Standards Effects Studies

    NARCIS (Netherlands)

    Koenigsgruber, R.; Gross, C.

    2012-01-01

    The UK's Accounting Standards Board and the European Financial Reporting Advisory Group have published a discussion paper entitled 'Considering the Effects of Accounting Standards'. While the effort to think through potential consequences of proposed regulatory acts in advance is welcome, we argue

  11. International cooperative effort to establish ASTM [American Society for Testing and Materials] standards for the measurement of radiation dose for food processing

    International Nuclear Information System (INIS)

    Farrar, H. IV.

    1987-01-01

    A task group has been formed within the American Society for Testing and Materials (ASTM) specifically to develop standards for measuring radiation dose for food processing. The task group, which has 78 members, including 16 from Europe, consists of a broad cross section of food industry, government, regulatory, manufacturing, and university interests. The group is working on seven standards; three specifically for food irradiation applications, and four for using specific dosimeter types for all radiation applications, including food processing. Together, this set of standards will specify acceptable methods of accomplishing the required irradiation treatment of food and other products, and will be available for adoption by regulatory agencies in food irradiation protocols. 1 tab

  12. Development of traceable measurement of the diffuse optical properties of solid reference standards for biomedical optics at National Institute of Standards and Technology.

    Science.gov (United States)

    Lemaillet, Paul; Bouchard, Jean-Pierre; Allen, David W

    2015-07-01

    The development of a national reference instrument dedicated to the measurement of the scattering and absorption properties of solid tissue-mimicking phantoms used as reference standards is presented. The optical properties of the phantoms are measured with a double-integrating sphere setup in the steady-state domain, coupled with an inversion routine of the adding-doubling procedure that allows for the computation of the uncertainty budget for the measurements. The results are compared to the phantom manufacturer's values obtained by a time-resolved approach. The results suggest that the agreement between these two independent methods is within the estimated uncertainties. This new reference instrument will provide optical biomedical research laboratories with reference values for absolute diffuse optical properties of phantom materials.

  13. Translation of questionnaires measuring health related quality of life is not standardized

    DEFF Research Database (Denmark)

    Danielsen, Anne Kjaergaard; Pommergaard, Hans-Christian; Burcharth, Jakob

    2015-01-01

    INTRODUCTION: There is growing awareness of the need to explore patient reported outcomes in clinical trials. In the Scandinavian Surgical Outcomes Research Group we are conducting several clinical trials in cooperation between Danish and Swedish surgical researchers, and we use questionnaires aimed at patients from both countries. In relation to this and similar international cooperation, the validity and reliability of translated questionnaires are central aspects. MAIN OBJECTIVES: The purpose of this study was to explore which methodological measures were used in studies reporting the methodological process when translating questionnaires on health related quality of life for different diseases. RESULTS: We retrieved 187 studies and out of these we included 52 studies. The psychometric properties of the translated versions were validated using different tests. The focus...

  14. Development of a Standard Methodology for the Quantitative Measurement of Steel Phase Transformation Kinetics and Dilation Strains Using Dilatometric Methods, QMST (TRP 0015)

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Manish Metha; Dr. Tom Oakwood

    2004-04-28

    The purpose of this collaborative project was to develop a standard practice for obtaining and archiving quantitative steel transformation kinetic data and thermal strain data. Two families of dilatometric equipment were employed to develop this standard practice for testing bar product steels. These include high-speed quenching and deformation dilatometers and Gleeble® thermomechanical simulation instruments. Standard measurement, data interpretation and data reporting methods were developed and defined by the cross-industry QMST Consortium members consisting of steel manufacturers, forgers, heat-treaters, modelers, automotive and heavy vehicle OEMs, along with expert technologists from the National Labs and academia. The team designed phase transformation experiments on two selected steel grades to validate the standard practices--a medium carbon grade SAE 1050 and an alloy steel SAE 8620. A final standard practice document was developed based on the two dilatometry methods, and was submitted to and approved by ASTM (available as A1033-04). The standard practice specifies a method for measuring austenite transformation under no elastic stress or plastic deformation. These methods will be an enabler for the development and electronic archiving of a quantitative database for process modeling using computer simulation software, and will greatly assist end-users in developing accurate process and product simulations during the thermo-mechanical processing of bar and rod product steels.

  15. Functional living skills assessment: a standardized measure of high-order activities of daily living in patients with dementia.

    Science.gov (United States)

    Farina, E; Fioravanti, R; Pignatti, R; Alberoni, M; Mantovani, F; Manzoni, G; Chiavari, L; Imbornone, E; Villanelli, F; Nemni, R

    2010-03-01

    Performance measures are tools aimed at directly evaluating social function in older adults. The authors present the standardization of a new direct performance measure for patients with dementia, the functional living skills assessment (FLSA). FLSA was conceived to detect functional impairment in very mild to moderate patients and to pick up functional modification due to intervention. The patient is asked to perform an activity, and the performance is scored according to completeness and level of assistance required. Eight areas of interest are evaluated (Resources, Consumer Skills, Public Transportation, Time Management, Money Management, Leisure, Telephone Skills, Self-Care and Health). Subjects included 54 patients with dementia and 36 normal controls. Total and partial FLSA scores differed significantly between the two groups. Correction scores for education were calculated, while the influence of age was only marginally significant. Mini Mental State Examination (MMSE) and CDR scores strongly influenced the FLSA score. FLSA is useful where sensitivity to different levels of functional impairment is needed, such as evaluation of treatment efficacy (both non-pharmacological and pharmacological), identification of relatively intact functional areas to plan cognitive rehabilitation, and confirmation of dementia in the initial phase when there are doubts about functional decline.

  16. Towards Uniform Accelerometry Analysis: A Standardization Methodology to Minimize Measurement Bias Due to Systematic Accelerometer Wear-Time Variation

    Directory of Open Access Journals (Sweden)

    Tarun R. Katapally, Nazeem Muhajarine

    2014-06-01

    Accelerometers are predominantly used to objectively measure the entire range of activity intensities - sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data being commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data, there could be systematic wear-time variation. To explore this variation, accelerometer data of the Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed, where case-specific observed wear-time is controlled to an analyst-specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and
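One way to read the interpolation step described above is as a linear rescaling of each participant-day's accumulated minutes to the analyst-specified wear-time; the sketch below makes that assumption explicit (the study's actual interpolation may be more sophisticated), with invented example minutes:

```python
# Rescale per-day activity minutes to a controlled wear-time (assumed
# linear interpolation; all example data invented).

def standardize(minutes, observed_weartime_h, controlled_weartime_h=10.0):
    """Return SED/LPA/MVPA minutes scaled to the controlled wear-time."""
    factor = controlled_weartime_h / observed_weartime_h
    return {k: round(v * factor, 3) for k, v in minutes.items()}

day = {"SED": 480.0, "LPA": 180.0, "MVPA": 60.0}  # recorded over 12 h wear
print(standardize(day, observed_weartime_h=12.0))
# {'SED': 400.0, 'LPA': 150.0, 'MVPA': 50.0}
```

Scaling every participant-day to the same wear-time is what makes weekday/weekend comparisons of SED, LPA and MVPA commensurable.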

  17. Standards and measurements for assessing bone health-workshop report co-sponsored by the International Society for Clinical Densitometry (ISCD) and the National Institute of Standards and Technology (NIST).

    Science.gov (United States)

    Bennett, Herbert S; Dienstfrey, Andrew; Hudson, Lawrence T; Oreskovic, Tammy; Fuerst, Thomas; Shepherd, John

    2006-01-01

    This article reports and discusses the results of the recent ISCD-NIST Workshop on Standards and Measurements for Assessing Bone Health. The purpose of the workshop was to assess the status of efforts to standardize and compare results from dual-energy X-ray absorptiometry (DXA) scans, and then to identify and prioritize ongoing measurement and standards needs.

  18. Protocol of the COSMIN study: COnsensus-based Standards for the selection of health Measurement INstruments

    NARCIS (Netherlands)

    Mokkink, L.B.; Terwee, C.B.; Knol, D.L.; Stratford, P.W.; Alonso, J.; Patrick, D.L.; Bouter, L.M.; de Vet, H.C.W.

    2006-01-01

    Background: Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the

  19. Standard-Model Tests with Superallowed β-Decay: An Important Application of Very Precise Mass Measurements

    International Nuclear Information System (INIS)

    Hardy, J. C.; Towner, I. S.

    2001-01-01

    Superallowed β-decay provides a sensitive means for probing the limitations of the Electroweak Standard Model. To date, the strengths (ft-values) of superallowed 0+ → 0+ β-decay transitions have been determined with high precision from nine different short-lived nuclei, ranging from ^10C to ^54Co. Each result leads to an independent measure of the vector coupling constant G_V, and collectively the nine values can be used to test the conservation of the weak vector current (CVC). Within current uncertainties, the results support CVC to better than a few parts in 10,000 - a clear success for the Standard Model! However, when the average value of G_V, as determined in this way, is combined with data from decays of the muon and kaon to test another prediction of the Standard Model, the result is much more provocative. A test of the unitarity of the Cabibbo-Kobayashi-Maskawa matrix fails by more than two standard deviations. This result can be made more definitive by experiments that require extremely precise mass measurements, in some cases on very short-lived (≤100 ms) nuclei. This talk presents the current status and future prospects for these Standard-Model tests, emphasizing the role of precise mass, or mass-difference, measurements. There remains a real challenge to mass-measurement technique, with the opportunity for significant new results
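The unitarity test in question is the CKM top-row sum |V_ud|² + |V_us|² + |V_ub|² = 1. With element magnitudes roughly typical of the era (illustrative values, not the talk's numbers), the arithmetic shows the kind of shortfall described:

```python
# CKM top-row unitarity check with illustrative element magnitudes
# (invented for this example, not taken from the talk).
Vud, Vus, Vub = 0.9740, 0.2196, 0.0036
row_sum = Vud**2 + Vus**2 + Vub**2
print(round(row_sum, 4))  # 0.9969
```

A sum a few parts in a thousand below unity, compared against the quoted uncertainties, is what produces the "more than two standard deviations" tension.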

  20. Influence of measurement uncertainty on classification of thermal environment in buildings according to European Standard EN 15251

    DEFF Research Database (Denmark)

    Kolarik, Jakub; Olesen, Bjarne W.

    2015-01-01

    European Standard EN 15251 in its current version does not provide any guidance on how to handle the uncertainty of long-term measurements of indoor environmental parameters used for classification of buildings. The objective of the study was to analyse the uncertainty of field measurements of operative temperature at two measuring points (south/south-west and north/north-east orientation). Results of the present study suggest that measurement uncertainty needs to be considered during assessment of the thermal environment in existing buildings. When expanded standard uncertainty was taken into account in the categorization of the thermal environment according to EN 15251, the differences in the prevalence of exceeded category limits were up to 17.3%, 8.3% and 2% of occupied hours for categories I, II and III, respectively.
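Categorization under such a standard amounts to counting occupied hours in which the measured operative temperature falls outside a category's comfort band; one conservative way to fold in expanded uncertainty U is to shrink the band by U on each side. A sketch with hypothetical limits and readings (EN 15251's actual limits depend on building type and season):

```python
# Percent of occupied hours outside a comfort band, before and after
# shrinking the band by the expanded uncertainty U (hypothetical limits).

def pct_exceeded(temps, low, high, U=0.0):
    """Worst-case share of hours outside [low + U, high - U], in percent."""
    out = sum(1 for t in temps if t < low + U or t > high - U)
    return 100.0 * out / len(temps)

hourly = [21.8, 22.4, 23.1, 24.9, 25.3, 22.0, 23.6, 25.1]  # invented readings
print(pct_exceeded(hourly, 22.0, 25.0))         # 37.5
print(pct_exceeded(hourly, 22.0, 25.0, U=0.5))  # 75.0
```

Even a half-degree uncertainty can move many borderline hours across the category limit, which is the effect the study quantifies.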

  1. Statistical Analysis of a Round-Robin Measurement Survey of Two Candidate Materials for a Seebeck Coefficient Standard Reference Material

    Directory of Open Access Journals (Sweden)

    Lu, Z. Q. J.

    2009-01-01

    In an effort to develop a Standard Reference Material (SRM™) for Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials: undoped Bi2Te3 and Constantan (55 % Cu, 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research, using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses of the interlaboratory measurement results and the statistical methodology for analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations.
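A round-robin summary of this kind typically reports a consensus value and the between-laboratory spread; a minimal sketch with invented per-laboratory Seebeck coefficients (not the survey's data):

```python
# Consensus value and between-lab spread for a round-robin survey
# (invented Seebeck coefficients in uV/K, one result per laboratory).
import statistics

lab_results = [-41.2, -40.8, -41.5, -40.9, -41.1]
consensus = statistics.mean(lab_results)
spread = statistics.stdev(lab_results)  # between-laboratory sample std. dev.
print(round(consensus, 2), round(spread, 2))  # -41.1 0.27
```

The paper's actual analysis is richer (irregularly sampled curves over temperature), but the consensus-and-spread summary is the core of any interlaboratory comparison.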

  2. Calculated and measured stresses in simple panels subject to intense random acoustic loading including the near noise field of a turbojet engine

    Science.gov (United States)

    Lassiter, Leslie W; Hess, Robert W

    1958-01-01

    Flat 2024-T3 aluminum panels measuring 11 inches by 13 inches were tested in the near noise fields of a 4-inch air jet and a turbojet engine. The stresses which were developed in the panels are compared with those calculated by generalized harmonic analysis. The calculated and measured stresses were found to be in good agreement. In order to make the stress calculations, supplementary data relating to the transfer characteristics, damping, and static response of flat and curved panels under periodic loading are necessary and were determined experimentally. In addition, an appendix containing detailed data on the near pressure field of the turbojet engine is included.

  3. LHCb: Measurements of the relative branching fractions of the decay channel $B^{\\pm}\\to p \\bar{p} K^{\\pm}$ including charmonium contributions at LHCb

    CERN Multimedia

    Cardinale, Roberta

    2011-01-01

    The study of the $B^{\\pm}\\to p \\bar{p} K^{\\pm}$ decay channel at LHCb is of great interest since it gives the possibility to study different aspects of the Standard Model and possibly Beyond Standard Model physics. A measurement of the direct CP asymmetry can be performed. Moreover intermediate states such as charmonium and "charmonium-like" resonances in the $p \\bar{p}$ final state can be observed and studied along with their characteristics. A multivariate selection has been implemented to select the interesting events using kinematic and topological variables and the particle identification information using the Ring Imaging Cherenkov detectors. The selection has a high signal efficiency and high background rejection capability. The ratios of the branching fractions of the $B^{\\pm}\\to p \\bar{p} K^{\\pm}$ decay channel, of the charmless component with $M_{p \\bar{p}} < 2.85 \\,{\\rm GeV/}c^{2}$ and of the charmonium contribution $\\eta_{c}$, ${\\mathcal B} (B^{\\pm}\\to \\eta_{c} K^{\\pm})\\times {\\mathcal B} (\\eta...

  4. Update to nuclear data standards for nuclear measurements. Summary report of a consultants' meeting

    International Nuclear Information System (INIS)

    Wienke, H.; Chiba, S.; Hambsch, F.J.; Olsson, N.; Smirnov, A.N.

    1997-05-01

    This document is an update of the 1991 NEANDC/INDC Nuclear Standards File (INDC(SEC)-101), indicating the status and extension of cross-section standards for the 10B(n,α) reaction in the energy region below 20 MeV, and the H(n,n), 235U(n,f), 238U(n,f), and 209Bi(n,f) reactions for selected energy regions above 20 MeV. Refs, figs, tabs

  5. Comparison of Antenna Measurement Facilities with the DTU-ESA 12 GHz Validation Standard Antenna within the EU Antenna Centre of Excellence

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Pallesen, Janus Engberg; Breinbjerg, Olav

    2009-01-01

    The primary objective of many antenna measurement facilities is to provide a specified high accuracy of the measured data. The validation of an antenna measurement facility is the process of proving that such a specified accuracy can be achieved. Since this constitutes a very challenging task, the DTU-ESA 12 GHz Validation Standard Antenna is presented and its three different coordinate systems are defined. The primary objective of the comparison is to obtain experience and results that can serve to develop standards for validation of antenna measurement facilities. The paper focuses on the comparison of the radiation pattern and presents not only the measurement data obtained at the facilities, but also investigates several procedures for comparing these data. This includes various definitions of pattern difference and statistical measures, as well as a reference pattern. The comparison took place in 2004-2005 as part of the European Union...

  6. External and internal standards in the single-isotope derivative (radioenzymatic) measurement of plasma norepinephrine and epinephrine

    International Nuclear Information System (INIS)

    Shah, S.D.; Clutter, W.E.; Cryer, P.E.

    1985-01-01

    In plasma from normal humans (n = 9, 35 samples) and from patients with diabetes mellitus (n = 12, 24 samples) single-isotope derivative (radioenzymatic) plasma norepinephrine and epinephrine concentrations calculated from external standard curves constructed in a normal plasma pool were identical to those calculated from internal standards added to an aliquot of each plasma sample. In plasma from patients with end-stage renal failure receiving long-term dialysis (n = 34, 109 samples), competitive catechol-O-methyltransferase (COMT) inhibitory activity resulted in a systematic error when external standards in a normal plasma pool were used, as reported previously; values so calculated averaged 21% (+/- 12%, SD) lower than those calculated from internal standards. However, when external standard curves were constructed in plasma from a given patient with renal failure and used to calculate that patient's values, or in a renal failure plasma pool and used to calculate all renal failure values, norepinephrine and epinephrine concentrations were not significantly different from those calculated from internal standards. We conclude: (1) External standard curves constructed in plasma from a given patient with renal failure can be used to measure norepinephrine and epinephrine in plasma from that patient; further, external standards in a renal failure plasma pool can be used for assays in patients with end-stage renal failure receiving long-term dialysis. (2) Major COMT inhibitory activity is not present commonly if samples from patients with renal failure are excluded. Thus, it would appear that external standard curves constructed in normal plasma can be used to measure norepinephrine and epinephrine precisely in samples from persons who do not have renal failure

  7. Validity of anthropometric measurements to assess body composition, including muscle mass, in 3-year-old children from the SKOT cohort

    DEFF Research Database (Denmark)

    Jensen, Signe Marie; Mølgaard, Christian; Ejlerskov, Katrine Tschentscher

    2015-01-01

    Nutritional status of children is commonly assessed by anthropometry in both under- and overnutrition. The link between anthropometry and body fat, the body compartment most affected by overnutrition, is well known, but the link with muscle mass, the body compartment most depleted in undernutrition...... to estimate muscle mass. Overall, anthropometric measures were more effective at measuring the absolute sizes of fat, lean and muscle mass than their relative sizes. The proportion of the variance explained by anthropometry was 79% for lean mass, 76% for fat mass and 74% for muscle mass. For fat mass and lean mass......, which included only mid-upper arm circumference and subscapular skinfold. The power of height in the weight-to-height ratio to determine fat mass proportion was 1.71, with a 95% confidence interval (0.83-2.60) including the value of 2 used in body mass index (BMI). Limitations of anthropometry to assess...
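The quoted power of height generalizes BMI's weight/height² to weight/height^p, with p = 1.71 fitted in this cohort. A sketch comparing the two exponents on a hypothetical measurement (illustrative numbers, not SKOT data):

```python
# Weight-to-height index with a general exponent p: BMI uses p = 2,
# while the cohort fit gave p = 1.71 (measurements below are invented).

def wh_index(weight_kg, height_m, p=2.0):
    return weight_kg / height_m ** p

w, h = 15.0, 0.98  # a hypothetical 3-year-old
print(round(wh_index(w, h), 2))          # 15.62  (BMI, kg/m^2)
print(round(wh_index(w, h, p=1.71), 2))  # 15.53  (fitted exponent)
```

Because the confidence interval for p includes 2, the data do not rule out BMI's conventional exponent.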

  8. Field Measurements of Trace Gases and Aerosols Emitted by Undersampled Combustion Sources Including Wood and Dung Cooking Fires, Garbage and Crop Residue Burning, and Indonesian Peat Fires

    Science.gov (United States)

    Stockwell, C.; Jayarathne, T. S.; Goetz, D.; Simpson, I. J.; Selimovic, V.; Bhave, P.; Blake, D. R.; Cochrane, M. A.; Ryan, K. C.; Putra, E. I.; Saharjo, B.; Stone, E. A.; DeCarlo, P. F.; Yokelson, R. J.

    2017-12-01

    Field measurements were conducted in Nepal and in the Indonesian province of Central Kalimantan to improve characterization of trace gases and aerosols emitted by undersampled combustion sources. The sources targeted included cooking with a variety of stoves, garbage burning, crop residue burning, and authentic peat fires. Trace gas and aerosol emissions were studied using a land-based Fourier transform infrared spectrometer, whole air sampling, photoacoustic extinctiometers (405 and 870 nm), and filter samples that were analyzed off-line. These measurements were used to calculate fuel-based emission factors (EFs) for up to 90 gases, PM2.5, and PM2.5 constituents. The aerosol optical data measured included EFs for the scattering and absorption coefficients, the single scattering albedo (at 870 and 405 nm), as well as the absorption Ångström exponent. The emissions varied significantly by source, although light absorption by both brown and black carbon (BrC and BC, respectively) was important for all non-peat sources. For authentic peat combustion, the emissions of BC were negligible and absorption was dominated by organic aerosol. The field results from peat burning were in reasonable agreement with recent lab measurements of smoldering Kalimantan peat and compare well to the limited data available from other field studies. The EFs can be used with estimates of fuel consumption to improve regional emissions inventories and assessments of the climate and health impacts of these undersampled sources.
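Fuel-based emission factors of this kind are commonly derived by carbon mass balance: the grams of species x emitted per kilogram of dry fuel follow from its excess mixing ratio relative to all emitted carbon and the fuel carbon fraction. The sketch below implements that textbook relation; the function name and every number are assumptions of the example, not values from the study:

```python
# Carbon-mass-balance emission factor, EF_x in g per kg dry fuel:
#   EF_x = F_C * 1000 * (MW_x / 12.011) * dX / sum_y(NC_y * dY)
# F_C: fuel carbon fraction; dX, dY: excess mixing ratios; NC_y: carbon
# atoms per molecule. All numbers below are invented for illustration.

def emission_factor(mw_x, dx, carbon_species, f_c=0.5):
    """carbon_species: list of (carbon_atoms, excess_mixing_ratio) pairs."""
    total_c = sum(n * d for n, d in carbon_species)
    return f_c * 1000.0 * (mw_x / 12.011) * dx / total_c

# Emitted carbon dominated by CO2, CO and CH4 (excess ppb):
carbon = [(1, 90000.0), (1, 9000.0), (1, 900.0)]
ef_co = emission_factor(28.01, 9000.0, carbon)
print(round(ef_co))  # 105 (g CO per kg fuel)
```

Multiplying such EFs by estimated fuel consumption is the step the abstract describes for building regional emissions inventories.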

  9. Comparison of the Calibration Standards of Three Commercially Available Multiplex Kits for Human Cytokine Measurement to WHO Standards Reveals Striking Differences

    Directory of Open Access Journals (Sweden)

    Ivan M. Roitt

    2008-01-01

    Serum parameters as indicators of the efficacy of therapeutic drugs are currently the focus of intensive research. The induction of certain cytokines (or cytokine patterns) is known to be related to the status of the immune response, e.g. in regulating the TH1/TH2 balance. Regarding their potential value as surrogate parameters in clinical trials, and subsequently for the assignment of treatment efficacy, the accurate and reliable determination of cytokines in patient serum is mandatory. Because serum samples are precious and limited, test methods, like the xMAP multiplex technology, that allow for the simultaneous determination of a variety of cytokines from only a small sample aliquot can offer great advantages. We have compared multiplex kits from three different manufacturers and found striking differences upon standardization using WHO standards for selected cytokines. We therefore extended our xMAP multiplex investigations to an ex-vivo situation by testing serum samples and found that the cytokine amounts measured were critically influenced by the actual kit used. The presented data indicate that statements regarding the quantitative determination of cytokines, and therefore their use as biomarkers, in serum samples have to be interpreted with caution.

  10. Tests of the electroweak standard model and measurement of the weak mixing angle with the ATLAS detector

    International Nuclear Information System (INIS)

    Goebel, M.

    2011-09-01

    In this thesis the global Standard Model (SM) fit to the electroweak precision observables is revisited with respect to the newest experimental results. Various consistency checks are performed, showing no significant deviation from the SM. The Higgs boson mass is estimated by the electroweak fit to be M_H = 94 (+30/-24) GeV without any information from direct Higgs searches at LEP, Tevatron, and the LHC, and the result is M_H = 125 (+8/-10) GeV when including the direct Higgs mass constraints. The strong coupling constant is extracted at fourth perturbative order as α_s(M_Z²) = 0.1194 ± 0.0028 (exp) ± 0.0001 (theo). From the fit including the direct Higgs constraints, the effective weak mixing angle is determined indirectly to be sin²θ^l_eff = 0.23147 (+0.00012/-0.00010). For the W mass the value M_W = 80.360 (+0.012/-0.011) GeV is obtained indirectly from the fit including the direct Higgs constraints. The electroweak precision data is also exploited to constrain new-physics models by using the concept of oblique parameters. In this thesis the following models are investigated: models with a sequential fourth fermion generation, the inert-Higgs doublet model, the littlest Higgs model with T-parity conservation, and models with large extra dimensions. In contrast to the SM, in these models heavy Higgs bosons are in agreement with the electroweak precision data. The forward-backward asymmetry as a function of the invariant mass is measured for pp → Z/γ* → e+e- events collected with the ATLAS detector at the LHC. The data taken in 2010 at a center-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 37.4 pb⁻¹, is analyzed. The measured forward-backward asymmetry is in agreement with the SM expectation. From the measured forward-backward asymmetry the effective weak mixing angle is extracted as sin²θ^l_eff = 0.2204 ± 0.0071 (stat) (+0.0039/-0.0044) (syst). The impact of unparticles and large extra dimensions on the forward-backward asymmetry at large

  11. Tests of the electroweak standard model and measurement of the weak mixing angle with the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Goebel, M.

    2011-09-15

    In this thesis the global Standard Model (SM) fit to the electroweak precision observables is revisited with respect to the newest experimental results. Various consistency checks are performed, showing no significant deviation from the SM. The Higgs boson mass is estimated by the electroweak fit to be M_H = 94 (+30/-24) GeV without any information from direct Higgs searches at LEP, Tevatron, and the LHC, and the result is M_H = 125 (+8/-10) GeV when including the direct Higgs mass constraints. The strong coupling constant is extracted at fourth perturbative order as α_s(M_Z²) = 0.1194 ± 0.0028 (exp) ± 0.0001 (theo). From the fit including the direct Higgs constraints, the effective weak mixing angle is determined indirectly to be sin²θ^l_eff = 0.23147 (+0.00012/-0.00010). For the W mass the value M_W = 80.360 (+0.012/-0.011) GeV is obtained indirectly from the fit including the direct Higgs constraints. The electroweak precision data is also exploited to constrain new-physics models by using the concept of oblique parameters. In this thesis the following models are investigated: models with a sequential fourth fermion generation, the inert-Higgs doublet model, the littlest Higgs model with T-parity conservation, and models with large extra dimensions. In contrast to the SM, in these models heavy Higgs bosons are in agreement with the electroweak precision data. The forward-backward asymmetry as a function of the invariant mass is measured for pp → Z/γ* → e+e- events collected with the ATLAS detector at the LHC. The data taken in 2010 at a center-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 37.4 pb⁻¹, is analyzed. The measured forward-backward asymmetry is in agreement with the SM expectation. From the measured forward-backward asymmetry the effective weak mixing angle is extracted as sin²θ^l

  12. Uganda; Financial System Stability Assessment, including Reports on the Observance of Standards and Codes on the following topics: Monetary and Financial Policy Transparency, Banking Supervision, Securities Regulation, and Payment Systems

    OpenAIRE

    International Monetary Fund

    2003-01-01

    This paper presents findings of Uganda’s Financial System Stability Assessment, including Reports on the Observance of Standards and Codes on Monetary and Financial Policy Transparency, Banking Supervision, Securities Regulation, Insurance Regulation, Corporate Governance, and Payment Systems. The banking system in Uganda, which dominates the financial system, is fundamentally sound, more resilient than in the past, and currently poses no threat to macroeconomic stability. A major disruption ...

  13. Standard Practice for Internal Temperature Measurements in Low-Conductivity Materials

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers methods for instrumenting low-conductivity specimens for testing in an environment subject to rapid thermal changes such as produced by rocket motors, atmospheric re-entry, electric-arc plasma heaters, and so forth. Specifically, practices for bare-wire thermocouple instrumentation applicable to sheath-type thermocouples are discussed. 1.2 The values stated in inch-pound units are to be regarded as the standard. The metric equivalents of inch-pound units may be approximate. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  14. New standards for devices used for the measurement of human body temperature.

    Science.gov (United States)

    Ring, E F J; McEvoy, H; Jung, A; Zuber, J; Machin, G

    2010-05-01

    Significant changes in the recording of human body temperature have been taking place worldwide in recent years. The clinical thermometer introduced in the mid-19th century by Wunderlich has been replaced by digital thermometers or radiometer devices for recording tympanic membrane temperature. More recently, the use of infrared thermal imaging for fever screening has become more widespread following the SARS outbreak, and particularly during the H1N1 pandemic. Important new standards that have now reached international acceptance will affect clinical and fever screening applications. This paper draws attention to these new standard documents. They are designed to improve the standardization of both performance and practical use of these key techniques in clinical medicine, especially necessary in a pandemic influenza situation.

  15. Design and construction of a cryogenic facility providing absolute measurements of radon 222 activity for developing a primary standard

    International Nuclear Information System (INIS)

    Picolo, Jean-Louis

    1995-06-01

    Radon 222 metrology is required to obtain higher accuracy in assessing human health risks from exposure to natural radiation. This paper describes the development of a cryogenic facility that allows absolute measurements of radon 222 in order to obtain a primary standard. The method selected is the condensation of a radon 222 sample on a geometrically defined cold surface with a constant, well known and adjustable temperature, facing an alpha-particle detector. Counting of the alpha particles reaching the detector, together with the precisely known detection geometry, provides an absolute measurement of the source activity. After describing the cryogenic facility, the measurement accuracy and precision are discussed and a comparison is made with other measurement systems. The relative uncertainty is below 1 % (1 σ). The facility can also be used to improve our knowledge of the nuclear properties of radon 222 and to produce secondary standards. (author)

  16. An automated LS(β)-NaI(Tl)(γ) coincidence system as absolute standard for radioactivity measurements.

    Science.gov (United States)

    Joseph, Leena; Das, A P; Ravindra, Anuradha; Kulkarni, D B; Kulkarni, M S

    2018-03-06

    The 4πβ-γ coincidence method is a powerful and widely used method to determine the absolute activity concentration of radioactive solutions. A new automated liquid-scintillator-based coincidence system has been designed, developed, tested, and established as an absolute standard for radioactivity measurements. The automation is achieved using a PLC (programmable logic controller) and SCADA (supervisory control and data acquisition). A radioactive solution of ⁶⁰Co was standardized to compare the performance of the automated system with the proportional-counter-based absolute standard maintained in the laboratory. The activity concentrations determined using these two systems were in very good agreement; the new automated system can be used for absolute measurement of the activity concentration of radioactive solutions.
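    The appeal of the coincidence technique for a simple β-γ cascade is that the detection efficiencies cancel out of the activity estimate. A minimal sketch of that textbook relation (not code from the paper; the function name and rates are illustrative):

```python
def coincidence_activity(rate_beta, rate_gamma, rate_coinc):
    """Ideal 4πβ-γ coincidence estimate of source activity (Bq).

    For a simple beta-gamma cascade with independent detection
    efficiencies eps_b and eps_g, the observed rates are
        N_beta = N0*eps_b, N_gamma = N0*eps_g, N_c = N0*eps_b*eps_g,
    so the efficiencies cancel: N0 = N_beta * N_gamma / N_c.
    Rates are assumed background- and dead-time-corrected.
    """
    if rate_coinc <= 0:
        raise ValueError("coincidence rate must be positive")
    return rate_beta * rate_gamma / rate_coinc

# Illustrative rates (cps): beta 9000, gamma 8000, coincidences 7200
activity = coincidence_activity(9000.0, 8000.0, 7200.0)  # 10000 Bq
```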

  17. $\beta$-asymmetry measurements in nuclear $\beta$-decay as a probe for non-standard model physics

    CERN Multimedia

    Roccia, S

    2002-01-01

    We propose to perform a series of measurements of the $\beta$-asymmetry parameter in the decay of selected nuclei, in order to investigate the presence of possible time-reversal-invariant tensor contributions to the weak interaction. The measurements have the potential to improve by a factor of about four on the present limits for such non-standard-model contributions in nuclear $\beta$-decay.

  18. Commutator of the quark mass matrices in the standard electroweak model and a measure of maximal CP-violation

    International Nuclear Information System (INIS)

    Jarlskog, C.

    1985-06-01

    The structure of the quark mass matrices in the Standard Electroweak Model is investigated. The commutator of the quark mass matrices is found to provide a convention-independent measure of CP-violation. The question of maximal CP-violation is discussed. The present experimental data indicate that CP is nowhere maximally violated. (author)

  19. Accurate quantitation standards of glutathione via traceable sulfur measurement by inductively coupled plasma optical emission spectrometry and ion chromatography

    Science.gov (United States)

    Rastogi, L.; Dash, K.; Arunachalam, J.

    2013-01-01

    The quantitative analysis of glutathione (GSH) is important in different fields like medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometrically existing sulfur content in purified GSH offers an approach for its quantitation; calibration through an appropriately characterized reference material (CRM) for sulfur then provides a methodology for the certification of GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the “high performance” methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U), expressed at the 95% confidence level, varied from 0.1% to 0.3% for ICP-OES analyses, while in the case of IC they were between 0.2% and 1.2%. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814
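    The stoichiometric step at the heart of the method — one sulfur atom per GSH molecule — reduces to a single molar-mass ratio. A sketch under that assumption (the molar masses are standard values; the sample figure is made up, not the paper's data):

```python
# Molar masses in g/mol; glutathione (GSH) = C10H17N3O6S carries
# exactly one sulfur atom per molecule.
M_GSH = 307.32
M_S = 32.06

def gsh_from_sulfur(sulfur_mass):
    """Convert a measured sulfur mass to the equivalent GSH mass via
    the 1:1 S:GSH stoichiometry; any mass unit works as long as it is
    used consistently. A sketch of the traceability idea, not the
    paper's measurement procedure."""
    return sulfur_mass * M_GSH / M_S

# A hypothetical 1.043 mg sulfur result corresponds to ~10 mg GSH
gsh_mg = gsh_from_sulfur(1.043)
```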

  20. Novel Modeling of Task vs. Rest Brain State Predictability Using a Dynamic Time Warping Spectrum: Comparisons and Contrasts with Other Standard Measures of Brain Dynamics.

    Science.gov (United States)

    Dinov, Martin; Lorenz, Romy; Scott, Gregory; Sharp, David J; Fagerholm, Erik D; Leech, Robert

    2016-01-01

    Dynamic time warping, or DTW, is a powerful and domain-general sequence alignment method for computing a similarity measure. Dynamic programming-based techniques such as DTW are now the backbone and driver of most bioinformatics methods and discoveries. In neuroscience it has had far less use, though this has begun to change. We wanted to explore new ways of applying DTW, not simply as a measure with which to cluster or compare similarity between features, but in a conceptually different way. We have used DTW to provide a more interpretable spectral description of the data, compared to standard approaches such as the Fourier and related transforms. The DTW approach and standard discrete Fourier transform (DFT) are assessed against benchmark measures of neural dynamics. These include EEG microstates, EEG avalanches, and the sum squared error (SSE) from a multilayer perceptron (MLP) prediction of the EEG time series, and simultaneously acquired FMRI BOLD signal. We explored the relationships between these variables of interest in an EEG-FMRI dataset acquired during a standard cognitive task, which allowed us to explore how DTW differentially performs in different task settings. We found that despite strong correlations between DTW and DFT spectra, DTW was a better predictor for almost every measure of brain dynamics. Using these DTW measures, we show that predictability is almost always higher in task than in rest states, which is consistent with other theoretical and empirical findings, providing additional evidence for the utility of the DTW approach.

  1. Novel modeling of task versus rest brain state predictability using a dynamic time warping spectrum: comparisons and contrasts with other standard measures of brain dynamics

    Directory of Open Access Journals (Sweden)

    Martin eDinov

    2016-05-01

    Dynamic time warping, or DTW, is a powerful and domain-general sequence alignment method for computing a similarity measure. Dynamic programming-based techniques such as DTW are now the backbone and driver of most bioinformatics methods and discoveries. In neuroscience it has had far less use, though this has begun to change. We wanted to explore new ways of applying DTW, not simply as a measure with which to cluster or compare similarity between features, but in a conceptually different way. We have used DTW to provide a more interpretable spectral description of the data, compared to standard approaches such as the Fourier and related transforms. The DTW approach and standard discrete Fourier transform (DFT) are assessed against benchmark measures of neural dynamics. These include EEG microstates, EEG avalanches and the sum squared error (SSE) from a multilayer perceptron (MLP) prediction of the EEG time series, and simultaneously acquired FMRI BOLD signal. We explored the relationships between these variables of interest in an EEG-FMRI dataset acquired during a standard cognitive task, which allowed us to explore how DTW differentially performs in different task settings. We found that despite strong correlations between DTW and DFT spectra, DTW was a better predictor for almost every measure of brain dynamics. Using these DTW measures, we show that predictability is almost always higher in task than in rest states, which is consistent with other theoretical and empirical findings, providing additional evidence for the utility of the DTW approach.
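    The alignment underlying both records above is the textbook dynamic-programming DTW recurrence; the paper's spectral use of DTW builds on it. A minimal sketch of the generic algorithm (not the authors' implementation):

```python
def dtw_distance(x, y):
    """Classic dynamic-programming DTW between two 1-D sequences.

    Returns the minimal cumulative |x[i]-y[j]| alignment cost; smaller
    means more similar. Generic textbook algorithm, not the spectral
    variant developed in the paper.
    """
    n, m = len(x), len(y)
    INF = float("inf")
    # cost[i][j] = best alignment cost of x[:i] with y[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# A shifted copy still aligns at zero cost, unlike a point-wise metric
print(dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # 0.0
```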

  2. Evaluation of the Standard Ion Transfer Potentials for PVC Plasticized Membranes from Voltammetric Measurements

    Czech Academy of Sciences Publication Activity Database

    Langmaier, Jan; Stejskalová, Květoslava; Samec, Zdeněk

    2001-01-01

    Vol. 496, No. 1 (2001), pp. 143-147, ISSN 0022-0728. [Symposium in Kyoto. Kyoto, 02.03.2000] R&D Projects: GA AV ČR IAA4040902 Institutional research plan: CEZ:AV0Z4040901 Keywords: ion voltammetry * PVC plasticized membrane * standard ion transfer potential Subject RIV: CG - Electrochemistry Impact factor: 1.960, year: 2001

  3. 38 CFR 21.7672 - Measurement of courses not leading to a standard college degree.

    Science.gov (United States)

    2010-07-01

    ... basis as provided in § 21.7670 a reservist's enrollment in a course not leading to a standard college... attendance as required in paragraph (c) of this section. If the educational institution does not require at least the same number of clock-hours of attendance as required in paragraph (c) of this section, VA will...

  4. The Legislative Requirements for Measuring Quality in Transnational Education: Understanding Divergence While Maintaining Standards

    Science.gov (United States)

    Bentley, Duncan; Henderson, Fiona; Lim, Choon Boey

    2017-01-01

    Australian universities have been actively engaged in transnational education since the 1990s. The challenges of assuring quality have seen a changing regulatory framework increasingly designed to ensure equivalence of standards wherever a course of study is offered and however it is delivered. Transnational Higher Education has grown…

  5. Non-invasive measurement of adrenal response after standardized exercise tests in prepubertal children

    NARCIS (Netherlands)

    Heijsman, Sigrid M.; Koers, Nicoline F.; Bocca, Gianni; van der Veen, Betty S.; Appelhof, Maaike; Kamps, Arvid W. A.

    Objective: To determine the feasibility of non-invasive evaluation of adrenal response in healthy prepubertal children by standardized exercise tests. Methods: On separate occasions, healthy prepubertal children performed a submaximal cycling test, a maximal cycling test, and a 20-m shuttle-run

  6. Standard test method for measurement of roll wave optical distortion in heat-treated flat glass

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method is applicable to the determination of the peak-to-valley depth and peak-to-peak distances of the out-of-plane deformation referred to as roll wave which occurs in flat, heat-treated architectural glass substrates processed in a heat processing continuous or oscillating conveyance oven. 1.2 The values stated in inch-pound units are to be regarded as standard. The values given in parentheses are mathematical conversions to SI units that are provided for information only and are not considered standard. 1.3 This test method does not address other flatness issues like edge kink, ream, pocket distortion, bow, or other distortions outside of roll wave as defined in this test method. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  7. Is the Green Key standard the golden key for sustainability measurement in the hospitality sector?

    NARCIS (Netherlands)

    Rietbergen, M.G.; Van Rheede, A.

    2014-01-01

    The Green Key is an eco-rating program that aims at promoting sustainable business practices in the hospitality sector. The Green Key assesses amongst others the sustainable management of energy, water and waste within hotels and other hospitality firms. The Green Key standard awards points if

  8. Comparison of standard fibrinogen measurement methods with fibrin clot firmness assessed by thromboelastometry in patients with cirrhosis.

    Science.gov (United States)

    Vucelic, Dragica; Jesic, Rada; Jovicic, Snezana; Zivotic, Maja; Grubor, Nikica; Trajkovic, Goran; Canic, Ivana; Elezovic, Ivo; Antovic, Aleksandra

    2015-06-01

    The Clauss fibrinogen method and thrombin clotting time (TCT) are still routinely used in patients with cirrhosis to define fibrinogen concentration and clotting potential. The thromboelastometric functional fibrinogen FIBTEM assay evaluates the strength of fibrin-based clots in whole blood, providing information on both quantitative deficit and fibrin polymerization disorders. To compare these three methods of assessing fibrinogen in patients with cirrhosis of different aetiologies, characterized by impairment in fibrinogen concentration as well as functional aberrance. Sixty patients with alcoholic and 24 patients with cholestatic cirrhosis were included (Child-Pugh score (CPs)A, n=24; B, n=32; C, n=28). All parameters were compared with those from a control group. Maximum clot firmness (MCF) in the FIBTEM test was assessed in regard to its relevance in detection of qualitative fibrinogen disorders in comparison with results obtained by standard measurement methods, i.e. the Clauss fibrinogen method and TCT. With increased cirrhosis severity, fibrinogen and FIBTEM-MCF levels significantly declined (p=0.002), while TCT was significantly prolonged (p=0.002). In all CPs groups, fibrinogen strongly correlated with FIBTEM-MCF (r=0.77, r=0.72, r=0.74; pmeasurement in cirrhotic patients, especially in evaluating fibrin polymerization disorders in these patients. Further studies are needed to evaluate the usefulness of this assay in predicting bleeding complications in cirrhotic patients as well as monitoring replacement treatment.

  9. Measurement of area and personal breathing zone concentrations of diesel particulate matter (DPM) during oil and gas extraction operations, including hydraulic fracturing.

    Science.gov (United States)

    Esswein, Eric J; Alexander-Scott, Marissa; Snawder, John; Breitenstein, Michael

    2018-01-01

    Diesel engines serve many purposes in modern oil and gas extraction activities. Diesel particulate matter (DPM) emitted from diesel engines is a complex aerosol that may cause adverse health effects depending on exposure dose and duration. This study reports on personal breathing zone (PBZ) and area measurements for DPM (expressed as elemental carbon) during oil and gas extraction operations including drilling, completions (which includes hydraulic fracturing), and servicing work. Researchers at the National Institute for Occupational Safety and Health (NIOSH) collected 104 full-shift air samples (49 PBZ and 55 area) in Colorado, North Dakota, Texas, and New Mexico during a four-year period from 2008-2012. The arithmetic mean (AM) of the full-shift TWA PBZ samples was 10 µg/m³; measurements ranged from 0.1-52 µg/m³. The geometric mean (GM) for the PBZ samples was 7 µg/m³. The AM of the TWA area measurements was 17 µg/m³, with a range of 0.1-68 µg/m³. The GM for the area measurements was 9.5 µg/m³. The difference between the GMs of the PBZ samples and area samples was not statistically significant (P > 0.05). Neither the Occupational Safety and Health Administration (OSHA), NIOSH, nor the American Conference of Governmental Industrial Hygienists (ACGIH) has established an occupational exposure limit (OEL) for DPM. However, the State of California, Department of Health Services lists a time-weighted average (TWA) OEL for DPM as elemental carbon (EC) exposure of 20 µg/m³. Five of 49 (10.2%) PBZ TWA measurements exceeded the 20 µg/m³ EC criterion. These measurements were collected on Sandmover and Transfer Belt (T-belt) Operators, Blender and Chemical Truck Operators, and Water Transfer Operators during hydraulic fracturing operations. Recommendations to minimize DPM exposures include elimination (locating diesel-driven pumps away from well sites), substitution (use of alternative fuels), engineering controls using advanced emission control
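    The abstract reports both an arithmetic and a geometric mean because exposure data are typically right-skewed, and the geometric mean downweights occasional high readings. A sketch of the two statistics with made-up values (not the NIOSH data):

```python
import math

def exposure_summary(samples_ugm3):
    """Arithmetic mean (AM) and geometric mean (GM) of a set of
    exposure measurements, the two summary statistics in the abstract.
    GM = exp(mean of log values); all values must be positive."""
    n = len(samples_ugm3)
    am = sum(samples_ugm3) / n
    gm = math.exp(sum(math.log(v) for v in samples_ugm3) / n)
    return am, gm

# Made-up measurements (µg/m³), for illustration only
am, gm = exposure_summary([2.0, 8.0, 4.0, 16.0])
# Right-skewed data pull the AM above the GM: here AM 7.5, GM ~5.66
```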

  10. Prediction of preterm birth in multiple pregnancies: development of a multivariable model including cervical length measurement at 16 to 21 weeks' gestation.

    Science.gov (United States)

    van de Mheen, Lidewij; Schuit, Ewoud; Lim, Arianne C; Porath, Martina M; Papatsonis, Dimitri; Erwich, Jan J; van Eyck, Jim; van Oirschot, Charlotte M; Hummel, Piet; Duvekot, Johannes J; Hasaart, Tom H M; Groenwold, Rolf H H; Moons, Karl G M; de Groot, Christianne J M; Bruinse, Hein W; van Pampus, Maria G; Mol, Ben W J

    2014-04-01

    To develop a multivariable prognostic model for the risk of preterm delivery in women with multiple pregnancy that includes cervical length measurement at 16 to 21 weeks' gestation and other variables. We used data from a previous randomized trial. We assessed the association between maternal and pregnancy characteristics, including cervical length measurement at 16 to 21 weeks' gestation, and time to delivery using multivariable Cox regression modelling. Performance of the final model was assessed for the outcomes of preterm and very preterm delivery using calibration and discrimination measures. We studied 507 women, of whom 270 (53%) delivered preterm. The models for preterm and very preterm delivery had a c-index of 0.68 (95% CI 0.63 to 0.72) and 0.68 (95% CI 0.62 to 0.75), respectively, and showed good calibration. In women with a multiple pregnancy, the risk of preterm delivery can be assessed with a multivariable model incorporating cervical length and other predictors.

  11. Measurement of fermionic couplings of the Standard Model Higgs boson using the bb, tautau and mumu decay channels with the ATLAS detector

    CERN Document Server

    Wang, Song-Ming; The ATLAS collaboration

    2017-01-01

    In this report we present the latest results from the ATLAS experiment on the measurements of fermionic couplings of the Standard Model Higgs boson via searches for the Higgs boson in the $b\bar{b}$, $\tau^{+}\tau^{-}$ and $\mu^{+}\mu^{-}$ decay channels. The searches are performed on proton-proton collision data samples collected during Run 1 and the first two years of Run 2 data taking by ATLAS. These results also include the most recent measurements, in which ATLAS observed evidence of the $H \rightarrow b\bar{b}$ decay in the $WH$ and $ZH$ associated production channels.

  12. A Two-Sinker Densimeter for Accurate Measurements of the Density of Natural Gases at Standard Conditions

    Science.gov (United States)

    Richter, Markus; Kleinrahm, Reiner; Glos, Stefan; Wagner, Wolfgang; Span, Roland; Schley, Peter; Uhrig, Martin

    2010-05-01

    A special reference densimeter has been developed for accurate measurements of densities of natural gases and multicomponent gas mixtures at standard conditions of temperature and pressure (T_s = 273.15 K and p_s = 0.101325 MPa). The densimeter covers the range from 0.7 kg·m⁻³ to 1.3 kg·m⁻³; the total measurement uncertainty in density is 0.020 % (95 % level of confidence). The measurement principle used is the two-sinker method, which is based on the Archimedes buoyancy principle. The certified calibration laboratory of E.ON Ruhrgas AG, Germany, uses this densimeter to verify the standard densities of certified calibration gases (binary and multicomponent gas mixtures). Moreover, the densimeter is used to determine the compositions of commercially available binary gas mixtures with a small uncertainty of (0.01-0.03) mol%.
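    The differential Archimedes idea behind the two-sinker method can be written as a single formula: buoyancy reduces each sinker's apparent mass by ρ·V, so weighing two sinkers of (nearly) equal mass but different volume isolates ρ while common weighing errors cancel. A sketch of that relation (symbols and the numbers below are illustrative, not the instrument's calibration data):

```python
def gas_density(m1, m2, w1, w2, v1, v2):
    """Two-sinker (differential Archimedes) gas-density estimate.

    m1, m2: true sinker masses (kg); w1, w2: apparent masses weighed
    while the sinkers are immersed in the gas (kg); v1, v2: sinker
    volumes (m^3). Buoyancy reduces each apparent mass by rho*V, so
        rho = ((m1 - m2) - (w1 - w2)) / (v1 - v2)
    and systematic weighing errors common to both sinkers cancel.
    """
    return ((m1 - m2) - (w1 - w2)) / (v1 - v2)

# Equal-mass sinkers (60 g) with volumes 30 cm³ and 10 cm³ in a gas of
# density 1.2 kg/m³: each apparent mass drops by rho*V.
rho = gas_density(0.060, 0.060,
                  0.060 - 1.2 * 3e-5, 0.060 - 1.2 * 1e-5,
                  3e-5, 1e-5)
```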

  13. The TERRA-PNW Dataset: A New Source for Standardized Plant Trait, Forest Carbon Cycling, and Soil Properties Measurements from the Pacific Northwest US, 2000-2014.

    Science.gov (United States)

    Berner, L. T.; Law, B. E.

    2015-12-01

    Plant traits include physiological, morphological, and biogeochemical characteristics that in combination determine a species' sensitivity to environmental conditions. Standardized, co-located, and geo-referenced species- and plot-level measurements are needed to address variation in species sensitivity to climate change impacts and for ecosystem process model development, parameterization, and testing. We present a new database of plant trait, forest carbon cycling, and soil property measurements derived from multiple TERRA-PNW projects in the Pacific Northwest US, spanning 2000-2014. The database includes measurements from over 200 forest plots across Oregon and northern California, where the data were explicitly collected for scaling and modeling regional terrestrial carbon processes with models such as Biome-BGC and the Community Land Model. Some of the data are co-located at AmeriFlux sites in the region. The database currently contains leaf trait measurements (specific leaf area, leaf longevity, leaf carbon and nitrogen) from over 1,200 branch samples and 30 species, as well as plot-level biomass and productivity components, and soil carbon and nitrogen. Standardized protocols were used across projects, as summarized in an FAO protocols document. The database continues to expand and will include agricultural crops. The database will be hosted by the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). We hope that other regional databases will become publicly available to help enable Earth System Modeling to simulate species-level sensitivity to climate at regional to global scales.

  14. Global Positioning and Inertial Measurements Range Safety Tracking Systems Commonality Standard

    Science.gov (United States)

    2011-02-01

    requalification test series may be needed to recharacterize RTS performance. Paragraphs A2.1 (b)(c)(d) allow alternative methods to the standard...subjected to some level of electrical functional requalification testing to demonstrate that component performance has not changed. Environmental...level of requalification testing can only validate the environmental survivability of a component with added piece-parts. The extent of testing

  15. Introducing Vouchers and Standardized Tests for Higher Education in Russia: Expectations and Measurements

    OpenAIRE

    Osipian, Ararat

    2008-01-01

    The reform of higher education in Russia, based on standardized tests and educational vouchers, was intended to reduce inequalities in access to higher education. The initiative with the vouchers has failed and by now is already forgotten, while the national test is planned to be introduced nationwide in 2009. The national test, intended to replace the present corrupt system of entry examinations, has experienced numerous problems so far and will likely have even more problems in the future. This ...

  16. Fourier transform power spectrum is a potential measure of tissue alignment in standard MRI: A multiple sclerosis study.

    Directory of Open Access Journals (Sweden)

    Shrushrita Sharma

    Loss of tissue coherency in brain white matter is found in many neurological diseases such as multiple sclerosis (MS). While several approaches have been proposed to evaluate white matter coherency, including fractional anisotropy and fiber tracking in diffusion-weighted imaging, few are available for standard magnetic resonance imaging (MRI). Here we present an image post-processing method for this purpose based on the Fourier transform (FT) power spectrum. T2-weighted images were collected from 19 patients (10 relapsing-remitting and 9 secondary progressive MS) and 19 age- and gender-matched controls. Image processing steps included: computation, normalization, and thresholding of the FT power spectrum; determination of tissue alignment profile and dominant alignment direction; and calculation of alignment complexity using a new measure named angular entropy. To test the validity of this method, we used a highly organized brain white matter structure, the corpus callosum. Six regions of interest were examined from the left, central and right aspects of both genu and splenium. We found that the dominant orientation of each ROI derived from our method was significantly correlated with the predicted directions based on anatomy. There was greater angular entropy in patients than controls, with a trend to be greater in secondary progressive MS patients. These findings suggest that it is possible to detect tissue alignment and anisotropy using traditional MRI, which are routinely acquired in clinical practice. Analysis of the FT power spectrum may become a new approach for advancing the evaluation and management of patients with MS and similar disorders. Further confirmation is warranted.

  17. Fourier transform power spectrum is a potential measure of tissue alignment in standard MRI: A multiple sclerosis study.

    Science.gov (United States)

    Sharma, Shrushrita; Zhang, Yunyan

    2017-01-01

    Loss of tissue coherency in brain white matter is found in many neurological diseases such as multiple sclerosis (MS). While several approaches have been proposed to evaluate white matter coherency including fractional anisotropy and fiber tracking in diffusion-weighted imaging, few are available for standard magnetic resonance imaging (MRI). Here we present an image post-processing method for this purpose based on Fourier transform (FT) power spectrum. T2-weighted images were collected from 19 patients (10 relapsing-remitting and 9 secondary progressive MS) and 19 age- and gender-matched controls. Image processing steps included: computation, normalization, and thresholding of FT power spectrum; determination of tissue alignment profile and dominant alignment direction; and calculation of alignment complexity using a new measure named angular entropy. To test the validity of this method, we used a highly organized brain white matter structure, corpus callosum. Six regions of interest were examined from the left, central and right aspects of both genu and splenium. We found that the dominant orientation of each ROI derived from our method was significantly correlated with the predicted directions based on anatomy. There was greater angular entropy in patients than controls, and a trend to be greater in secondary progressive MS patients. These findings suggest that it is possible to detect tissue alignment and anisotropy using traditional MRI, which are routinely acquired in clinical practice. Analysis of FT power spectrum may become a new approach for advancing the evaluation and management of patients with MS and similar disorders. Further confirmation is warranted.
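    One way to read the pipeline in the two records above: bin the 2-D FFT power spectrum by orientation and take the Shannon entropy of that angular distribution, so that strongly aligned textures score low and isotropic textures score high. A simplified numpy sketch (it omits the paper's normalization, thresholding, and alignment-profile steps; `n_bins` and the toy images are illustrative):

```python
import numpy as np

def angular_entropy(img, n_bins=36):
    """Shannon entropy of spectral power binned by orientation.

    Lower entropy = more aligned texture; an isotropic texture spreads
    power over all orientation bins. Simplified sketch, not the
    authors' full pipeline.
    """
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = power.shape
    y, x = np.indices((h, w))
    # Orientation of each spectral pixel, folded into [0, pi)
    ang = np.arctan2(y - h // 2, x - w // 2) % np.pi
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=power.ravel(), minlength=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
noise = rng.standard_normal((64, 64))                     # isotropic
stripes = np.tile(np.sin(np.arange(64) / 2.0), (64, 1))   # aligned
assert angular_entropy(stripes) < angular_entropy(noise)
```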

  18. Using patient reported outcome measures in health services: A qualitative study on including people with low literacy skills and learning disabilities

    Directory of Open Access Journals (Sweden)

    Jahagirdar Deepa

    2012-11-01

    Abstract Background Patient-reported outcome measures (PROMs) are self-report measures of health status increasingly promoted for use in healthcare quality improvement. However, people with low literacy skills or learning disabilities may find PROMs hard to complete. Our study investigated stakeholder views on the accessibility and use of PROMs to develop suggestions for more inclusive practice. Methods Taking PROMs recommended for chronic obstructive pulmonary disease (COPD) as an example, we conducted 8 interviews with people with low literacy skills and/or learning disabilities, and 4 focus groups with 20 health professionals and people with COPD. Discussions covered the format and delivery of PROMs using the EQ-5D and St George Respiratory Questionnaire as prompts. Thematic framework analysis focused on three main themes: Accessibility, Ease of Use, and Contextual Factors. Results Accessibility included issues concerning the questionnaire format, and suggestions for improvement included larger font sizes and more white space. Ease of Use included discussion about PROMs’ administration. While health professionals suggested PROMs could be completed in waiting rooms, patients preferred settings with more privacy and where they could access help from people they know. Contextual Factors included other challenges and wider issues associated with completing PROMs. While health professionals highlighted difficulties created by the system in managing patients with low literacy/learning disabilities, patient participants stressed that understanding the purpose of PROMs was important to reduce intimidation. Conclusions Adjusting PROMs’ format, giving an explicit choice of where patients can complete them, and clearly conveying PROMs’ purpose and benefit to patients may help to prevent inequality when using PROMs in health services.

  19. The Measurement of Quality of Semantic Standards: the Application of a Quality Model on the SETU standard for eGovernment

    NARCIS (Netherlands)

    Folmer, E.J.A.; Bekkum, M.A. van; Oude Luttighuis, P.; Hillegersberg, J. van

    2011-01-01

    eGovernment interoperability should be dealt with using high-quality standards. A quality model for standards is presented based on knowledge from the software engineering domain. In the tradition of action research the model is used on the SETU standard, a standard that is mandatory in the public

  1. Effectiveness of interventions on physical activity in overweight or obese children: a systematic review and meta-analysis including studies with objectively measured outcomes.

    Science.gov (United States)

    Nooijen, C F J; Galanti, M R; Engström, K; Möller, J; Forsell, Y

    2017-02-01

    There is no consensus on interventions to be recommended in order to promote physical activity among overweight or obese children. The objective of this review was to assess the effects on objectively measured physical activity, of interventions promoting physical activity among overweight or obese children or adolescents, compared to no intervention or to interventions without a physical activity component. Publications up to December 2015 were located through electronic searches for randomized controlled trials resulting in inclusion of 33 studies. Standardized mean differences from baseline to post-intervention and to long-term follow-up were determined for intervention and control groups and meta-analysed using random effects models. The meta-analysis showed that interventions had no effect on total physical activity of overweight and obese children, neither directly post-intervention (-0.02 [-0.15, 0.11]) nor at long-term follow-up (0.07 [-0.27, 0.40]). Separate analyses by typology of intervention (with or without physical fitness, behavioural or environmental components) showed similar results (no effect). In conclusion, there is no evidence that currently available interventions are able to increase physical activity among overweight or obese children. This questions the contribution of physical activity to the treatment of overweight and obesity in children in the studied interventions and calls for other treatment strategies. © 2017 World Obesity Federation.
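The pooling described above, standardized mean differences combined under a random-effects model, can be sketched as follows. This is a hedged illustration: the three studies are invented, and the DerSimonian-Laird estimator shown here is one common choice, not necessarily the exact estimator used in the review.

```python
import numpy as np

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Hedges' g) and its variance."""
    sd_pooled = np.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    j = 1 - 3 / (4 * (n_t + n_c) - 9)          # small-sample correction factor
    g = j * d
    var_g = (n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c))
    return g, var_g

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate with a 95% CI."""
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1 / variances                            # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed)**2)         # Cochran's Q heterogeneity statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance estimate
    w_re = 1 / (variances + tau2)
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies: (mean, SD, n) for intervention and control arms
g1, v1 = hedges_g(52, 50, 10, 10, 40, 40)
g2, v2 = hedges_g(49, 50, 12, 11, 60, 55)
g3, v3 = hedges_g(51, 50, 9, 10, 30, 35)
pooled, ci = random_effects_pool([g1, g2, g3], [v1, v2, v3])
print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

A confidence interval that straddles zero, as in the review's reported effects of -0.02 [-0.15, 0.11], is read as no detectable effect.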

  2. The use of a knowledge translation program to increase use of standardized outcome measures in an outpatient pediatric physical therapy clinic: administrative case report.

    Science.gov (United States)

    Schreiber, Joseph; Marchetti, Gregory F; Racicot, Brook; Kaminski, Ellen

    2015-04-01

    Pediatric physical therapists face many challenges related to the application of research evidence to clinical practice. A multicomponent knowledge translation (KT) program may be an effective strategy to support practice change. The purpose of this case report is to describe the use of a KT program to improve the knowledge and frequency of use of standardized outcome measures by pediatric physical therapists practicing in an outpatient clinic. This program occurred at a pediatric outpatient facility with 1 primary clinic and 3 additional satellite clinics, and a total of 17 physical therapists. The initial underlying problem was inconsistency across staff recommendations for frequency and duration of physical therapist services. Formal and informal discussion with the department administrator and staff identified a need for increased use of standardized outcome measures to inform these decisions. The KT program to address this need spanned 6 months and included identification of barriers, the use of a knowledge broker, multiple workshop and practice sessions, online and hard-copy resources, and ongoing evaluation of the KT program with dissemination of results to staff. Outcome measures included pre- and post-knowledge assessment and self-report surveys and chart review data on use of outcome measures. Participants (N=17) gained knowledge and increased the frequency of use of standardized outcome measures based on data from self-report surveys, a knowledge assessment, and chart reviews. Administrators and others interested in supporting practice change in physical therapy may consider implementing a systematic KT program that includes a knowledge broker, ongoing engagement with staff, and a variety of accessible resources. © 2015 American Physical Therapy Association.

  3. 77 FR 37409 - Request for Domains, Instruments, and Measures for Development of a Standardized Instrument for...

    Science.gov (United States)

    2012-06-21

    ..., communication, coordination of care, customer service), instruments, and measures for measuring the level of enrollee satisfaction with qualified health plans plus the experience of the consumer interacting with the health care system and the experience of the consumer interacting with the Exchange (for example...

  4. Quality of healthcare in Canada: potential for a pan-Canadian measurement standard.

    Science.gov (United States)

    Florizone, Dan

    2012-01-01

    Saskatchewan has embarked on a journey to transform the quality of its healthcare. Through our experiences, we have learned many lessons that could be useful to the development of a pan-Canadian system of measurement aimed at bettering care. However, measurement in isolation is insufficient to achieve improved healthcare. The system needs to be linked to a common improvement agenda. Creating a systematic approach to improvement is only possible through developing the capacities of leaders and front-line staff, by alignment through a common purpose, by focusing on value from the perspective of the customer and by creating measures backed by best practice that are transparent and accountable.

  5. Standard practice for measurement of time-of-wetness on surfaces exposed to wetting conditions as in atmospheric corrosion testing

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1989-01-01

    1.1 This practice covers a technique for monitoring time-of-wetness (TOW) on surfaces exposed to cyclic atmospheric conditions which produce depositions of moisture. 1.2 The practice is also applicable for detecting and monitoring condensation within a wall or roof assembly and in test apparatus. 1.3 Exposure site calibration or characterization can be significantly enhanced if TOW is measured for comparison with other sites, particularly if this data is used in conjunction with other site-specific instrumentation techniques. 1.4 The values stated in SI units are to be regarded as the standard. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  6. Legal applications of the "best interest of the child" standard: judicial rationalization or a measure of institutional competence?

    Science.gov (United States)

    Carbone, June

    2014-10-01

    This article explores the use of the best interest standard in the context of third-party interventions in ongoing parent-child relationships. I start by examining the history of the best interest standard and show that it has had different meanings in different eras. I then address the nature of the family and the question of whether interests beyond those addressed in the child's best interest standard are a legitimate part of family decision-making. I conclude that ongoing families are entitled to at least a measure of deference in their decisions about their children. Third-party interventions, such as those of doctors or judges, should require something more than simply a difference of opinion about where the child's interests lie. Copyright © 2014 by the American Academy of Pediatrics.

  7. Standardizing lightweight deflectometer modulus measurements for compaction quality assurance : research summary.

    Science.gov (United States)

    2017-09-01

    The mechanistic-empirical pavement design method requires the elastic resilient modulus as the key input for characterization of geomaterials. Current density-based QA procedures do not measure resilient modulus. Additionally, the density-based metho...

  8. Top production measurements with ATLAS: probes at the frontier of the Standard Model

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    The results are compared to predictions of Monte Carlo generators implementing NLO matrix elements matched with parton showers and NNLO QCD theory calculations and allow for an alternative measurement of the top quark mass.

  9. Accurate quantitation standards of glutathione via traceable sulfur measurement by inductively coupled plasma optical emission spectrometry and ion chromatography

    Directory of Open Access Journals (Sweden)

    L. Rastogi

    2013-06-01

The quantitative analysis of glutathione (GSH) is important in fields such as medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well-characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometrically existing sulfur content in purified GSH offers an approach for its quantitation, and calibration through an appropriately characterized reference material (CRM) for sulfur would provide a methodology for the certification of GSH quantity that is traceable to SI (the International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the “high performance” methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U), expressed at the 95% confidence interval, varied from 0.1% to 0.3% for ICP-OES analyses, while for IC they were between 0.2% and 1.2%. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. Keywords: Glutathione, High performance methodology, MW-UV photolysis, Measurement uncertainty
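The stoichiometric link the method exploits can be illustrated in a few lines: GSH (C10H17N3O6S) contains exactly one sulfur atom per molecule, so a traceable sulfur mass fixes the GSH mass. The measured sulfur value below is hypothetical, not taken from the paper.

```python
# GSH contains one sulfur atom per molecule, so measured sulfur mass
# converts directly to GSH mass via the ratio of molar masses.
M_GSH = 307.32   # g/mol, glutathione (C10H17N3O6S)
M_S = 32.06      # g/mol, sulfur

def gsh_from_sulfur(mass_s_mg):
    """Mass of GSH (mg) implied by a traceably measured sulfur mass (mg)."""
    return mass_s_mg * M_GSH / M_S

measured_s = 1.043  # mg of sulfur found by ICP-OES (hypothetical value)
print(f"GSH = {gsh_from_sulfur(measured_s):.2f} mg")  # → GSH = 10.00 mg
```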

  10. Standard test method for measuring waste glass or glass ceramic durability by vapor hydration test

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

1.1 The vapor hydration test method can be used to study the corrosion of waste forms such as glasses and glass ceramics upon exposure to water vapor at elevated temperatures. In addition, the alteration phases that form can be used as indicators of those phases that may form under repository conditions. These tests, which allow altering of glass at a high surface-area-to-solution-volume ratio, provide useful information regarding the alteration phases that are formed, the disposition of radioactive and hazardous components, and the alteration kinetics under the specific test conditions. This information may be used in performance assessment (for example, McGrail et al., 2002 (1)). 1.2 This test method must be performed in accordance with all quality assurance requirements for acceptance of the data. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practice...

  11. Standardization of Compton suppression gamma spectrometry system for the measurement of 129I in environmental samples

    International Nuclear Information System (INIS)

    Baburajan, A.; Sudheendran, V.; Dalvi, Sushant S.; Ravi, P.M.; Tripathi, R.M.

    2014-01-01

This paper presents the results of standardization of a Compton suppression gamma spectrometry system for the determination of 129I in various environmental samples in the air - grass - milk (thyroid) - man pathway. The Compton suppression system consists of an HPGe detector surrounded by an assembly of NaI(Tl) detectors, used to reduce the contribution of scattered gamma rays in the gamma spectrum that originate within the HPGe detector. A Compton suppression system consisting of an N-type HPGe detector of 50% relative efficiency, with a low window thickness of about 0.3 μm of boron, and a NaI(Tl) detector as the Compton shield was commissioned at ESL, Tarapur, and various performance parameters were evaluated and documented. Being a reverse-electrode HPGe system with a low window thickness, it can be used for the detection of low-energy gamma lines from 3.0 keV to >10 MeV (Ortec). Hence, it is proposed to use the system for the determination of 129I in environmental samples. The paper deals with the standardization of the system for efficiency and Minimum Detectable Activity (MDA) for various sample geometries in the detection of low-level 129I in the various pathways of its exposure.
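A minimal sketch of how an MDA of this kind is commonly computed, using the standard Currie formula; all numbers below (background counts, efficiency, counting time, sample quantity) are hypothetical and not taken from the paper.

```python
import math

def currie_mda(background_counts, efficiency, gamma_yield, live_time_s, sample_qty):
    """Currie MDA: (2.71 + 4.65*sqrt(B)) / (eff * yield * t * quantity).

    Returns activity per unit sample quantity (e.g. Bq/kg if quantity is in kg).
    """
    detectable_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return detectable_counts / (efficiency * gamma_yield * live_time_s * sample_qty)

# Hypothetical 129I measurement at the 39.6 keV line on a suppressed HPGe:
mda_bq_per_kg = currie_mda(background_counts=400, efficiency=0.05,
                           gamma_yield=0.075, live_time_s=60000, sample_qty=1.0)
print(f"MDA ≈ {mda_bq_per_kg:.2e} Bq/kg")
```

Compton suppression lowers the background term B under the peak, which is exactly what drives the MDA down in this formula.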

  12. Application of standard and modified Eh-Star test method for induction motor stray load losses and efficiency measurement

    Directory of Open Access Journals (Sweden)

    Koprivica Branko

    2012-01-01

The aim of this paper is to present the application of a simple and accurate method for the measurement of stray load losses (additional load losses) in induction machines: the Eh-Star method given in the IEC 60034-2-1 standard. The theoretical background of the method and the measurement procedure are explained. All measurements have been performed using modern measurement systems based on a personal computer, data acquisition cards, and LabVIEW software. From the measured stray load losses, the efficiency of the induction motor has been calculated. The efficiency obtained has been compared with the IEC standard efficiency classes in order to determine the efficiency class of the tested motor. Additionally, measurements have been performed using the modified Eh-Star method, and the results have been compared with those obtained using the standard Eh-Star method. The advantages and disadvantages of both methods are analyzed.

  13. Wrist blood pressure-measuring devices: a comparative study of accuracy with a standard auscultatory method using a mercury manometer.

    Science.gov (United States)

    Altunkan, Sekip; Yildiz, Sevil; Azer, Sabir

    2002-10-01

In this study, we compared two wrist blood pressure-measuring devices, the Omron RX and the Nissei WS-310, against a mercury manometer. A total of 152 subjects attending an out-patient hypertensive clinic were recruited from a randomized blood pressure survey; 87 patients (mean age 44.4 +/- 14.5 years) were selected according to the Association for the Advancement of Medical Instrumentation/British Hypertension Society standards. Device validation was assessed through sequential same-arm readings compared with readings taken with a mercury sphygmomanometer by two trained observers. There were no differences between the observers and the monitors for diastolic readings (2.8 +/- 4.8 mmHg for the Omron and 4.2 +/- 6.4 mmHg for the Nissei) according to the Association for the Advancement of Medical Instrumentation standards. The largest standard deviations (8.3 mmHg for the Omron and 8.8 mmHg for the Nissei) were seen for systolic readings recorded by the observers and the monitors. According to the British Hypertension Society standards, the Omron achieved an A grade for diastolic readings and a B grade for systolic readings within 5 and 10 mmHg; the Nissei monitor likewise achieved an A grade for diastolic readings and a B grade for systolic readings. Patients found the wrist oscillometric devices tested to be comfortable and easy to use. These devices are appropriate for measuring diastolic blood pressure according to the standards, but the reliability of both devices decreased when measuring systolic blood pressure.

  14. [An integral approach to insomnia in primary care: Non-pharmacological and phytotherapy measures compared to standard treatment].

    Science.gov (United States)

    Viniegra Domínguez, M Adela; Parellada Esquius, Neus; Miranda de Moraes Ribeiro, Rafaela; Parellada Pérez, Laura Mar; Planas Olives, Carme; Momblan Trejo, Cristina

    2015-01-01

Insomnia is a sleep disorder in which there is an inability to fall asleep or to stay asleep. At some point in life, 50% of adults suffer from it, usually in stress situations. To evaluate the impact of sleep hygiene measures, relaxation techniques, and herbal medicine in dealing with insomnia, compared with standard measures (drug treatment), a retrospective, non-randomized study was conducted by means of a review of patients diagnosed with insomnia (2008-2010). Patients in the intervention group (IG) received an integrative approach (hygiene measures, relaxation techniques, and herbal medicine), while the control group (CG) received conventional treatment. The resources used in the two groups (average monthly visits pre- and post-diagnosis), the type of prescribed drug therapy, and the total dose were compared. Sleep quality was evaluated at 18-24 months (Epworth test). A total of 48 patients were included in the IG and 47 in the CG (70% women, mean age 46 years, SD: 14.3). The average number of monthly visits pre-diagnosis was 0.54 (SD: 0.42) in the IG and 0.53 (SD: 0.53) in the CG (P=.88). Post-diagnosis it was 0.36 (SD: 0.24) and 0.65 (SD: 0.46), respectively (P<.0001), a statistically significant reduction in the IG. More than half (52.5%) of the IG patients and 93.6% of the CG had received a benzodiazepine (P<.0001). Alprazolam and lorazepam were the most prescribed in the CG, and with the higher cumulative dose. At the subsequent evaluation, 17% of patients in the IG and 5% in the CG no longer had insomnia. Severe insomnia was present in 13% of patients in the IG and none in the CG (P<.0001). The integrative approach to insomnia may be worthwhile as it reduces resource use and side effects, as well as dependence on benzodiazepines. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.

  15. Quality Oncology Practice Initiative Certification Program: measuring implementation of chemotherapy administration safety standards in the outpatient oncology setting.

    Science.gov (United States)

    Gilmore, Terry R; Schulmeister, Lisa; Jacobson, Joseph O

    2013-03-01

    The Quality Oncology Practice Initiative (QOPI) Certification Program (QCP) evaluates individual outpatient oncology practice performance in areas that affect patient care and safety and builds on the American Society of Clinical Oncology (ASCO) QOPI by assessing the compliance of a practice with certification standards based on the ASCO/Oncology Nursing Society standards for safe chemotherapy administration. To become certified, a practice must attain a benchmark quality score on certification measures in QOPI and attest that it complies with 17 QCP standards. Structured on-site reviews, initially performed in randomly selected practices, became mandatory beginning in September 2011. Of 111 practices that have undergone on-site review, only two were fully concordant with all of the standards (median, 11; range, seven to 17). Most practices were subsequently able to modify practice to become QOPI certified. The QCP addresses the call from the Institute of Medicine to close the quality gap by aligning evidence-based guidelines and consensus-driven standards with requirements for oncology practices to develop and maintain structural safety components, such as policies and procedures that ensure practice performance. On-site practice evaluation is a high-impact component of the program.

  16. SPSS programs for the measurement of nonindependence in standard dyadic designs.

    Science.gov (United States)

    Alferes, Valentim R; Kenny, David A

    2009-02-01

    Dyadic research is becoming more common in the social and behavioral sciences. The most common dyadic design is one in which two persons are measured on the same set of variables. Very often, the first analysis of dyadic data is to determine the extent to which the responses of the two persons are correlated-that is, whether there is nonindependence in the data. We describe two user-friendly SPSS programs for measuring nonindependence of dyadic data. Both programs can be used for distinguishable and indistinguishable dyad members. Inter1.sps is appropriate for interval measures. Inter2.sps applies to categorical variables. The SPSS syntax and data files related to this article may be downloaded as supplemental materials from brm.psychonomic-journals.org/content/supplemental.
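The two analyses the programs automate, correlating partners' scores across dyads, can be sketched in Python. This is a hedged re-implementation of the general technique, not the authors' SPSS code, and the dyad scores are invented.

```python
import numpy as np

# Hypothetical: 10 distinguishable dyads (e.g., two roles) scored on one interval variable.
partner1 = np.array([4.0, 5.5, 3.2, 6.1, 4.8, 5.0, 3.9, 6.3, 4.4, 5.7])
partner2 = np.array([3.8, 5.9, 3.5, 5.8, 4.6, 5.2, 4.1, 6.0, 4.7, 5.4])

# Distinguishable dyad members: nonindependence is the Pearson correlation across dyads.
r = np.corrcoef(partner1, partner2)[0, 1]
print(f"Pearson r = {r:.3f}")

# Indistinguishable dyad members: use the one-way ANOVA intraclass correlation,
# which ignores who is "partner 1" within each dyad.
scores = np.stack([partner1, partner2], axis=1)          # shape (n_dyads, 2)
dyad_means = scores.mean(axis=1)
grand = scores.mean()
msb = 2 * np.sum((dyad_means - grand) ** 2) / (len(scores) - 1)   # between-dyad MS
msw = np.sum((scores - dyad_means[:, None]) ** 2) / len(scores)   # within-dyad MS
icc = (msb - msw) / (msb + msw)
print(f"ICC = {icc:.3f}")
```

A substantial positive r or ICC indicates nonindependence, meaning the dyad (not the individual) should be treated as the unit of analysis.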

  17. Socioeconomic Inequalities in Non-Communicable Diseases Prevalence in India: Disparities between Self-Reported Diagnoses and Standardized Measures

    Science.gov (United States)

    Vellakkal, Sukumar; Subramanian, S. V.; Millett, Christopher; Basu, Sanjay; Stuckler, David; Ebrahim, Shah

    2013-01-01

Background Whether non-communicable diseases (NCDs) are diseases of poverty or affluence in low- and middle-income countries has been vigorously debated. Most analyses of NCDs have used self-reported data, which is biased by differential access to healthcare services between groups of different socioeconomic status (SES). We sought to compare self-reported diagnoses versus standardised measures of NCD prevalence across SES groups in India. Methods We calculated age-adjusted prevalence rates of common NCDs from the Study on Global Ageing and Adult Health, a nationally representative cross-sectional survey, and compared self-reported diagnoses to standardised measures of disease for five NCDs. We calculated wealth-related and education-related disparities in NCD prevalence using the concentration index (C), which ranges from −1 to +1 (concentration of disease among lower and higher SES groups, respectively). Findings NCD prevalence was higher for standardised measures (range 5.2 to 19.1%) than for self-reported diagnoses (range 3.1 to 9.4%). Several NCDs appeared concentrated among higher SES groups according to self-reported diagnoses (C_srd) but were concentrated among lower SES groups, or showed no strong socioeconomic gradient, using standardised measures (C_sm): age-standardised wealth-related C for angina, C_srd 0.02 vs. C_sm −0.17; asthma and lung diseases, C_srd −0.05 vs. C_sm −0.04 (age-standardised education-related C_srd 0.04 vs. C_sm −0.05); vision problems, C_srd 0.07 vs. C_sm −0.05; depression, C_srd 0.07 vs. C_sm −0.13. Consistent with standardised measures detecting more cases among low-SES groups, the concentration of hypertension among higher SES declined (C_srd 0.19 vs. C_sm 0.03). Conclusions The socio-economic patterning of NCD prevalence differs markedly when assessed by standardised criteria versus self-reported diagnoses. NCDs in India are not necessarily diseases of affluence but also of poverty, indicating likely under-diagnosis and
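A minimal sketch of the concentration index used above, based on the standard covariance formula C = 2·cov(h, r)/mean(h), where r is each person's fractional rank by SES (poorest first). The data are invented, and the paper's exact estimator may differ in details such as weighting.

```python
import numpy as np

def concentration_index(health, ses):
    """Concentration index: C = 2*cov(h, r)/mean(h), with r the fractional
    SES rank (poorest first). Positive C = disease concentrated among the rich."""
    order = np.argsort(ses)                      # sort individuals poorest -> richest
    h = np.asarray(health, dtype=float)[order]
    n = len(h)
    r = (np.arange(1, n + 1) - 0.5) / n          # fractional SES rank in (0, 1)
    return 2 * np.cov(h, r, bias=True)[0, 1] / h.mean()

# Hypothetical population of 100 people ranked by wealth:
wealth = np.arange(100)
disease_rich = (wealth > 70).astype(float)       # disease only among the richest
disease_poor = (wealth < 30).astype(float)       # disease only among the poorest
c_rich = concentration_index(disease_rich, wealth)
c_poor = concentration_index(disease_poor, wealth)
print(c_rich, c_poor)   # positive, then negative
```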

  18. Standard practice for calibration of torque-measuring instruments for verifying the torque indication of torque testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

1.1 This practice specifies the procedure for the calibration of elastic torque-measuring instruments. Note 1—Verification by deadweight and a lever arm is an acceptable method of verifying the torque indication of a torque testing machine. Tolerances for weights used are tabulated in Practice WK6364; methods for calibration of the weights are given in NIST Technical Note 577, Methods of Calibrating Weights for Piston Gages. 1.2 The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems may result in non-conformance with the standard. 1.3 This practice is intended for the calibration of static or quasi-static torque-measuring instruments. The practice is not applicable to high-speed torque calibrations or measurements. 1.4 This standard does not purport to address all of the safety concerns, if any,...

  19. Normalization of cortical thickness measurements across different T1 magnetic resonance imaging protocols by novel W-Score standardization.

    Science.gov (United States)

    Chung, Jinyong; Yoo, Kwangsun; Lee, Peter; Kim, Chan Mi; Roh, Jee Hoon; Park, Ji Eun; Kim, Sang Joon; Seo, Sang Won; Shin, Jeong-Hyeon; Seong, Joon-Kyung; Jeong, Yong

    2017-10-01

    The use of different 3D T1-weighted magnetic resonance (T1 MR) imaging protocols induces image incompatibility across multicenter studies, negating the many advantages of multicenter studies. A few methods have been developed to address this problem, but significant image incompatibility still remains. Thus, we developed a novel and convenient method to improve image compatibility. W-score standardization creates quality reference values by using a healthy group to obtain normalized disease values. We developed a protocol-specific w-score standardization to control the protocol effect, which is applied to each protocol separately. We used three data sets. In dataset 1, brain T1 MR images of normal controls (NC) and patients with Alzheimer's disease (AD) from two centers, acquired with different T1 MR protocols, were used (Protocol 1 and 2, n = 45/group). In dataset 2, data from six subjects, who underwent MRI with two different protocols (Protocol 1 and 2), were used with different repetition times, echo times, and slice thicknesses. In dataset 3, T1 MR images from a large number of healthy normal controls (Protocol 1: n = 148, Protocol 2: n = 343) were collected for w-score standardization. The protocol effect and disease effect on subjects' cortical thickness were analyzed before and after the application of protocol-specific w-score standardization. As expected, different protocols resulted in differing cortical thickness measurements in both NC and AD subjects. Different measurements were obtained for the same subject when imaged with different protocols. Multivariate pattern difference between measurements was observed between the protocols. Classification accuracy between two protocols was nearly 90%. After applying protocol-specific w-score standardization, the differences between the protocols substantially decreased. Most importantly, protocol-specific w-score standardization reduced both univariate and multivariate differences in the images while
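The protocol-specific w-score idea can be sketched as follows, assuming a simple linear age model fitted within each protocol's healthy controls. This is a hedged simplification with invented data; the published method may include additional covariates beyond age.

```python
import numpy as np

def w_scores(ct_patients, age_patients, ct_controls, age_controls):
    """W-score: standardize patient cortical thickness against a regression
    on age fitted in healthy controls scanned with the SAME protocol."""
    slope, intercept = np.polyfit(age_controls, ct_controls, 1)  # thickness ~ age
    residual_sd = np.std(ct_controls - (slope * age_controls + intercept))
    expected = slope * age_patients + intercept
    return (ct_patients - expected) / residual_sd

# Hypothetical protocol-1 controls: thickness declines ~0.01 mm/year with noise
rng = np.random.default_rng(0)
age_c = rng.uniform(55, 85, 200)
ct_c = 3.0 - 0.01 * (age_c - 55) + rng.normal(0, 0.05, 200)

# Two hypothetical patients scanned with the same protocol
age_p = np.array([60.0, 75.0])
ct_p = np.array([2.80, 2.60])
ws = w_scores(ct_p, age_p, ct_c, age_c)
print(ws)   # strongly negative: thinner than protocol-matched controls predict
```

Because the reference regression is fitted per protocol, the resulting w-scores are comparable across protocols even when raw thickness measurements are not.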

  20. Development of Objective Standard Setting Using Rasch Measurement Model in Malaysian Institution of Higher Learning

    Science.gov (United States)

    Khatimin, Nuraini; Aziz, Azrilah Abdul; Zaharim, Azami; Yasin, Siti Hanani Mat

    2013-01-01

Measurement and evaluation of students' achievement are important for ensuring that students really understand the course content and for monitoring their achievement level. Performance is reflected not only in the number of high achievers but also in the quality of the grades obtained; does the grade "A" truly…

  1. Method for Estimating Evaporative Potential (IM/CLO) from ASTM Standard Single Wind Velocity Measures

    Science.gov (United States)

    2016-08-10

Sweating thermal manikins have long been used to provide biophysical measures of clothing and equipment worn by the... Potter AW, Gonzalez JA, and Xu X. Ebola response: Modeling the risk of heat stress from personal protective clothing. PLoS ONE 10(11), e0143461, 2015.

  2. Time, tire measurements forces and moments: a new standard for steady state cornering tyre testing

    NARCIS (Netherlands)

    Oosten, J.J.M. van; Savi, C.; Augustin, M.; Bouhet, O.; Sommer, J.; Colinot, J.P.

    1999-01-01

    In order to develop vehicles which have maximum active safety, car manufacturers need information about the so-called force and moment properties of tyres. Vehicle manufacturers, tyre suppliers and automotive research organisations have advanced test equipment to measure the forces between a tyre

  3. 7 CFR 1755.400 - RUS standard for acceptance tests and measurements of telecommunications plant.

    Science.gov (United States)

    2010-01-01

    ... telecommunications plant. 1755.400 Section 1755.400 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE TELECOMMUNICATIONS POLICIES ON SPECIFICATIONS... measurements of telecommunications plant. Sections 1755.400 through 1755.407 cover the requirements for...

  4. Where does the standard application end and where does the solution made-to measure begin?

    International Nuclear Information System (INIS)

    Chovan, P.

    2004-01-01

The aim of this presentation is to explain the necessity of made-to-measure solutions and their extent within enterprise information systems development. The author presents the approach of DELTA E S, Plc to these questions and explains the possible risks and benefits of such tailor-made solutions. The presentation is supported by experiences and results from completed and active projects.

  5. System Energy Assessment (SEA), Defining a Standard Measure of EROI for Energy Businesses as Whole Systems

    Directory of Open Access Journals (Sweden)

    Jay Zarnikau

    2011-10-01

A more objective method for measuring the energy needs of businesses, System Energy Assessment (SEA) measures the combined impacts of material supply chains and service supply chains, to assess businesses as whole self-managing net-energy systems. The method is demonstrated using a model wind farm, and defines a physical measure of energy productivity for society (EROI-S), a ratio of total energy delivered to total energy expended. Energy-use records for technology and proxy measures for clearly understood but not individually recorded energy uses for services are combined for a whole-system estimate of the consumption required for production. Current methods count only the energy needs of technology. Business services outsource their own energy needs to operate, leaving no traceable record. That uncounted business energy demand is often 80% of the total, an amount of “dark energy” hidden from view, discovered by finding that the average estimated energy needs of businesses fall far below the world-average energy consumed per dollar of GDP. Presently, for lack of information, the energy needs of business services are counted as “0”; our default assumption is to treat them as “average”. The result is a hard measure of total business demand for energy services, a “Scope 4” energy use or GHG impact assessment. Counting recorded energy uses and discounting unrecorded ones misrepresents labor-intensive work as highly energy efficient. The result confirms a similar finding by Hall et al. in 1981 [1]. We use exhaustive search for what a business needs to operate as a whole, tracing internal business relationships rather than energy data, to locate its natural physical boundary as a working unit, and so define a business as a physical rather than statistical subject of scientific study. See also online resource materials and notes [2].
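The accounting idea, counting outsourced service energy at average energy intensity alongside recorded technology energy, can be sketched as follows. All numbers, including the MJ-per-dollar intensity and the business figures, are hypothetical and chosen only to reproduce the 80% service-share pattern the abstract describes.

```python
# Hypothetical sketch of the SEA "Scope 4" idea: services with no energy
# records are counted at an assumed economy-average energy intensity.
ENERGY_PER_DOLLAR_MJ = 8.0      # assumed world-average energy intensity, MJ/$

def eroi_s(energy_delivered_mj, tech_energy_mj, service_spend_dollars):
    """EROI-S = total energy delivered / total energy expended, where
    service energy is estimated from spending at average intensity."""
    service_energy_mj = service_spend_dollars * ENERGY_PER_DOLLAR_MJ
    total_expended = tech_energy_mj + service_energy_mj
    return energy_delivered_mj / total_expended, service_energy_mj / total_expended

eroi, dark_share = eroi_s(energy_delivered_mj=1.0e9,
                          tech_energy_mj=2.0e7,
                          service_spend_dollars=1.0e7)
print(f"EROI-S = {eroi:.1f}, service ('dark') share = {dark_share:.0%}")
# → EROI-S = 10.0, service ('dark') share = 80%
```

Setting the service term to zero instead, as the abstract says current methods implicitly do, would inflate this hypothetical EROI from 10 to 50, which is the distortion SEA is designed to expose.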

  6. Relationship between alveolar bone measured by 125I absorptiometry with analysis of standardized radiographs: 2. Bjorn technique

    International Nuclear Information System (INIS)

    Ortman, L.F.; McHenry, K.; Hausmann, E.

    1982-01-01

    The Bjorn technique is widely used in periodontal studies as a standardized measure of alveolar bone. Recent studies have demonstrated the feasibility of using 125I absorptiometry to measure bone mass. The purpose of this study was to compare 125I absorptiometry with the Bjorn technique in detecting small sequential losses of alveolar bone. Four periodontal-like defects of incrementally increasing size were produced in alveolar bone in the posterior segment of the maxilla of a human skull. An attempt was made to sequentially reduce the amount of bone in 10% increments until no bone remained, a through-and-through defect. The bone remaining at each step was measured using 125I absorptiometry. At each site the 125I absorptiometry measurements were made at the same location by fixing the photon source to a prefabricated precision-made occlusal splint. This site was just beneath the crest and midway between the borders of two adjacent teeth. Bone loss was also determined by the Bjorn technique: standardized intraoral films were taken using a custom-fitted acrylic clutch, and bone measurements were made from the root apex to the coronal height of the lamina dura. A comparison of the data indicates that: (1) in early bone loss, less than 30%, the Bjorn technique underestimates the amount of loss, and (2) in advanced bone loss, more than 60%, it overestimates the loss.

  7. Development of departmental standard for traceability of measured activity for I-131 therapy capsules used in nuclear medicine

    Directory of Open Access Journals (Sweden)

    Ravichandran Ramamoorthy

    2011-01-01

    Full Text Available International Basic Safety Standards (International Atomic Energy Agency, IAEA) provide guidance levels for diagnostic procedures in nuclear medicine, indicating the maximum usual activity for various diagnostic tests in terms of activities of injected radioactive formulations. An accuracy of ±10% in the activities of administered radiopharmaceuticals is recommended for the expected outcome of diagnostic and therapeutic nuclear medicine procedures. It is recommended that the long-term stability of isotope calibrators used in nuclear medicine be checked periodically using a long-lived check source, such as Cs-137, of suitable activity. In view of the unavailability of such a radioactive source, we developed methods to maintain the traceability of these instruments for certifying measured activities for human use. Two re-entrant chambers [HDR 1000 and Selectron Source Dosimetry System (SSDS)] with I-125 and Ir-192 calibration factors in the Department of Radiotherapy were used to measure Iodine-131 (I-131) therapy capsules, to establish traceability to the Mark V isotope calibrator of the Department of Nuclear Medicine. Special nylon jigs were fabricated to keep the I-131 capsule holder in position. Measured activities in all the chambers showed good agreement. The accuracy of the SSDS chamber in measuring Ir-192 activities over the last 5 years was within 0.5%, validating its role as a departmental standard for measuring activity. This method is suitable because the mean gamma energies of I-131 and Ir-192 are comparable.

  8. The PROMIS Physical Function item bank was calibrated to a standardized metric and shown to improve measurement efficiency.

    Science.gov (United States)

    Rose, Matthias; Bjorner, Jakob B; Gandek, Barbara; Bruce, Bonnie; Fries, James F; Ware, John E

    2014-05-01

    To document the development and psychometric evaluation of the Patient-Reported Outcomes Measurement Information System (PROMIS) Physical Function (PF) item bank and static instruments. The items were evaluated using qualitative and quantitative methods. A total of 16,065 adults answered item subsets (n>2,200/item) on the Internet, with oversampling of the chronically ill. Classical test and item response theory methods were used to evaluate 149 PROMIS PF items plus 10 Short Form-36 and 20 Health Assessment Questionnaire-Disability Index items. A graded response model was used to estimate item parameters, which were normed to a mean of 50 (standard deviation [SD]=10) in a US general population sample. The final bank consists of 124 PROMIS items covering upper, central, and lower extremity functions and instrumental activities of daily living. In simulations, a 10-item computerized adaptive test (CAT) eliminated floor and decreased ceiling effects, achieving higher measurement precision than any comparable-length static tool across four SDs of the measurement range. The improved psychometric properties translated into a superior ability of the CAT to identify differences between age and disease groups. The item bank provides a common metric and can improve the measurement of PF by facilitating the standardization of patient-reported outcome measures and implementation of CATs for more efficient PF assessments over a larger range. Copyright © 2014. Published by Elsevier Inc.
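    The norming described above (general-population mean 50, SD 10) is the standard T-score transform. A sketch, assuming a theta estimate expressed relative to general-population parameters:

```python
def t_score(theta, pop_mean=0.0, pop_sd=1.0):
    """Map an IRT theta estimate onto a PROMIS-style metric with a
    general-population mean of 50 and standard deviation of 10."""
    z = (theta - pop_mean) / pop_sd
    return 50.0 + 10.0 * z
```

    For example, a respondent one population SD above the mean scores 60 on this metric.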

  9. [Poverty and Health: The Living Standard Approach as a Supplementary Concept to Measure Relative Poverty. Results from the German Socio-Economic Panel (GSOEP 2011)].

    Science.gov (United States)

    Pförtner, T-K

    2016-06-01

    A common indicator for the measurement of relative poverty is the disposable income of a household. Current research introduces the living standard approach as an alternative concept for describing and measuring relative poverty. This study compares both approaches with regard to the subjective health status of the German population and provides theoretical implications for the use of the income and living standard approaches in health research. Analyses are based on the German Socio-Economic Panel (GSOEP) from the year 2011, which includes 12,290 private households and 21,106 survey members. Self-rated health was based on a subjective assessment of general health status. Income poverty is based on the equivalised disposable income, applying a threshold of 60% of the median-based average income. A person is denoted as deprived (inadequate living standard) if 3 or more of 11 living standard items are lacking for financial reasons. To assess the discriminative power of both poverty indicators, descriptive analyses and stepwise logistic regression models were applied separately for men and women, adjusted for age, residence, nationality, educational level, occupational status, and marital status. The results of the stepwise regression revealed a stronger poverty-health relationship for the living standard indicator. After adjusting for all control variables and the respective poverty indicator, income poverty was not statistically significantly associated with poor subjective health status among men (OR Men: 1.33; 95% CI: 1.00-1.77) or women (OR Women: 0.98; 95% CI: 0.78-1.22). In contrast, the association between deprivation and subjective health status was statistically significant for men (OR Men: 2.00; 95% CI: 1.57-2.52) and women (OR Women: 2.11; 95% CI: 1.76-2.64). The results of the present study indicate that the income and standard of living approaches measure different dimensions of poverty. In comparison to the income approach, the living
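    The two poverty indicators compared above follow directly from their definitions (below 60% of median equivalised income; 3 or more of 11 living standard items lacking for financial reasons):

```python
def is_income_poor(equivalised_income, median_income):
    """Relative income poverty: below 60% of the median equivalised income."""
    return equivalised_income < 0.6 * median_income

def is_deprived(n_items_lacking, threshold=3):
    """Living-standard deprivation: lacking >= 3 of the 11 items
    for financial reasons."""
    return n_items_lacking >= threshold
```

    A household can fall on either side of each indicator independently, which is why the two approaches can capture different dimensions of poverty.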

  10. Development of a direct measurement system for the standardization of neutron emission rates

    International Nuclear Information System (INIS)

    Ogheard, Florestan

    2012-01-01

    The manganese bath technique is the reference method for calibrating neutron source emission rates. It is used to calibrate radionuclide neutron sources (AmBe, PuBe, 252Cf, ...) in terms of neutron emission rate into 4π sr. As a complement to this technique, the anisotropy of the source is measured using a rotating source holder and a neutron long counter. The neutron source to be measured is immersed in a manganese sulphate solution, whereby the emitted neutrons are captured within the bath contents. In a typical configuration (a 1 m diameter sphere and a concentrated solution), approximately half of the neutrons lead to the creation of 56Mn via the 55Mn(n,γ) capture reaction. The 56Mn radionuclide has a half-life of approximately 2.6 hours, and the bath reaches saturation when the number of nuclei decaying per unit time equals the number of nuclei created per unit time. The neutron emission rate from the source can then be deduced from the 56Mn activity at saturation, assuming proper modelling of the nuclear reactions occurring in the bath. The manganese bath facility at LNE-LNHB has recently been refurbished to comply with appropriate safety and radioprotection regulations. This has led to the upgrading of both the measurement methodology and the modelling of the bath, and a study on the development of a new detector for the on-line measurement of the manganese activity was started. This new detector uses the β-γ coincidence measurement method: the beta channel consists of two photomultiplier tubes which detect Cerenkov light, and the gamma channel uses a solid scintillation detector. The advantage of this measurement method is that it determines the bath activity without any prior calibration, unlike the former method, which used a gamma-ray detector calibrated with a high-activity manganese source. The principle of the Cerenkov-gamma coincidence measurement has been validated by a prototype of the detector and
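    The saturation argument above can be sketched numerically. The 56Mn half-life and the roughly 50% capture fraction come from the abstract; the activity values are illustrative:

```python
import math

MN56_HALF_LIFE_H = 2.58  # hours, "approximately 2.6 hours" per the abstract

def mn56_activity(t_hours, a_sat):
    """56Mn activity build-up toward saturation:
    A(t) = A_sat * (1 - exp(-lambda * t))."""
    lam = math.log(2) / MN56_HALF_LIFE_H
    return a_sat * (1.0 - math.exp(-lam * t_hours))

def neutron_emission_rate(a_sat_bq, capture_fraction=0.5):
    """At saturation the 56Mn decay rate equals the capture rate f*B,
    so the source emission rate is B = A_sat / f."""
    return a_sat_bq / capture_fraction
```

    After one half-life the bath has reached half its saturation activity; in practice the capture fraction f is obtained from detailed modelling of the bath, not a fixed 0.5.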

  11. A Standardized Method for Measuring Internal Lung Surface Area via Mouse Pneumonectomy and Prosthesis Implantation.

    Science.gov (United States)

    Liu, Zhe; Fu, Siling; Tang, Nan

    2017-07-26

    Pulmonary morphology, physiology, and respiratory functions change in both physiological and pathological conditions. Internal lung surface area (ISA), representing the gas-exchange capacity of the lung, is a critical criterion for assessing respiratory function. However, observer bias can significantly influence measured values for lung morphological parameters. The protocol that we describe here minimizes variation during measurement of the two morphological parameters used for ISA calculation: internal lung volume (ILV) and mean linear intercept (MLI). Using ISA as a morphometric and functional parameter to determine the outcome of alveolar regeneration in both pneumonectomy (PNX) and prosthesis implantation mouse models, we found that the increase in ISA following PNX treatment was significantly blocked by implantation of a prosthesis into the thoracic cavity [1]. The ability to accurately quantify ISA is expected not only to improve the reliability and reproducibility of lung function studies in injury-induced alveolar regeneration models, but also to promote mechanistic discoveries in multiple pulmonary diseases.
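    The ISA calculation from ILV and MLI is conventionally the stereological relation S = 4V/Lm; this sketch assumes that formula (the published protocol may differ in detail):

```python
def internal_surface_area_mm2(ilv_mm3, mli_um):
    """Stereological estimate of internal lung surface area:
    ISA = 4 * ILV / MLI, with MLI converted from micrometres
    to millimetres so the result is in mm^2."""
    mli_mm = mli_um / 1000.0
    return 4.0 * ilv_mm3 / mli_mm
```

    For example, an ILV of 1000 mm³ with an MLI of 40 µm gives an ISA of 1.0 × 10⁵ mm²; a larger MLI (bigger airspaces) yields a smaller surface area for the same volume.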

  12. Prospects for standard model measurements at the High-Luminosity LHC with CMS

    CERN Document Server

    Rezaei Hosseinabadi, Ferdos

    2017-01-01

    Prospects for standard model measurements at the High-Luminosity LHC (HL-LHC) with the upgraded CMS detector are presented. These studies investigate the potential of the upgraded CMS detector at the HL-LHC for precision measurements of the top quark mass and for a search for flavour-changing neutral currents in single top quark production. An extrapolation to proton-proton collisions at a centre-of-mass energy of 14 TeV under HL-LHC run conditions is performed, assuming an integrated luminosity of up to 3 $\mathrm{ab}^{-1}$.

  13. On the measurement of movement difficulty in the standard approach to Fitts' law.

    Directory of Open Access Journals (Sweden)

    Yves Guiard

    Full Text Available Fitts' law is an empirical rule of thumb which predicts the time it takes people, under time pressure, to reach with some pointer a target of width W located at a distance D. It has been traditionally assumed that the predictor of movement time must be some mathematical transform of the quotient D/W, called the index of difficulty (ID of the movement task. We ask about the scale of measurement involved in this independent variable. We show that because there is no such thing as a zero-difficulty movement, the IDs of the literature run on non-ratio scales of measurement. One notable consequence is that, contrary to a widespread belief, the value of the y-intercept of Fitts' law is uninterpretable. To improve the traditional Fitts paradigm, we suggest grounding difficulty on relative target tolerance W/D, which has a physical zero, unlike relative target distance D/W. If no one can explain what is meant by a zero-difficulty movement task, everyone can understand what is meant by a target layout whose relative tolerance W/D is zero, and hence whose relative intolerance 1-W/D is 1 or 100%. We use the data of Fitts' famous tapping experiment to illustrate these points. Beyond the scale of measurement issue, there is reason to doubt that task difficulty is the right object to try to measure in basic research on Fitts' law, target layout manipulations having never provided users of the traditional Fitts paradigm with satisfactory control over the variations of the speed and accuracy of movements. We advocate the trade-off paradigm, a recently proposed alternative, which is immune to this criticism.
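    The two difficulty scales contrasted above, the conventional (Shannon) index of difficulty and the zero-grounded relative intolerance, can be sketched as:

```python
import math

def shannon_id(d, w):
    """Conventional (Shannon) index of difficulty, ID = log2(D/W + 1), in bits.
    It has no natural zero: D/W stays positive for any reachable target."""
    return math.log2(d / w + 1.0)

def relative_intolerance(d, w):
    """Zero-grounded alternative, 1 - W/D: zero when the tolerance band
    spans the whole distance, approaching 1 as targets shrink."""
    return 1.0 - w / d
```

    With D = 7 and W = 1 the Shannon ID is exactly 3 bits, while relative intolerance runs on a ratio scale with a meaningful zero at W = D.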

  14. Flow Control and Measurement in Electric Propulsion Systems: Towards an AIAA Reference Standard

    Science.gov (United States)

    2013-10-01

    The manometer is connected to PID control circuitry, which compares the measured pressure to the desired set-point and then varies the valve position… briefly discussed. While most flight systems have employed a type of pressure-fed flow restrictor for flow control, both thermal-based and pressure… Nomenclature: p = pressure, Pa; P = power, W; R = universal gas constant, 8.314 kJ/kmol·K; Q = thermal energy, J; Q̇ = heat flow rate, W; t = time, s; T = …

  15. Standard error of measurement of 5 health utility indexes across the range of health for use in estimating reliability and responsiveness.

    Science.gov (United States)

    Palta, Mari; Chen, Han-Yang; Kaplan, Robert M; Feeny, David; Cherepanov, Dasha; Fryback, Dennis G

    2011-01-01

    Standard errors of measurement (SEMs) of health-related quality-of-life (HRQoL) indexes are not well characterized. The SEM is needed to estimate responsiveness statistics and is a component of reliability. The objective was to estimate the SEM of 5 HRQoL indexes. The National Health Measurement Study (NHMS) was a population-based survey; the Clinical Outcomes and Measurement of Health Study (COMHS) provided repeated measures. Participants were 3844 randomly selected adults from the noninstitutionalized population aged 35 to 89 y in the contiguous United States and 265 cataract patients. The SF-6D (derived from the SF-36v2™), QWB-SA, EQ-5D, HUI2, and HUI3 were included. An item response theory approach captured joint variation in the indexes in a composite construct of health (theta). The authors estimated 1) the test-retest standard deviation (SEM-TR) from COMHS, 2) the structural standard deviation (SEM-S) around theta from NHMS, and 3) reliability coefficients. SEM-TR was 0.068 (SF-6D), 0.087 (QWB-SA), 0.093 (EQ-5D), 0.100 (HUI2), and 0.134 (HUI3), whereas SEM-S was 0.071, 0.094, 0.084, 0.074, and 0.117, respectively. These yield reliability coefficients of 0.66 (COMHS) and 0.71 (NHMS) for SF-6D, 0.59 and 0.64 for QWB-SA, 0.61 and 0.70 for EQ-5D, 0.64 and 0.80 for HUI2, and 0.75 and 0.77 for HUI3, respectively. The SEM varied across levels of health, especially for HUI2, HUI3, and EQ-5D, and was influenced by ceiling effects. Limitations: repeated measures were 5 months apart, and the estimated theta contained measurement error. The two types of SEM are similar and substantial for all the indexes, and vary across health.
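    The link between SEM and reliability used above follows the classical test theory relation reliability = 1 - (SEM/SD)². A sketch of that textbook approximation (the paper's own estimation is IRT-based, so this is only the first-order relation):

```python
def reliability_from_sem(sem, sd):
    """Classical test theory: reliability = 1 - (SEM / SD)^2,
    where SD is the score standard deviation in the population."""
    return 1.0 - (sem / sd) ** 2

def sem_from_reliability(reliability, sd):
    """Inverse relation: SEM = SD * sqrt(1 - reliability)."""
    return sd * (1.0 - reliability) ** 0.5
```

    For instance, an SEM half the size of the score SD implies a reliability of 0.75, which is in the range reported for these utility indexes.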

  16. Standardization of the measurement of dihydrotestosterone by radioimmunoassay after celite column chromatography

    International Nuclear Information System (INIS)

    Lando, V.S.; Mendonca, B.B. de; Nicolau, W.

    1993-01-01

    We developed a method for measuring dihydrotestosterone after prior chromatography. One ml of serum containing 1000 cpm of tritiated dihydrotestosterone was extracted with hexane:ethyl acetate (2:3), dried, diluted with nonsaturated isooctane, and injected into a column previously washed with 3.5 ml of pure isooctane. The serum was eluted from the column with pure isooctane (3.5 ml) followed by 5% ethyl acetate in isooctane. The recovery of tritiated dihydrotestosterone was between 50% and 80% in all assays. This study measured dihydrotestosterone, testosterone, and the testosterone/dihydrotestosterone ratio in the following groups. Group I: forty-one normal adult subjects in basal conditions, with chronological age varying from 19 to 40 years. Group II: six normal adult subjects with chronological age between 21 and 33 years, evaluated in basal conditions and after stimulus with 6000 IU of human chorionic gonadotropin. Group III: seven prepubertal children with chronological age varying from 2 to 7 years. Group IV: eight patients with chronological age varying from 13 to 33 years with male pseudohermaphroditism due to 5-alpha-reductase deficiency, in basal conditions. The study concludes that the measurement of dihydrotestosterone after celite column chromatography is an affordable, easy-to-perform method that was able to identify patients with male pseudohermaphroditism due to 5-alpha-reductase deficiency. (author)

  17. Clinical balance assessment: perceptions of commonly-used standardized measures and current practices among physiotherapists in Ontario, Canada.

    Science.gov (United States)

    Sibley, Kathryn M; Straus, Sharon E; Inness, Elizabeth L; Salbach, Nancy M; Jaglal, Susan B

    2013-03-20

    Balance impairment is common in multiple clinical populations, and comprehensive assessment is important for identifying impairments, planning individualized treatment programs, and evaluating change over time. However, little information is available regarding whether clinicians who treat balance are satisfied with existing assessment tools. In 2010 we conducted a cross-sectional survey of balance assessment practices among physiotherapists in Ontario, Canada, and reported on the use of standardized balance measures (Sibley et al. 2011 Physical Therapy; 91: 1583-91). The purpose of this study was to analyse additional survey data and i) evaluate satisfaction with current balance assessment practices and standardized measures among physiotherapists who treat adult or geriatric populations with balance impairment, and ii) identify factors associated with satisfaction. The questionnaire was distributed to 1000 practicing physiotherapists. This analysis focuses on questions in which respondents were asked to rate their general perceptions of balance assessment, the perceived utility of individual standardized balance measures, and whether, and why, they wanted to improve their balance assessment practices. Data were summarized with descriptive statistics, and the utility of individual measures was compared across clinical practice areas (orthopaedic, neurological, geriatric or general rehabilitation). The questionnaire was completed by 369 respondents, of whom 43.4% agreed that existing standardized measures of balance meet their needs. In ratings of individual measures, the Single Leg Stance test and Berg Balance Scale were perceived as useful for clinical decision-making and evaluating change over time by over 70% of respondents, and the Timed Up-and-Go test was perceived as useful for decision-making by 56.9% of respondents and useful for evaluating change over time by 62.9% of respondents, but there were significant differences across practice groups.
Seventy

  18. Tooth shade measurements under standard and nonstandard illumination and their agreement with skin color.

    Science.gov (United States)

    Al-Dwairi, Ziad; Shaweesh, Ashraf; Kamkarfar, Sohrab; Kamkarfar, Shahrzad; Borzabadi-Farahani, Ali; Lynch, Edward

    2014-01-01

    The purpose of this study was to examine the relationship between skin color (shade) and tooth shade under standard and nonstandard illumination sources. Four hundred Jordanian participants (200 males, 200 females, 20 to 50 years of age) were studied. Skin colors were assessed and categorized using the L'Oreal and Revlon foundation shade guides (light, medium, dark). The Vita Pan Classical Shade Guide (VPCSG; Vident) and digital Vita EasyShade Intraoral Dental Spectrophotometer (VESIDS; Vident) were used to select shades in the middle thirds of maxillary central incisors; tooth shades were classified into four categories (highest, high, medium, low). Significant gender differences were observed for skin colors (P = .000) and tooth shade guide systems (P = .001 and .050 for VPCSG and VESIDS, respectively). The observed agreement was 100% and 93% for skin and tooth shade guides, respectively. The corresponding kappa statistic values were 1.00 and 0.79, respectively (substantial agreement, P < .001). The observed agreement between skin color and tooth shades (VPCSG and VESIDS) was approximately 50%. The digital tooth shade guide system can be a satisfactory substitute for classical tooth shade guides and clinical shade matching. There was only moderate agreement between skin color and tooth shade.

  19. Measuring error rates in genomic perturbation screens: gold standards for human functional genomics.

    Science.gov (United States)

    Hart, Traver; Brown, Kevin R; Sircoulomb, Fabrice; Rottapel, Robert; Moffat, Jason

    2014-07-01

    Technological advancement has opened the door to systematic genetics in mammalian cells. Genome-scale loss-of-function screens can assay fitness defects induced by partial gene knockdown, using RNA interference, or complete gene knockout, using new CRISPR techniques. These screens can reveal the basic blueprint required for cellular proliferation. Moreover, comparing healthy to cancerous tissue can uncover genes that are essential only in the tumor; these genes are targets for the development of specific anticancer therapies. Unfortunately, progress in this field has been hampered by off-target effects of perturbation reagents and poorly quantified error rates in large-scale screens. To improve the quality of information derived from these screens, and to provide a framework for understanding the capabilities and limitations of CRISPR technology, we derive gold-standard reference sets of essential and nonessential genes, and provide a Bayesian classifier of gene essentiality that outperforms current methods on both RNAi and CRISPR screens. Our results indicate that CRISPR technology is more sensitive than RNAi and that both techniques have nontrivial false discovery rates that can be mitigated by rigorous analytical methods. © 2014 The Authors. Published under the terms of the CC BY 4.0 license.

  20. Application of a method to measure uranium enrichment without use of standards

    International Nuclear Information System (INIS)

    Saule, F.A.; Righetti, M.A.

    1998-01-01

    Full text: The determination of uranium enrichment at the many different stages present in a gaseous diffusion enrichment plant (diffusers, cisterns, deposits in pipes, drums with process residues) and in stored materials (plates of unirradiated fuel elements and recipients with uranium oxide), which involve various container geometries and physical properties, is very important for safeguards inspections. In this work a non-destructive analysis technique is tested to determine the uranium enrichment of different samples of uranium materials without the use of standards, for application in safeguards inspections. A hyperpure germanium detector with an efficiency of 20% was used to obtain the gamma spectrum of the samples. In each spectrum, the net area values corresponding to four lines of U-235 (at 143, 163, 186 and 205 keV) and three lines of U-238 (258, 766 and 1001 keV) were used; these values were analysed with two different methods. The comparison of the calculated and declared values showed a discrepancy of about 10%. (author)

  1. Delivery and Measurement of High-Value Care in Standardized Patient Encounters.

    Science.gov (United States)

    Baldwin, Jennifer DeLuca; Cox, Jaclyn; Wu, Zhao Helen; Kenny, Anne; Angus, Steven

    2017-10-01

    Residencies have incorporated high-value care (HVC) training to contain health care expenditures. Assessment methods for HVC curricula are limited. In our clinical skills laboratory, we evaluated the effectiveness of an HVC curriculum using standardized patients (SPs) to determine whether there is a correlation with performance in counseling, history and physical examination, HVC knowledge, and demographics. Through ambulatory cases, SPs evaluated postgraduate year 2 (PGY-2) residents using checklists to determine if they obtained the chief complaint, medical and social history, and focused physical examination, and conveyed information regarding patient management. Investigators scored knowledge-based questions on the need for imaging in low back pain, annual stress testing in coronary artery disease, and chest x-ray for gastroesophageal reflux disease. Univariate analysis was used to calculate the percentage distribution of residents' ordering of inappropriate tests. All 56 PGY-2 residents participated in the study and completed at least 2 of the 3 HVC cases. Analysis showed that 48% (27 of 56) ordered at least 1 inappropriate test. Residents who ordered unnecessary testing had similar performance in the history and physical examination as well as in knowledge of HVC. Inappropriate ordering was significantly associated with poorer performance in counseling (mean percentage counseling score of 68% versus 56% for those who ordered inappropriately); ordering did not differ significantly by demographics. Our evaluation of residents during SP encounters found a correlation between the use of inappropriate testing and lower counseling and communication skills.

  2. Numerical efficiency calibration of in vivo measurement systems. Monte Carlo simulations of in vivo measurement scenarios for the detection of incorporated radionuclides, including validation, analysis of efficiency-sensitive parameters and customized anthropomorphic voxel models

    International Nuclear Information System (INIS)

    Hegenbart, Lars

    2010-01-01

    Detector efficiency calibration for in vivo bioassay measurements is based on physical anthropomorphic phantoms that can be loaded with radionuclides of the suspected incorporation. Systematic errors in traditional calibration methods can cause considerable over- or underestimation of the incorporated activity and hence the absorbed dose in the human body. In this work, Monte Carlo methods for radiation transport problems are used. Virtual models of the in vivo measurement equipment used at the Institute of Radiation Research, including detectors and anthropomorphic phantoms, have been developed. Software tools have been coded to handle memory-intensive human models for the visualization, preparation and evaluation of simulations of in vivo measurement scenarios. The tools, methods, and models used have been validated. Various parameters have been investigated for their sensitivity on the detector efficiency, to identify and quantify possible systematic errors. Measures have been implemented to improve the determination of the detector efficiency for routine application in the institute's in vivo measurement laboratory. A positioning system has been designed and installed in the Partial Body Counter measurement chamber to measure the relative position of the detector to the test person, which had been identified as a sensitive parameter. A computer cluster has been set up to facilitate the Monte Carlo simulations and reduce computing time. Methods based on image registration techniques have been developed to transform existing human models to match an individual test person. The measures and methods developed have successfully improved on the classic detector efficiency calibration methods. (orig.)
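    As a toy illustration of the Monte Carlo approach (not the thesis' actual codes, detectors, or phantoms), the purely geometric efficiency of an on-axis disk detector can be estimated by sampling isotropic emission directions:

```python
import math
import random

def geometric_efficiency(d_cm, r_cm, n=200_000, seed=1):
    """Toy Monte Carlo sketch: fraction of isotropically emitted particles
    from an on-axis point source whose straight-line path crosses a coaxial
    disk detector of radius r_cm at distance d_cm. Geometry only; no photon
    interactions, scatter, or detector response are modelled."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        cos_t = rng.uniform(-1.0, 1.0)  # isotropic: cos(theta) uniform in [-1, 1]
        if cos_t <= 0.0:
            continue                     # emitted away from the detector plane
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        lateral = d_cm * sin_t / cos_t   # radial offset where the ray meets the plane
        if lateral <= r_cm:
            hits += 1
    return hits / n

# Analytic solid-angle fraction for comparison: (1 - d / sqrt(d^2 + r^2)) / 2
```

    A real calibration additionally transports photons through the phantom and detector materials; the sampling structure, however, is the same.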

  3. Do Standard Bibliometric Measures Correlate with Academic Rank of Full-Time Pediatric Dentistry Faculty Members?

    Science.gov (United States)

    Susarla, Harlyn K; Dhar, Vineet; Karimbux, Nadeem Y; Tinanoff, Norman

    2017-04-01

    The aim of this cross-sectional study was to assess the relationship between quantitative measures of research productivity and academic rank for full-time pediatric dentistry faculty members in accredited U.S. and Canadian residency programs. For each pediatric dentist in the study group, academic rank and bibliometric factors derived from publicly available databases were recorded. Academic ranks were lecturer/instructor, assistant professor, associate professor, and professor. Bibliometric factors were mean total number of publications, mean total number of citations, maximum number of citations for a single work, and h-index (a measure of the impact of publications: the largest number h such that h publications have at least h citations each). The study sample comprised 267 pediatric dentists: 4% were lecturers/instructors, 44% were assistant professors, 30% were associate professors, and 22% were professors. The mean number of publications for the sample was 15.4±27.8. The mean number of citations was 218.4±482.0. The mean h-index was 4.9±6.6. The h-index was strongly correlated with academic rank (r=0.60, p=0.001). For this sample, an h-index of ≥3 was identified as a threshold for promotion to associate professor, and an h-index of ≥6 was identified as a threshold for promotion to professor. The h-index was strongly correlated with the academic rank of these pediatric dental faculty members, suggesting that this index may be considered a measure for promotion, along with a faculty member's quality and quantity of research, teaching, service, and clinical activities.
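    The h-index definition used above can be computed directly from a list of per-paper citation counts; a minimal sketch:

```python
def h_index(citation_counts):
    """Largest h such that at least h publications have h or more
    citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A faculty member with papers cited [10, 8, 5, 4, 3] times has h = 4:
# four papers each have at least four citations, but not five with five.
```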

  4. Extended wavelength anisotropy resolved multidimensional emission spectroscopy (ARMES) measurements: better filters, validation standards, and Rayleigh scatter removal methods

    Science.gov (United States)

    Casamayou-Boucau, Yannick; Ryder, Alan G.

    2017-09-01

    Anisotropy resolved multidimensional emission spectroscopy (ARMES) provides valuable insights into multi-fluorophore proteins (Groza et al 2015 Anal. Chim. Acta 886 133-42). Fluorescence anisotropy adds to the multidimensional fluorescence dataset information about the physical size of the fluorophores and/or the rigidity of the surrounding micro-environment. The first ARMES studies used standard thin film polarizers (TFP) that had negligible transmission between 250 and 290 nm, preventing accurate measurement of intrinsic protein fluorescence from tyrosine and tryptophan. Replacing TFP with pairs of broadband wire grid polarizers enabled standard fluorescence spectrometers to accurately measure anisotropies between 250 and 300 nm, which was validated with solutions of perylene in the UV and Erythrosin B and Phloxine B in the visible. In all cases, anisotropies were accurate to better than ±1% when compared to literature measurements made with Glan Thompson or TFP polarizers. Better dual wire grid polarizer UV transmittance and the use of excitation-emission matrix measurements for ARMES required complete Rayleigh scatter elimination. This was achieved by chemometric modelling rather than classical interpolation, which enabled the acquisition of pure anisotropy patterns over wider spectral ranges. In combination, these three improvements permit the accurate implementation of ARMES for studying intrinsic protein fluorescence.
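    The anisotropy these polarizer pairs measure follows the standard two-channel relation with a G-factor correction; this sketch is the textbook formulation from general fluorescence practice, not code specific to this paper:

```python
def g_factor(i_hv, i_hh):
    """Instrument correction factor measured with horizontally polarized
    excitation: G = I_HV / I_HH."""
    return i_hv / i_hh

def anisotropy(i_vv, i_vh, g):
    """Steady-state fluorescence anisotropy:
    r = (I_VV - G * I_VH) / (I_VV + 2 * G * I_VH)."""
    return (i_vv - g * i_vh) / (i_vv + 2.0 * g * i_vh)
```

    With G = 1 and I_VV = 3·I_VH, r = 0.4, the theoretical maximum for parallel absorption and emission dipoles; accurate UV values of I_VV and I_VH are exactly what the improved polarizers enable.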

  5. Comparison between IEEE and CIGRE Thermal Behaviour Standards and Measured Temperature on a 132-kV Overhead Power Line

    Directory of Open Access Journals (Sweden)

    Alberto Arroyo

    2015-12-01

    This paper presents the steady and dynamic thermal balances of an overhead power line proposed by the CIGRE (Technical Brochure 601, 2014) and IEEE (Std. 738, 2012) standards. The temperatures estimated by the standards are compared with the averaged conductor temperature obtained every 8 min during a year. The conductor is a LA 280 Hawk type, used in a 132-kV overhead line. The steady and dynamic state comparison shows that the number of cases with deviations from the measured conductor temperature higher than 5 °C decreases from around 20% to 15% when the dynamic analysis is used. As some of the most critical variables are the magnitude and direction of the wind speed, the ambient temperature, and the solar radiation, their influence on the conductor temperature is studied. Both standards give similar results, with slight differences due to the different ways of calculating solar radiation and convection. Considering the wind, both standards estimate the conductor temperature better as the wind speed increases and the angle with the line approaches 90°. In addition, if the theoretical radiation is replaced by that measured with the pyranometer, the number of samples with deviations higher than 5 °C is reduced from around 15% to 5%.
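
    Both standards solve the same steady-state heat balance, joule plus solar heating against convective and radiative cooling, and differ mainly in how the cooling and solar terms are modelled. A toy bisection solver with deliberately crude placeholder cooling terms (the real IEEE Std. 738 and CIGRE TB 601 correlations for wind speed, wind angle, air properties and solar radiation are far more detailed):

```python
import math

def steady_state_temperature(current, r_ac, q_solar, t_ambient,
                             h_conv=15.0, emissivity=0.5, diameter=0.0218):
    """Solve I^2*R + q_solar = q_convection + q_radiation for conductor temperature."""
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
    surface = math.pi * diameter  # convecting/radiating surface per metre of line

    def heat_imbalance(tc):
        q_gen = current ** 2 * r_ac + q_solar          # W/m generated
        q_conv = h_conv * surface * (tc - t_ambient)   # W/m, crude convection term
        q_rad = (emissivity * sigma * surface
                 * ((tc + 273.15) ** 4 - (t_ambient + 273.15) ** 4))
        return q_gen - q_conv - q_rad

    lo, hi = t_ambient, t_ambient + 300.0
    for _ in range(60):  # bisect on the sign of the imbalance
        mid = 0.5 * (lo + hi)
        if heat_imbalance(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    Replacing the theoretical radiation with pyranometer data, as the paper does, amounts to supplying a measured value for the q_solar input.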

  6. Molecular Form Differences Between Prostate-Specific Antigen (PSA) Standards Create Quantitative Discordances in PSA ELISA Measurements

    Science.gov (United States)

    McJimpsey, Erica L.

    2016-02-01

    The prostate-specific antigen (PSA) assays currently employed for the detection of prostate cancer (PCa) lack the specificity needed to differentiate PCa from benign prostatic hyperplasia and have high false positive rates. The PSA calibrants used to create calibration curves in these assays are typically purified from seminal plasma and contain many molecular forms (intact PSA and cleaved subforms). The purpose of this study was to determine if the composition of the PSA molecular forms found in these PSA standards contributes to the lack of PSA test reliability. To this end, seminal plasma-purified PSA standards from different commercial sources were investigated by western blot (WB) and in multiple research-grade PSA ELISAs. The WB results revealed that all of the PSA standards contained different mass concentrations of intact and cleaved molecular forms. Increased mass concentrations of intact PSA yielded higher immunoassay absorbance values, even between lots from the same manufacturer. Standardization of seminal plasma-derived PSA calibrant molecular form mass concentrations and purification methods will assist in closing the gaps in PCa testing measurements that require the use of PSA values, such as the % free PSA and Prostate Health Index, by increasing the accuracy of the calibration curves.

  8. SNR and Standard Deviation of cGNSS-R and iGNSS-R Scatterometric Measurements

    Directory of Open Access Journals (Sweden)

    Alberto Alonso-Arroyo

    2017-01-01

    This work addresses the accuracy of Global Navigation Satellite Systems Reflectometry (GNSS-R) scatterometric measurements considering the presence of both coherent and incoherent scattered components, for both the conventional GNSS-R (cGNSS-R) and interferometric GNSS-R (iGNSS-R) techniques. The coherent component is present for some types of surfaces, and it has been neglected until now because it vanishes for the sea surface scattering case. Taking into account the presence of both scattering components, the estimated Signal-to-Noise Ratio (SNR) for both techniques is computed based on the detectability criterion, as is done in conventional GNSS applications. The non-coherent averaging operation is considered from a general point of view, taking into account that thermal noise contributions can be reduced by an extra factor of 0.88 dB when using partially overlapped or partially correlated samples. After the SNRs are derived, the received waveform's peak variability is computed, which determines the system's capability to measure geophysical parameters. These theoretical derivations are applied to the United Kingdom (UK) TechDemoSat-1 (TDS-1) scenario and to the future GNSS REflectometry, Radio Occultation and Scatterometry on board the International Space Station (GEROS-ISS) scenario, in order to estimate the expected scatterometric performance of both missions.
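
    The benefit of non-coherent averaging can be illustrated with fully independent looks, for which the noise on the averaged waveform peak falls roughly as 1/sqrt(N); the extra ~0.88 dB the paper attributes to partially overlapped samples is not modelled in this toy Monte Carlo sketch:

```python
import random
import statistics

def averaged_peak_std(n_looks, n_trials=2000, noise_sigma=0.3, seed=1):
    """Std of a unit waveform peak after incoherently averaging independent looks."""
    rng = random.Random(seed)
    peaks = []
    for _ in range(n_trials):
        looks = [1.0 + rng.gauss(0.0, noise_sigma) for _ in range(n_looks)]
        peaks.append(sum(looks) / n_looks)
    return statistics.stdev(peaks)
```

    The ratio averaged_peak_std(1) / averaged_peak_std(100) comes out close to 10, the expected sqrt(100) reduction in peak variability.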

  9. Using DNA origami nanorulers as traceable distance measurement standards and nanoscopic benchmark structures.

    Science.gov (United States)

    Raab, Mario; Jusuk, Ija; Molle, Julia; Buhr, Egbert; Bodermann, Bernd; Bergmann, Detlef; Bosse, Harald; Tinnefeld, Philip

    2018-01-29

    In recent years, DNA origami nanorulers for superresolution (SR) fluorescence microscopy have been developed from fundamental proof-of-principle experiments into commercially available test structures. The self-assembled nanostructures allow placing a defined number of fluorescent dye molecules in defined geometries in the nanometer range. Besides offering unprecedented control over matter on the nanoscale, robust DNA origami nanorulers are reproducibly obtained in high yields. The distances between their fluorescent marks can be easily analysed, yielding intermark distance histograms from many identical structures. Thus, DNA origami nanorulers have become excellent reference and training structures for superresolution microscopy. In this work, we go one step further and develop a calibration process for the measured distances between the fluorescent marks on DNA origami nanorulers. The superresolution technique DNA-PAINT is used to achieve nanometrological traceability of nanoruler distances following the guide to the expression of uncertainty in measurement (GUM). We further show two examples of how these nanorulers are used to evaluate the performance of TIRF microscopes that are capable of single-molecule localization microscopy (SMLM).

  10. Can anthropometry measure gender discrimination? An analysis using WHO standards to assess the growth of Bangladeshi children.

    Science.gov (United States)

    Moestue, Helen

    2009-08-01

    To examine the potential of anthropometry as a tool to measure gender discrimination, with particular attention to the WHO growth standards, surveillance data collected in Bangladesh from 1990 to 1999 on boys and girls aged 6-59 months (n = 504,358) were analysed. Height-for-age Z-scores were calculated using three norms: the WHO standards, the 1978 National Center for Health Statistics (NCHS) reference, and the 1990 British growth reference (UK90). The three sets of growth curves provided conflicting pictures of the relative growth of girls and boys by age and over time. Conclusions on sex differences in growth depended also on the method used to analyse the curves, be it according to the shape or the relative position of the sex-specific curves. The shapes of the WHO-generated curves uniquely implied that Bangladeshi girls faltered faster or caught up more slowly than boys throughout their pre-school years, a finding consistent with the literature. In contrast, analysis of the relative position of the curves suggested that girls had higher WHO Z-scores than boys below 24 months of age. Further research is needed to help establish whether and how the WHO international standards can measure gender discrimination in practice, which continues to be a serious problem in many parts of the world.
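
    All three references (WHO, NCHS, UK90) are published as age- and sex-specific LMS parameter tables from which Z-scores are computed via the Box-Cox transformation. A sketch of that standard computation (the parameter values in the example are illustrative, not actual WHO table entries):

```python
import math

def lms_zscore(x, L, M, S):
    """Z-score of measurement x against Box-Cox LMS reference parameters."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)
```

    Scoring the same child's height against the WHO, NCHS, and UK90 tables produces the three different Z-scores whose disagreement this study analyses.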

  11. Developing Standard Exercises and Statistics to Measure the Impact of Cyber Defenses

    Science.gov (United States)

    2014-06-01

    [Indexed excerpt] The attack categorization follows the Hacking Exposed anatomy, supplemented by the attack life cycle included in the Mandiant APT1 report. Similarly vulnerable systems may well be running in the wild. Phase 2 (Exploitation) defensive measures include examining IDS alerts and firewall logs, examining AV and other network/host technologies, examining forensic images, and submitting IP and signature indicators.

  12. Caffeine consumption questionnaire: a standardized measure for caffeine consumption in undergraduate students.

    Science.gov (United States)

    Shohet, K L; Landrum, R E

    2001-12-01

    Undergraduate students (N=691) were given the 1992 Caffeine Consumption Questionnaire of Landrum and provided information on age, sex, and year in school. A subset (n = 168) of those completing the questionnaire were also given the Morningness-Eveningness Questionnaire of Horne and Ostberg. Analysis indicated that the average weekly intake of caffeine was roughly 1,600 mg, with individual intakes ranging from 13 mg to 21,840 mg per week. Older students consumed more caffeine than younger ones, and students with an Evening personality preference consumed more caffeine in the evening and nighttime hours than those with a Morning personality preference. These results are discussed in the context of other caffeine studies. Caffeine consumption is an important issue, and a consistent measurement system should be used by various researchers testing different populations.

  13. Measurement of reading speed with standardized texts: a comparison of single sentences and paragraphs.

    Science.gov (United States)

    Altpeter, Elke Karin; Marx, Tobias; Nguyen, Nhung Xuan; Naumann, Aline; Trauzettel-Klosinski, Susanne

    2015-08-01

    We examined the influence of text length (single sentence versus a paragraph of several sentences) on the repeatability of reading speed measurements in normal-sighted subjects. We compared reading speeds for the German versions of the Radner charts (single sentences of 14 words each) and the International Reading Speed Texts (IReST) charts (paragraphs, on average 132 words) in 30 normal-sighted elderly subjects aged 51-81 years (mean 64.5 years ± 7.2 SD). Three texts of each length were read aloud in random order. The influence of text length (single sentence or paragraph) and text sample (each single text) on reading speed was calculated by a regression model and Bland-Altman analysis. Mean reading speed (words per minute) showed no significant difference between single sentences (170 wpm ± 33 SD) and paragraphs (167 wpm ± 31 SD). Differences in reading speeds within one type of reading material were higher between single sentences than between paragraphs. Correlation coefficients between speeds were higher for paragraphs (r = 0.96-0.98) than for single sentences (r = 0.69-0.78). Variations between reading speeds for the three texts of each length were markedly lower for paragraphs (median 6.7, IQR 13.9; median 3.0, IQR 8.3; median -2.0, IQR 9.7) than for single sentences (median -8.8, IQR 29.6; median 15.6, IQR 29.4; median 22.7, IQR 19.4). Since reading speeds assessed with paragraphs show lower variance among texts than those for single sentences, they are better suited for repeated measurements, especially for long-term monitoring of the course of reading performance and for assessing effects of interventions in subjects with reading disorders.
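
    The Bland-Altman analysis used here reduces, for paired reading-speed measurements, to a bias (mean difference) and 95% limits of agreement. A minimal sketch (the numbers in the test are illustrative, not the study data):

```python
import statistics

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement series."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

    Narrow limits of agreement between repeated texts are what makes a chart suitable for long-term monitoring of reading performance.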

  14. Accounting for Carbon Stocks in Soils and Measuring GHGs Emission Fluxes from Soils: Do We Have the Necessary Standards?

    Directory of Open Access Journals (Sweden)

    Antonio Bispo

    2017-07-01

    Soil is a key compartment for climate regulation, as a source of greenhouse gas (GHG) emissions and as a sink of carbon. Thus, soil carbon sequestration strategies should be considered alongside reduction strategies for other greenhouse gas emissions. Taking this into account, several international and European policies on climate change are now acknowledging the importance of soils, which means that proper, comparable and reliable information is needed to report on carbon stocks and GHG emissions from soil. It also implies a need for consensus on the adoption and verification of the mitigation options that soil can provide. Where consensus is a key aspect, formal standards and guidelines come into play. This paper describes the existing ISO soil quality standards that can be used in this context, and calls for new ones to be developed through (international) collaboration. Available standards cover the relevant basic soil parameters, including carbon and nitrogen content, but do not yet consider the dynamics of those elements. Such methods have to be developed together with guidelines consistent with the scale to be investigated and the specific use of the collected data. We argue that this standardization strategy will improve the reliability of the reporting procedures and the results of the different climate models that rely on soil quality data.

  15. Tests of the Standard Model and Constraints on New Physics from Measurements of Fermion-pair Production at 130-172 GeV at LEP

    CERN Document Server

    Ackerstaff, K.; Allison, John; Altekamp, N.; Anderson, K.J.; Anderson, S.; Arcelli, S.; Asai, S.; Axen, D.; Azuelos, G.; Ball, A.H.; Barberio, E.; Barlow, Roger J.; Bartoldus, R.; Batley, J.R.; Baumann, S.; Bechtluft, J.; Beeston, C.; Behnke, T.; Bell, A.N.; Bell, Kenneth Watson; Bella, G.; Bentvelsen, S.; Bethke, S.; Biebel, O.; Biguzzi, A.; Bird, S.D.; Blobel, V.; Bloodworth, I.J.; Bloomer, J.E.; Bobinski, M.; Bock, P.; Bonacorsi, D.; Boutemeur, M.; Bouwens, B.T.; Braibant, S.; Brigliadori, L.; Brown, Robert M.; Burckhart, H.J.; Burgard, C.; Burgin, R.; Capiluppi, P.; Carnegie, R.K.; Carter, A.A.; Carter, J.R.; Chang, C.Y.; Charlton, David G.; Chrisman, D.; Clarke, P.E.L.; Cohen, I.; Conboy, J.E.; Cooke, O.C.; Cuffiani, M.; Dado, S.; Dallapiccola, C.; Dallavalle, G.Marco; Davies, R.; De Jong, S.; del Pozo, L.A.; Desch, K.; Dienes, B.; Dixit, M.S.; do Couto e Silva, E.; Doucet, M.; Duchovni, E.; Duckeck, G.; Duerdoth, I.P.; Eatough, D.; Edwards, J.E.G.; Estabrooks, P.G.; Evans, H.G.; Evans, M.; Fabbri, F.; Fanti, M.; Faust, A.A.; Fiedler, F.; Fierro, M.; Fischer, H.M.; Fleck, I.; Folman, R.; Fong, D.G.; Foucher, M.; Furtjes, A.; Futyan, D.I.; Gagnon, P.; Gary, J.W.; Gascon, J.; Gascon-Shotkin, S.M.; Geddes, N.I.; Geich-Gimbel, C.; Geralis, T.; Giacomelli, G.; Giacomelli, P.; Giacomelli, R.; Gibson, V.; Gibson, W.R.; Gingrich, D.M.; Glenzinski, D.; Goldberg, J.; Goodrick, M.J.; Gorn, W.; Grandi, C.; Gross, E.; Grunhaus, J.; Gruwe, M.; Hajdu, C.; Hanson, G.G.; Hansroul, M.; Hapke, M.; Hargrove, C.K.; Hart, P.A.; Hartmann, C.; Hauschild, M.; Hawkes, C.M.; Hawkings, R.; Hemingway, R.J.; Herndon, M.; Herten, G.; Heuer, R.D.; Hildreth, M.D.; Hill, J.C.; Hillier, S.J.; Hobson, P.R.; Homer, R.J.; Honma, A.K.; Horvath, D.; Hossain, K.R.; Howard, R.; Huntemeyer, P.; Hutchcroft, D.E.; Igo-Kemenes, P.; Imrie, D.C.; Ingram, M.R.; Ishii, K.; Jawahery, A.; Jeffreys, P.W.; Jeremie, H.; Jimack, M.; Joly, A.; Jones, C.R.; Jones, G.; Jones, M.; Jost, U.; Jovanovic, P.; Junk, 
T.R.; Karlen, D.; Kartvelishvili, V.; Kawagoe, K.; Kawamoto, T.; Kayal, P.I.; Keeler, R.K.; Kellogg, R.G.; Kennedy, B.W.; Kirk, J.; Klier, A.; Kluth, S.; Kobayashi, T.; Kobel, M.; Koetke, D.S.; Kokott, T.P.; Kolrep, M.; Komamiya, S.; Kress, T.; Krieger, P.; von Krogh, J.; Kyberd, P.; Lafferty, G.D.; Lahmann, R.; Lai, W.P.; Lanske, D.; Lauber, J.; Lautenschlager, S.R.; Layter, J.G.; Lazic, D.; Lee, A.M.; Lefebvre, E.; Lellouch, D.; Letts, J.; Levinson, L.; Lloyd, S.L.; Loebinger, F.K.; Long, G.D.; Losty, M.J.; Ludwig, J.; Macchiolo, A.; Macpherson, A.; Mannelli, M.; Marcellini, S.; Markus, C.; Martin, A.J.; Martin, J.P.; Martinez, G.; Mashimo, T.; Mattig, Peter; McDonald, W.John; McKenna, J.; Mckigney, E.A.; McMahon, T.J.; McPherson, R.A.; Meijers, F.; Menke, S.; Merritt, F.S.; Mes, H.; Meyer, J.; Michelini, A.; Mikenberg, G.; Miller, D.J.; Mincer, A.; Mir, R.; Mohr, W.; Montanari, A.; Mori, T.; Morii, M.; Muller, U.; Mihara, S.; Nagai, K.; Nakamura, I.; Neal, H.A.; Nellen, B.; Nisius, R.; O'Neale, S.W.; Oakham, F.G.; Odorici, F.; Ogren, H.O.; Oh, A.; Oldershaw, N.J.; Oreglia, M.J.; Orito, S.; Palinkas, J.; Pasztor, G.; Pater, J.R.; Patrick, G.N.; Patt, J.; Pearce, M.J.; Perez-Ochoa, R.; Petzold, S.; Pfeifenschneider, P.; Pilcher, J.E.; Pinfold, J.; Plane, David E.; Poffenberger, P.; Poli, B.; Posthaus, A.; Rees, D.L.; Rigby, D.; Robertson, S.; Robins, S.A.; Rodning, N.; Roney, J.M.; Rooke, A.; Ros, E.; Rossi, A.M.; Routenburg, P.; Rozen, Y.; Runge, K.; Runolfsson, O.; Ruppel, U.; Rust, D.R.; Rylko, R.; Sachs, K.; Saeki, T.; Sarkisian, E.K.G.; Sbarra, C.; Schaile, A.D.; Schaile, O.; Scharf, F.; Scharff-Hansen, P.; Schenk, P.; Schieck, J.; Schleper, P.; Schmitt, B.; Schmitt, S.; Schoning, A.; Schroder, Matthias; Schultz-Coulon, H.C.; Schumacher, M.; Schwick, C.; Scott, W.G.; Shears, T.G.; Shen, B.C.; Shepherd-Themistocleous, C.H.; Sherwood, P.; Siroli, G.P.; Sittler, A.; Skillman, A.; Skuja, A.; Smith, A.M.; Snow, G.A.; Sobie, R.; Soldner-Rembold, S.; Springer, 
Robert Wayne; Sproston, M.; Stephens, K.; Steuerer, J.; Stockhausen, B.; Stoll, K.; Strom, David M.; Szymanski, P.; Tafirout, R.; Talbot, S.D.; Tanaka, S.; Taras, P.; Tarem, S.; Teuscher, R.; Thiergen, M.; Thomson, M.A.; von Torne, E.; Towers, S.; Trigger, I.; Trocsanyi, Z.; Tsur, E.; Turcot, A.S.; Turner-Watson, M.F.; Utzat, P.; Van Kooten, Rick J.; Verzocchi, M.; Vikas, P.; Vokurka, E.H.; Voss, H.; Wackerle, F.; Wagner, A.; Ward, C.P.; Ward, D.R.; Watkins, P.M.; Watson, A.T.; Watson, N.K.; Wells, P.S.; Wermes, N.; White, J.S.; Wilkens, B.; Wilson, G.W.; Wilson, J.A.; Wolf, G.; Wyatt, T.R.; Yamashita, S.; Yekutieli, G.; Zacek, V.; Zer-Zion, D.

    1998-01-01

    Production of events with hadronic and leptonic final states has been measured in e^+e^- collisions at centre-of-mass energies of 130-172 GeV, using the OPAL detector at LEP. Cross-sections and leptonic forward-backward asymmetries are presented, both including and excluding the dominant production of radiative Z \\gamma events, and compared to Standard Model expectations. The ratio R_b of the cross-section for bb(bar) production to the hadronic cross-section has been measured. In a model-independent fit to the Z lineshape, the data have been used to obtain an improved precision on the measurement of \\gamma-Z interference. The energy dependence of \\alpha_em has been investigated. The measurements have also been used to obtain limits on extensions of the Standard Model described by effective four-fermion contact interactions, to search for t-channel contributions from new massive particles and to place limits on chargino pair production with subsequent decay of the chargino into a light gluino and a quark pair.
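
    The leptonic forward-backward asymmetries reported here are counting measurements: the normalized difference between events with the final-state lepton in the forward and backward hemispheres. A minimal sketch with the usual binomial statistical uncertainty (generic formulas, not OPAL analysis code):

```python
import math

def forward_backward_asymmetry(n_forward, n_backward):
    """A_FB = (N_F - N_B) / (N_F + N_B), with its binomial statistical error."""
    n = n_forward + n_backward
    a_fb = (n_forward - n_backward) / n
    sigma = math.sqrt((1.0 - a_fb ** 2) / n)
    return a_fb, sigma
```

    Comparing such measured asymmetries against Standard Model expectations is what constrains the four-fermion contact interactions mentioned above.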

  16. In-Situ XRF Measurements in Lunar Surface Exploration Using Apollo Samples as a Standard

    Science.gov (United States)

    Young, Kelsey E.; Evans, C.; Allen, C.; Mosie, A.; Hodges, K. V.

    2011-01-01

    Samples from the Apollo lunar surface missions were collected and returned to Earth by astronauts with varying degrees of geological experience. The technology used in these EVAs, or extravehicular activities, included nothing more advanced than traditional terrestrial field instruments: rock hammer, scoop, claw tool, and sample bags. Forty years after Apollo, technology is being developed that will allow a high-resolution geochemical map to be created in the field in real time. Handheld x-ray fluorescence (XRF) is one such technology. We used handheld XRF to enable a broad in-situ characterization of a geologic site of interest based on fairly rapid techniques that can be implemented by either an astronaut or a robotic explorer. The handheld XRF instrument we used for this study was the Innov-X Systems Delta XRF spectrometer.

  17. Standardization of 68Ge-68Ga using 4πβ(LS)-γ coincidence counting system for activity measurements.

    Science.gov (United States)

    Kulkarni, D B; Joseph, Leena; Anuradha, R; Kulkarni, M S; Tomar, B S

    2017-05-01

    68Ga has great future potential for positron emission tomography (PET) imaging because of its very fast blood clearance and rapid target localization, even though 18F is at present the most widely used isotope. 68Ge in equilibrium with 68Ga (68Ge-68Ga) can also serve as a surrogate for 18F calibration: 18F source standardization can be done at a national metrology institute (NMI), but these standards cannot be sent to nuclear medicine centers (NMCs) across India for calibration of isotope calibrators because of the short half-life of 18F (110 min). Providing 68Ge-68Ga standards to NMCs requires that standardization first be carried out at the NMI (BARC in India) to provide traceability for the measurements carried out at NMCs. In the present work, standardization of 68Ge-68Ga was carried out using a 4πβ(LS)-γ coincidence counting system and the CIEMAT/NIST efficiency tracing technique. The decay scheme correction factors for two gamma windows were calculated by the Monte Carlo technique using the general-purpose code FLUKA. The activity concentration values were normalized by the activity concentration obtained by the 4πβ(LS)-γ coincidence counting system using window-1. The final result reported to the BIPM for 4πβ(LS)-γ coincidence counting was calculated as the arithmetic mean of the activity concentrations obtained for the two gamma windows. The normalized activity concentration obtained by 4πβ(LS)-γ coincidence counting was 0.998±0.005 and that obtained using CIEMAT/NIST efficiency tracing was 1.002±0.007, which are in excellent agreement within uncertainty limits.

  18. Search for Physics Beyond the Standard Model via Positron Polarization Measurements with Polarized $ ^{17} $F.

    CERN Multimedia

    Versyck, S

    2002-01-01

    This proposal aims at measuring the longitudinal polarization of positrons emitted from polarized $^{17}$F nuclei. The experiment will have a sensitivity to possible right-handed current contributions in the weak interaction comparable to that of the experiment recently carried out with $^{107}$In in Louvain-la-Neuve, but will provide a more stringent limit because, since $^{17}$F decays through a superallowed $\beta$-transition, the recoil-order corrections to the allowed approximation can be taken into account very precisely. Furthermore, because $^{17}$F decays via a mixed Fermi/Gamow-Teller $\beta$-transition, this experiment will also yield a new limit on possible scalar contributions to the weak interaction. While the $^{17}$F beam is being developed, part of the beamtime was used to perform a similar experiment with $^{118}$Sb. As this isotope decays via a pure GT $\beta$-transition, this experiment will yield new limits on the possible presence of both right-handed and tensor...

  19. Standard Test Method for Measuring Reaction Rates by Analysis of Barium-140 From Fission Dosimeters

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This test method describes two procedures for the measurement of reaction rates by determining the amount of the fission product 140Ba produced by the non-threshold reactions 235U(n,f), 241Am(n,f), and 239Pu(n,f), and by the threshold reactions 238U(n,f), 237Np(n,f), and 232Th(n,f). 1.2 These reactions produce many fission products, among which is 140Ba, having a half-life of 12.752 days. 140Ba emits gamma rays of several energies; however, these are not easily detected in the presence of other fission products. Competing activity from other fission products requires that a chemical separation be employed or that the 140Ba activity be determined indirectly by counting its daughter product 140La. This test method describes both procedure (a), the nondestructive determination of 140Ba by the direct counting of 140La several days after irradiation, and procedure (b), the chemical separation of 140Ba and the subsequent counting of 140Ba or its daughter 140La. 1.3 With suitable techniques, fission neutron fl...
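
    Counting the daughter 140La several days after irradiation relies on its ingrowth from 140Ba. A sketch of the two-member Bateman relation for an initially pure 140Ba source (the 12.752-day 140Ba half-life is from the text; the 1.678-day 140La half-life is an assumed literature value):

```python
import math

HALF_LIFE_BA140 = 12.752  # days, from the test method
HALF_LIFE_LA140 = 1.678   # days, assumed literature value
LAMBDA_BA = math.log(2) / HALF_LIFE_BA140
LAMBDA_LA = math.log(2) / HALF_LIFE_LA140

def la140_activity(a_ba_initial, t_days):
    """140La activity grown in from a pure 140Ba source (Bateman equation)."""
    return (a_ba_initial * LAMBDA_LA / (LAMBDA_LA - LAMBDA_BA)
            * (math.exp(-LAMBDA_BA * t_days) - math.exp(-LAMBDA_LA * t_days)))
```

    After a few days the two nuclides reach transient equilibrium, which is why procedure (a) waits several days before directly counting 140La.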

  20. Standard practice for measuring the ultrasonic velocity in polyethylene tank walls using lateral longitudinal (LCR) waves

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This practice covers a procedure for measuring the ultrasonic velocities in the outer wall of polyethylene storage tanks. An angle beam lateral longitudinal (LCR) wave is excited with wedges along a circumferential chord of the tank wall. A digital ultrasonic flaw detector is used with sending-receiving search units in through transmission mode. The observed velocity is temperature corrected and compared to the expected velocity for a new, unexposed sample of material which is the same as the material being evaluated. The difference between the observed and temperature corrected velocities determines the degree of UV exposure of the tank. 1.2 The practice is intended for application to the outer surfaces of the wall of polyethylene tanks. Degradation typically occurs in an outer layer approximately 3.2-mm (0.125-in.) thick. Since the technique does not interrogate the inside wall of the tank, wall thickness is not a consideration other than to be aware of possible guided (Lamb) wave effects or reflection...
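
    The comparison step of the practice, temperature-correcting the observed velocity and differencing it against a new, unexposed reference sample, can be sketched as below; the linear coefficient and reference temperature are placeholder assumptions, not values from the ASTM practice:

```python
def temperature_corrected_velocity(v_measured, t_measured,
                                   t_reference=23.0, dv_dt=-1.0):
    """Correct a measured LCR wave velocity (m/s) to the reference temperature.

    dv_dt is an assumed linear velocity/temperature coefficient (m/s per deg C).
    """
    return v_measured - dv_dt * (t_measured - t_reference)

def velocity_shift(v_corrected, v_new_material):
    """Shift relative to an unexposed sample of the same polyethylene;
    the magnitude of the shift indicates the degree of UV exposure."""
    return v_corrected - v_new_material
```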

  1. Simple visual review of pre- to post-operative renal ultrasound images predicts pyeloplasty success equally as well as geometric measurements: A blinded comparison with a gold standard.

    Science.gov (United States)

    Kern, Adam J M; Schlomer, Bruce J; Timberlake, Matthew D; Peters, Craig A; Hammer, Matthew R; Jacobs, Micah A

    2017-08-01

    MAG3 diuretic renal scan remains the gold standard for determining improvement in renal drainage following pyeloplasty for ureteropelvic junction obstruction. We hypothesized that (i) a change in geometric measurements between pre-operative and post-operative renal ultrasound (RUS) images and (ii) blinded simple visual review of images would both predict pyeloplasty success. The objective was to determine if simple visual review and/or novel geometric measurement of renal ultrasounds can detect pyeloplasty failure. This study was a retrospective, blinded comparison with a gold standard. Included were children aged ≤18 years undergoing pyeloplasty at our institution from 2009 to 2015. For each kidney, representative pre-operative and post-operative RUS images were chosen. Our standard for pyeloplasty success was an improved drainage curve on MAG3 and lack of additional surgery. Measurements of collecting system circularity, roundness, and renal parenchymal to collecting system area ratio (RPCSR) were obtained by three raters (Figure), who were blinded to the outcome of the pyeloplasty. Changes in geometric measurements were analyzed as a diagnostic test for MAG3-defined pyeloplasty success using ROC curve analysis. In addition, six reviewers blinded to pyeloplasty success reviewed pre-operative and post-operative images visually for improved hydronephrosis and categorized each pyeloplasty as a success or failure based on simple visual review of RUS. Fifty-three repaired renal units were identified (50 children). There were five pyeloplasty failures, four of which underwent revision or nephrectomy. While all geometric measurements could discriminate pyeloplasty failure from success, those that discriminated best were change in collecting system roundness and change in RPCSR. Consensus opinion among six blinded reviewers using simple visual review had a sensitivity of 94% and a PPV of 100% with respect to identifying pyeloplasty
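
    The reported reviewer performance reduces to standard confusion-matrix quantities. A minimal sketch (the counts in the example are chosen to be consistent with the reported 94% sensitivity and 100% PPV, not taken from the paper):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of actual positives correctly identified."""
    return true_pos / (true_pos + false_neg)

def positive_predictive_value(true_pos, false_pos):
    """Fraction of predicted positives that are actual positives."""
    return true_pos / (true_pos + false_pos)
```

    For example, 45 of 48 successes called correctly with no false positives gives a sensitivity of about 94% and a PPV of 100%.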

  2. The Usability Analysis of Different Standard Single-Mode Optical Fibers and Its Installation Methods for the Interferometric Measurements

    Directory of Open Access Journals (Sweden)

    Jakub Cubik

    2013-01-01

    Optical fibers can be used to measure a variety of physical quantities. Optical fiber sensors sensitive to changes in the phase of light, so-called interferometers, discussed in this article, are among the most sensitive sensors. Because phase changes can be detected with extreme precision, these sensors are suitable for demanding applications where cost is not the main requirement. We used the Mach-Zehnder configuration. The paper deals with the use of different types of standard single-mode optical fibers in civil engineering as an integrated acoustic sensor. Further experiments focus on different fiber installation methods, such as placement in mounting foam, embedding in polystyrene, or attachment to a wooden surface, and their effect on the measurements. Repeated measurements of harmonic frequencies yielded information about the usable frequency range and the sensitivity of each particular arrangement. Measurements were performed for both cases, in which the specific fiber type or specifically installed fiber served either as the measurement arm or as the reference arm. The final evaluation is based both on the experience gained during the measurements and on statistical calculations.

  3. Report of the Standardized Outcomes in Nephrology-Hemodialysis (SONG-HD) Consensus Workshop on Establishing a Core Outcome Measure for Hemodialysis Vascular Access.

    Science.gov (United States)

    Viecelli, Andrea K; Tong, Allison; O'Lone, Emma; Ju, Angela; Hanson, Camilla S; Sautenet, Benedicte; Craig, Jonathan C; Manns, Braden; Howell, Martin; Chemla, Eric; Hooi, Lai-Seong; Johnson, David W; Lee, Timmy; Lok, Charmaine E; Polkinghorne, Kevan R; Quinn, Robert R; Vachharajani, Tushar; Vanholder, Raymond; Zuo, Li; Hawley, Carmel M

    2018-02-22

    Vascular access outcomes in hemodialysis are critically important for patients and clinicians, but frequently are neither patient relevant nor measured consistently in randomized trials. A Standardized Outcomes in Nephrology-Hemodialysis (SONG-HD) consensus workshop was convened to discuss the development of a core outcome measure for vascular access. Thirteen patients/caregivers and 46 professionals (clinicians, policy makers, industry representatives, and researchers) attended. Participants advocated for vascular access function to be a core outcome based on the broad applicability of function regardless of access type, the involvement of a multidisciplinary team in achieving a functioning access, and the impact of access function on quality of life, survival, and other access-related outcomes. A core outcome measure for vascular access required demonstrable feasibility for implementation across different clinical and trial settings. Participants advocated for a practical and flexible outcome measure with a simple, actionable definition. Integrating patients' values and preferences was warranted to enhance the relevance of the measure. Proposed outcome measures for function included "uninterrupted use of the access without the need for interventions" and "ability to receive prescribed dialysis," but not "access blood flow," which was deemed too expensive and unreliable. These recommendations will inform the definition and implementation of a core outcome measure for vascular access function in hemodialysis trials.

  4. Assessment of Measurement Tools of Observation Rate of Nursing Handover Standards in Clinical Wards of Hospital

    Directory of Open Access Journals (Sweden)

    Saadi Amini

    2015-08-01

    Background and Objectives: In health centres, patients' clinical information is transferred regularly among care staff. One of the most common occasions for such transfer is the nurses' shift handover in hospital; performing it correctly helps schedule patient care, supports patient safety, and facilitates the accurate transfer of information. The aim of this study was to investigate the validity and reliability of a checklist assessing the observance of shift-handover standards in clinical wards. Materials and Methods: To establish the checklist's validity, two expert panel meetings were held with 10 clinical experts, in which the content was examined through discussion and participant consensus. Reliability was investigated in a pilot study of 28 shift handovers (morning, afternoon, and night shifts) in 4 wards of 4 hospitals and was calculated with Cronbach's alpha. Results: During validation, the primary checklist was divided into two checklists, patient handover and equipment/ward handover, comprising 27 and 72 items, respectively. The reliability of the patient-handover checklist was confirmed with a Cronbach's alpha of 0.9155, and that of the equipment/ward-handover checklist with a Cronbach's alpha of 0.8779. Conclusion: Verification of the checklists by these scientific and statistical methods shows that they are robust instruments that can be used to assess shift handover in clinical wards and thereby help improve the services received by healthcare customers.
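
The internal-consistency statistic used in record 4, Cronbach's alpha, is straightforward to compute from an observations-by-items score matrix. A minimal sketch (the checklist scores below are hypothetical, not the study's data):

```python
import numpy as np

def cronbach_alpha(items) -> float:
    """Cronbach's alpha for an (observations x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

# Hypothetical data: 6 observed handovers scored on 4 binary checklist items
scores = np.array([
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 1, 1],
])
print(round(cronbach_alpha(scores), 3))  # -> 0.667
```

Values above roughly 0.9, as reported for both handover checklists, are conventionally read as excellent internal consistency.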

  5. Introduction of an alternative standardized radiographic measurement method to evaluate volar angulation in subcapital fractures of the 5th metacarpal

    Energy Technology Data Exchange (ETDEWEB)

    Hoffelner, Thomas; Resch, Herbert; Moroder, Philipp; Korn, Gundobert; Steinhauer, Felix [University of Salzburg, Department of Traumatology and Sports Injuries, Salzburg (Austria); Atzwanger, Joerg [University of Salzburg, Department of Radiology, Salzburg (Austria); Minnich, Bernd [University of Salzburg, Department of Organismic Biology, Salzburg (Austria); Tauber, Mark [Shoulder and Elbow Surgery ATOS Clinic Munich, Munich (Germany)

    2012-10-15

    The purpose of the present study was to compare the intra- and interobserver reliability of two different measurement methods for volar angulation of the 5th metacarpal (MC), in an attempt to establish a new standard measurement method that reduces interobserver discrepancies in therapeutic decisions. Twenty patients with subcapital fractures of the 5th MC were investigated radiologically. Imaging consisted of radiographs in antero-posterior and precise lateral views, in addition to a CT scan of the 5th MC. Volar angulation was measured using both the conventional and the shaft articular surface (SAS) method. The measurements of five investigators were exported to a spreadsheet for statistical analysis of intra- and interobserver reliability. The conventional technique showed large differences among the investigators and poor interobserver reliability (W = 0.328 and 0.307) both at injury (p = 0.001) and at follow-up (p = 0.189). The intraobserver concordance of all investigators was better with the SAS than with the conventional technique. With the SAS technique, no statistically significant difference among the investigators could be detected either at the time of injury (p = 0.418) or at follow-up (p = 0.526), with excellent interobserver reliability (W = 0.051 and W = 0.041). Evaluation of volar angulation at follow-up using CT scans did not show any statistically significant difference between the techniques, with better correlation among the observers for the SAS technique (p = 0.838). The interobserver correlation of volar angulation on lateral radiographs using the conventional technique was insufficient. We therefore recommend the novel SAS technique as a standardized measurement method; its higher accuracy and interobserver reliability should facilitate the choice of an adequate treatment option. (orig.)

  6. Whispering Gallery Modes in Standard Optical Fibres for Fibre Profiling Measurements and Sensing of Unlabelled Chemical Species

    Directory of Open Access Journals (Sweden)

    Anna Boleininger

    2010-03-01

    Whispering gallery mode resonances in liquid droplets and microspheres have attracted considerable attention due to their potential uses in a range of sensing and technological applications. We describe a whispering gallery mode sensor in which standard optical fibre is used as the whispering gallery mode resonator. The sensor is characterised in terms of the response of the whispering gallery mode spectrum to changes in resonator size, refractive index of the surrounding medium, and temperature, and its measurement capabilities are demonstrated through application to high-precision fibre geometry profiling and the detection of unlabelled biochemical species. The prototype sensor is capable of detecting unlabelled biomolecular species in attomole quantities.

  7. Whispering gallery modes in standard optical fibres for fibre profiling measurements and sensing of unlabelled chemical species.

    Science.gov (United States)

    Boleininger, Anna; Lake, Thomas; Hami, Sophia; Vallance, Claire

    2010-01-01

    Whispering gallery mode resonances in liquid droplets and microspheres have attracted considerable attention due to their potential uses in a range of sensing and technological applications. We describe a whispering gallery mode sensor in which standard optical fibre is used as the whispering gallery mode resonator. The sensor is characterised in terms of the response of the whispering gallery mode spectrum to changes in resonator size, refractive index of the surrounding medium, and temperature, and its measurement capabilities are demonstrated through application to high-precision fibre geometry profiling and the detection of unlabelled biochemical species. The prototype sensor is capable of detecting unlabelled biomolecular species in attomole quantities.
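
For intuition about the sensing principle in records 6 and 7: adjacent whispering gallery modes circulating around a resonator of radius R and effective index n_eff are spaced by approximately the free spectral range, FSR ≈ λ²/(2π·R·n_eff), and shifts of these resonances track changes in size or refractive index. A rough sketch for a standard 125 µm telecom fibre (the radius, index, and wavelength below are illustrative assumptions, not values from the paper):

```python
import math

def wgm_free_spectral_range(wavelength_m: float, radius_m: float, n_eff: float) -> float:
    """Approximate spacing between adjacent whispering gallery modes:
    FSR ~ lambda^2 / (2*pi*R*n_eff), valid for large angular mode numbers."""
    return wavelength_m ** 2 / (2.0 * math.pi * radius_m * n_eff)

# Illustrative values: 125 um cladding diameter, silica n_eff ~ 1.45, 1550 nm light
fsr = wgm_free_spectral_range(1550e-9, 62.5e-6, 1.45)
print(f"mode spacing ~ {fsr * 1e9:.2f} nm")  # -> mode spacing ~ 4.22 nm
```

A few nanometres of mode spacing is comfortably resolvable with a tunable laser, which is what makes standard fibre practical as a resonator for geometry profiling.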

  8. Impact of using scatter-mimicking beams instead of standard beams to measure penetration when assessing the protective value of radiation-protective garments.

    Science.gov (United States)

    Jones, A Kyle; Pasciak, Alexander S; Wagner, Louis K

    2018-03-01

    Use standardized methods to determine how assessment of the protective value of radiation-protective garments changes under conditions employing standard beam qualities, scatter-mimicking primary beams, and a modified Hp(10) measurement. The shielding properties of radiation-protective garments depend on the spectrum of beam energies striking the garment and the attenuation properties of the materials used to construct the garment, including x-ray fluorescence produced by these materials. In this study the primary beam spectra employed during clinical interventional radiology and cardiology procedures (clinical primary beams, CPB) were identified using radiation dose structured reports (RDSR) and fluoroscope log data. Monte Carlo simulation was used to determine the scattered radiation spectra produced by these CPB during typical clinical application. For these scattered spectra, scatter-mimicking primary beams (SMPB) were determined using numerical optimization-based spectral reconstruction that adjusted kV and filtration to produce the SMPB that optimally matched the scattered spectrum for each CPB. The penetration of a subset of SMPB through four radiation-protective garments of varying compositions and nominal thicknesses was measured using a geometry specified by the International Electrotechnical Commission (IEC). The diagnostic radiological index of protection (DRIP), which increases with increasing penetration through a garment, was calculated from these measurements. Penetration through the same garments was measured for standard beams specified by the American Society for Testing and Materials (ASTM). Finally, 10 mm of PMMA was affixed to the inside of each garment and the DRIP remeasured in this configuration to simulate Hp(10). The SMPB based on actual CPB were in general characterized by lower kV (range 60-76) and higher half-value layer (HVL, range 3.44-4.89 mm Al) than the standard beam qualities specified by ASTM (kV range 70-85; HVL range 3.4-4.0 mm Al).
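
The half-value layer (HVL) quoted for each beam in record 8 is the aluminium thickness that halves the beam intensity, so under narrow-beam conditions the transmitted fraction through x mm of Al is 2^(−x/HVL). A small sketch showing why the harder scatter-mimicking beams are more penetrating than the ASTM standard beams (the 2 mm absorber thickness is an illustrative choice):

```python
def transmitted_fraction(thickness_mm_al: float, hvl_mm_al: float) -> float:
    """Fraction of beam intensity passing through an aluminium absorber,
    assuming narrow-beam exponential attenuation: 2 ** (-x / HVL)."""
    return 2.0 ** (-thickness_mm_al / hvl_mm_al)

# The harder beam (larger HVL) penetrates the same absorber more readily.
for hvl in (3.44, 4.89):  # HVL extremes of the scatter-mimicking beams, mm Al
    print(f"HVL {hvl} mm Al: {transmitted_fraction(2.0, hvl):.1%} through 2 mm Al")
```

For these two HVLs the transmitted fractions come out near 67% and 75%, respectively, illustrating how a higher-HVL spectrum penalizes a garment's measured protection.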

  9. 68(Ge+Ga) activity standardization by 4πβ(LS)-γ(NaI(Tl)) anticoincidence counting measurements.

    Science.gov (United States)

    da Silva, C J; da Cruz, P A L; Iwahara, A; de Oliveira, E M; Loureiro, J Dos S; Tauhata, L; da Silva, Ronaldo L; Poledna, R; Lopes, R T

    2018-04-01

    In this work, a 68(Ge+Ga) solution was standardized at the National Institute of Ionizing Radiation Metrology (LNMRI), Brazil, within the framework of the international key comparison CCRI(II)-K2.Ge-68 piloted by the National Institute of Standards and Technology (NIST, USA). The 4πβ(LS)-γ(NaI(Tl)) anticoincidence method with live-time and extended dead-time was used, and its result was validated by 4πβ(LS)-γ(NaI(Tl)) coincidence counting and by liquid scintillation counting using the Triple to Double Coincidence Ratio (TDCR) method. The deviations of the activity concentration values of the coincidence and TDCR measurements from the anticoincidence result were 1.7% and 0.63%, respectively, within the experimentally evaluated uncertainties at the ~95% level of confidence (coverage factor k = 2). The combined relative standard uncertainties were 0.65%, 0.70% and 0.53% for the anticoincidence, coincidence and TDCR methods, respectively. These values are consistent with the results reported by Cessna at the ICRM 2017 conference. Copyright © 2017. Published by Elsevier Ltd.
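
The consistency check described in record 9 amounts to asking whether the deviation between two activity results is within their combined expanded (k = 2) uncertainty. A sketch using the uncertainties quoted in the abstract (the absolute activity value is a made-up placeholder):

```python
import math

def relative_deviation_pct(value: float, reference: float) -> float:
    """Signed deviation of a result from the reference, in percent."""
    return (value - reference) / reference * 100.0

def consistent(value: float, u_value_pct: float,
               reference: float, u_ref_pct: float, k: float = 2.0) -> bool:
    """True if two results agree within their combined expanded
    (coverage factor k) relative standard uncertainties."""
    deviation = abs(relative_deviation_pct(value, reference))
    u_combined = math.hypot(u_value_pct, u_ref_pct)
    return deviation <= k * u_combined

# Placeholder activity concentration for the anticoincidence anchor result;
# the deviations and uncertainties are the ones quoted in the abstract.
anticoincidence, u_anti = 100.0, 0.65
coincidence = anticoincidence * 1.017    # +1.7 % deviation
tdcr = anticoincidence * 1.0063          # +0.63 % deviation
print(consistent(coincidence, 0.70, anticoincidence, u_anti))  # -> True
print(consistent(tdcr, 0.53, anticoincidence, u_anti))         # -> True
```

Both validation methods pass: 1.7% sits just inside 2·√(0.70² + 0.65²) ≈ 1.9%, and 0.63% is well inside 2·√(0.53² + 0.65²) ≈ 1.7%.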

  10. “Which Box Should I Check?”: Examining Standard Check Box Approaches to Measuring Race and Ethnicity

    Science.gov (United States)

    Eisenhower, Abbey; Suyemoto, Karen; Lucchese, Fernanda; Canenguez, Katia

    2014-01-01

    Objective This study examined methodological concerns with standard approaches to measuring race and ethnicity using the federally defined race and ethnicity categories, as utilized in National Institutes of Health (NIH)-funded research. Data Sources/Study Setting Surveys were administered to 219 economically disadvantaged, racially and ethnically diverse participants at Boston Women, Infants, and Children (WIC) clinics during 2010. Study Design We examined missingness and misclassification in responses to the closed-ended NIH measure of race and ethnicity compared with open-ended measures of self-identified race and ethnicity. Principal Findings Rates of missingness were 26 and 43 percent for the NIH race and ethnicity items, respectively, compared with 11 and 18 percent for open-ended responses. NIH race responses matched racial self-identification in only 44 percent of cases. Missingness and misclassification were disproportionately higher for self-identified Latina(o)s, African-Americans, and Cape Verdeans. Race, but not ethnicity, was more often missing for immigrant versus mainland U.S.-born respondents. Results also indicated that ethnicity for Hispanic/Latina(o)s is more complex than captured in this measure. Conclusions The NIH's current race and ethnicity measure demonstrated poor differentiation of race and ethnicity, restricted response options, and lack of an inclusive ethnicity question. Separating race and ethnicity and providing respondents with adequate flexibility to identify themselves both racially and ethnically may improve valid operationalization. PMID:24298894

  11. Validation of two point-of-care tests against standard lab measures of NO in saliva and in serum.

    Science.gov (United States)

    Modi, Ashwin; Morou-Bermudez, Evangelia; Vergara, Jose; Patel, Rakesh P; Nichols, Alexandria; Joshipura, Kaumudi

    2017-04-01

    Nitric oxide (NO) is an endogenous signaling molecule that plays important roles in cardiometabolic health. A significant source of NO is dietary nitrate (NO3-), which is initially metabolized by oral bacteria into nitrite (NO2-) and subsequently converted into NO in the acidic gastric environment. Inexpensive, non-invasive tests for measuring nitrite in saliva have been developed as a means for individuals to monitor their NO bioavailability. However, few studies in the literature validate and compare these products against standard laboratory assays. The objective of this study was to validate two commonly used commercial strips, Nitric Oxide Test Strips (Berkeley Test) and Nitric Oxide Indicator Strips (Neogenesis), against standard laboratory measures of saliva and serum nitrite/nitrate. A stratified random sample of 20 non-smoking, overweight or obese participants between 40 and 65 years of age was selected for this study from the baseline data of the San Juan Overweight Adults Longitudinal Study (SOALS). There was a significant correlation between the measures from the two nitrite-detecting strips after controlling for the stratification variables (metabolic syndrome and mouthwash use) (r = 0.75). Measurements from both strips correlated significantly with salivary nitrite levels (r = 0.76 for Berkeley strips; r = 0.59 for Neogenesis). Neither strip correlated significantly with saliva nitrate, serum nitrite, or serum nitrate levels. In conclusion, the commercially available Berkeley and Neogenesis strips provide a reasonable surrogate for salivary, but not for systemic, nitrite levels. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Measuring the Outcome of At-Risk Students on Biology Standardized Tests When Using Different Instructional Strategies

    Science.gov (United States)

    Burns, Dana

    Over the last two decades, online education has become a popular concept in universities as well as in K-12 education. This generation of students has grown up using technology and has shown interest in incorporating technology into their learning. Using technology in the classroom to enhance student learning and raise achievement has become a priority for administrators, teachers, and policymakers. Although online education is a popular topic, there has been minimal research on the effectiveness of online and blended learning strategies compared with student learning in a traditional K-12 classroom setting. The purpose of this study was to investigate differences in standardized test scores on the Biology End of Course exam when at-risk students completed the course under three different educational models: an online format, blended learning, and traditional face-to-face learning. Data were collected from more than 1,000 students over a five-year period. Correlational analysis of eighth-grade standardized test scores was used to identify students "at risk" of failing high school courses; the results indicated a high correlation between eighth-grade standardized test scores and Biology End of Course exam scores. Standardized test scores were then measured for the at-risk students when they completed Biology under the different learning models. The results indicated significant differences among the learning models: students had the highest test scores when completing Biology in the traditional face-to-face model. Further evaluation of subgroup populations indicated statistically significant differences in learning models for African-American students, female students, and male students.

  13. Directly relating gas-phase cluster measurements to solution-phase hydrolysis, the absolute standard hydrogen electrode potential, and the absolute proton solvation energy.

    Science.gov (United States)

    Donald, William A; Leib, Ryan D; O'Brien, Jeremy T; Williams, Evan R

    2009-06-08

    Solution-phase, half-cell potentials are measured relative to other half-cell potentials, resulting in a thermochemical ladder that is anchored to the standard hydrogen electrode (SHE), which is assigned an arbitrary value of 0 V. A new method for measuring the absolute SHE potential is demonstrated in which gaseous nanodrops containing divalent alkaline-earth or transition-metal ions are reduced by thermally generated electrons. Energies for the reactions 1) M(H₂O)₂₄²⁺(g) + e⁻(g) → M(H₂O)₂₄⁺(g) and 2) M(H₂O)₂₄²⁺(g) + e⁻(g) → MOH(H₂O)₂₃⁺(g) + H(g), and the hydrogen atom affinities of MOH(H₂O)₂₃⁺(g), are obtained from the number of water molecules lost through each pathway. From these measurements on clusters containing nine different metal ions, together with known thermochemical values that include solution hydrolysis energies, an average absolute SHE potential of +4.29 V vs. e⁻(g) (standard deviation of 0.02 V) and a real proton solvation free energy of −265 kcal mol⁻¹ are obtained. With this method, the absolute SHE potential can be obtained from a one-electron reduction of nanodrops containing divalent ions that are not observed to undergo one-electron reduction in aqueous solution.

  14. A data science based standardized Gini index as a Lorenz dominance preserving measure of the inequality of distributions.

    Directory of Open Access Journals (Sweden)

    Alfred Ultsch

    The Gini index is a measure of the inequality of a distribution that can be derived from Lorenz curves. While commonly used in, e.g., economic research, it is ambiguous because it does not preserve Lorenz dominance. Here, investigation of large sets of empirical income distributions of the World's countries over several years indicated, firstly, that the Gini indices are centered on a value of 33.33%, corresponding to the Gini index of the uniform distribution, and secondly, that the Lorenz curves of these distributions are consistent with the Lorenz curves of log-normal distributions. This can be employed to provide a Lorenz dominance preserving equivalent of the Gini index. Therefore, a modified measure based on log-normal approximation and standardization of Lorenz curves is proposed. The so-called UGini index provides a meaningful and intuitive standardization on the uniform distribution, as this characterizes societies that provide equal chances. The novel UGini index preserves Lorenz dominance. Analysis of the probability density distributions of the UGini index for the World's countries' income data indicated multimodality in two independent data sets. Applying Bayesian statistics provided a data-based classification of the World's countries' income distributions. The UGini index can be re-transferred into the classical index to preserve comparability with previous research.

  15. A data science based standardized Gini index as a Lorenz dominance preserving measure of the inequality of distributions.

    Science.gov (United States)

    Ultsch, Alfred; Lötsch, Jörn

    2017-01-01

    The Gini index is a measure of the inequality of a distribution that can be derived from Lorenz curves. While commonly used in, e.g., economic research, it is ambiguous because it does not preserve Lorenz dominance. Here, investigation of large sets of empirical income distributions of the World's countries over several years indicated, firstly, that the Gini indices are centered on a value of 33.33%, corresponding to the Gini index of the uniform distribution, and secondly, that the Lorenz curves of these distributions are consistent with the Lorenz curves of log-normal distributions. This can be employed to provide a Lorenz dominance preserving equivalent of the Gini index. Therefore, a modified measure based on log-normal approximation and standardization of Lorenz curves is proposed. The so-called UGini index provides a meaningful and intuitive standardization on the uniform distribution, as this characterizes societies that provide equal chances. The novel UGini index preserves Lorenz dominance. Analysis of the probability density distributions of the UGini index for the World's countries' income data indicated multimodality in two independent data sets. Applying Bayesian statistics provided a data-based classification of the World's countries' income distributions. The UGini index can be re-transferred into the classical index to preserve comparability with previous research.
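
The starting point of records 14 and 15, computing a Gini index from an empirical Lorenz curve, can be sketched as follows; the uniform sample illustrates the 33.33% reference point on which the UGini index is standardized (this is a generic textbook construction, not the authors' UGini code):

```python
import numpy as np

def gini(incomes) -> float:
    """Gini index as 1 - 2 * (area under the empirical Lorenz curve),
    with the area taken by the trapezoidal rule."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = len(x)
    # Lorenz curve: cumulative income share at population shares 0, 1/n, ..., 1
    lorenz = np.concatenate(([0.0], np.cumsum(x) / x.sum()))
    area = (lorenz[:-1] + lorenz[1:]).sum() / (2.0 * n)
    return 1.0 - 2.0 * area

rng = np.random.default_rng(0)
uniform_gini = gini(rng.uniform(0.0, 1.0, 100_000))
print(round(uniform_gini, 3))  # close to 1/3, the UGini reference point
```

A perfectly equal distribution yields 0, full concentration in one recipient approaches 1, and uniformly drawn incomes land near 1/3, matching the abstract's 33.33% centering.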

  16. Methodological issues in systematic reviews of headache trials: adapting historical diagnostic classifications and outcome measures to present-day standards.

    Science.gov (United States)

    McCrory, Douglas C; Gray, Rebecca N; Tfelt-Hansen, Peer; Steiner, Timothy J; Taylor, Frederick R

    2005-05-01

    Recent efforts to make headache diagnostic classification and clinical trial methodology more consistent provide valuable advice to trialists generating new evidence on effectiveness of treatments for headache; however, interpreting older trials that do not conform to new standards remains problematic. Systematic reviewers seeking to utilize historical data can adapt currently recommended diagnostic classification and clinical trial methodological approaches to interpret all available data relative to current standards. In evaluating study populations, systematic reviewers can: (i) use available data to attempt to map study populations to diagnoses in the new International Classification of Headache Disorders; and (ii) stratify analyses based on the extent to which study populations are precisely specified. In evaluating outcome measures, systematic reviewers can: (i) summarize prevention studies using headache frequency, incorporating headache index in a stratified analysis if headache frequency is not available; (ii) summarize acute treatment studies using pain-free response as reported in directly measured headache improvement or headache severity outcomes; and (iii) avoid analysis of recurrence or relapse data not conforming to the sustained pain-free response definition.

  17. External quality assurance programs as a tool for verifying standardization of measurement procedures: Pilot collaboration in Europe.

    Science.gov (United States)

    Perich, C; Ricós, C; Alvarez, V; Biosca, C; Boned, B; Cava, F; Doménech, M V; Fernández-Calle, P; Fernández-Fernández, P; García-Lario, J V; Minchinela, J; Simón, M; Jansen, R

    2014-05-15

    Current external quality assurance schemes have been classified into six categories, according to their ability to verify the degree of standardization of the participating measurement procedures. SKML (Netherlands) is a Category 1 EQA scheme (commutable EQA materials with values assigned by reference methods), whereas SEQC (Spain) is a Category 5 scheme (replicate analyses of non-commutable materials with no values assigned by reference methods). The results obtained by a group of Spanish laboratories participating in a pilot study organized by SKML are examined, with the aim of pointing out the improvements over our current scheme that a Category 1 program could provide. Imprecision and bias are calculated for each analyte and laboratory, and compared with quality specifications derived from biological variation. Of the 26 analytes studied, 9 had results comparable with those from reference methods, and 10 analytes did not have comparable results. The remaining 7 analytes measured did not have available reference method values, and in these cases, comparison with the peer group showed comparable results. The reasons for disagreement in the second group can be summarized as: use of non-standard methods (IFCC without exogenous pyridoxal phosphate for AST and ALT, Jaffé kinetic at low-normal creatinine concentrations and with eGFR); non-commutability of the reference material used to assign values to the routine calibrator (calcium, magnesium and sodium); use of reference materials without established commutability instead of reference methods for AST and GGT, and lack of a systematic effort by manufacturers to harmonize results. Results obtained in this work demonstrate the important role of external quality assurance programs using commutable materials with values assigned by reference methods to correctly monitor the standardization of laboratory tests with consequent minimization of risk to patients. Copyright © 2013 Elsevier B.V. All rights reserved.
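
The quality specifications "derived from biological variation" in record 17 are commonly computed with the Fraser formulas: desirable imprecision CV_A ≤ 0.5·CV_I, desirable bias ≤ 0.25·√(CV_I² + CV_G²), and allowable total error TEa = 1.65·CV_A + bias. A sketch (the serum calcium variation values are illustrative literature figures, not taken from this study):

```python
import math

def desirable_specs(cv_i: float, cv_g: float):
    """Desirable analytical performance specifications (in %) derived from
    within-subject (cv_i) and between-subject (cv_g) biological variation."""
    imprecision = 0.5 * cv_i
    bias = 0.25 * math.sqrt(cv_i ** 2 + cv_g ** 2)
    total_error = 1.65 * imprecision + bias
    return imprecision, bias, total_error

# Illustrative values for serum calcium: CV_I ~ 1.9 %, CV_G ~ 2.8 %
cv_a, bias, tea = desirable_specs(1.9, 2.8)
print(f"CV_A <= {cv_a:.2f} %, bias <= {bias:.2f} %, TEa <= {tea:.2f} %")
```

Each laboratory's observed imprecision and bias are then compared against these limits, which is how the Spanish pilot participants were judged analyte by analyte.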

  18. The Applicability of Standard Error of Measurement and Minimal Detectable Change to Motor Learning Research-A Behavioral Study.

    Science.gov (United States)

    Furlan, Leonardo; Sterr, Annette

    2018-01-01

    Motor learning studies face the challenge of differentiating between real changes in performance and random measurement error. While traditional p-value-based analyses of difference (e.g., t-tests, ANOVAs) provide information on the statistical significance of a reported change in performance scores, they do not inform as to the likely cause or origin of that change, that is, the respective contributions of real modifications in performance and of random measurement error. One way of differentiating between real change and random measurement error is through the statistics of the standard error of measurement (SEM) and the minimal detectable change (MDC). SEM is estimated from the standard deviation of a sample of scores at baseline and a test-retest reliability index of the measurement instrument or test employed. MDC, in turn, is estimated from the SEM and a degree of confidence, usually 95%. The MDC value can be regarded as the minimum amount of change that must be observed for it to be considered real, that is, a change to which the contribution of real modifications in performance is likely to be greater than that of random measurement error. A computer-based motor task was designed to illustrate the applicability of SEM and MDC to motor learning research. Two studies were conducted with healthy participants: Study 1 assessed the test-retest reliability of the task, and Study 2 consisted of a typical motor learning study in which participants practiced the task for five consecutive days. In Study 2, the data were analyzed both with a traditional p-value-based analysis of difference (ANOVA) and with SEM and MDC. The findings showed good test-retest reliability for the task, and that the p-value-based analysis alone identified statistically significant improvements in performance over time even when the observed changes could in fact have been smaller than the MDC and thereby caused mostly by random measurement error, as opposed to real modifications in performance.
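
The SEM and MDC statistics described in record 18 follow directly from the baseline standard deviation and a test-retest reliability index: SEM = SD·√(1 − ICC) and MDC95 = 1.96·√2·SEM. A sketch with hypothetical movement-time scores (the SD and ICC values are made up for illustration):

```python
import math

def standard_error_of_measurement(sd_baseline: float, reliability: float) -> float:
    """SEM from the baseline standard deviation and a test-retest
    reliability index such as an ICC."""
    return sd_baseline * math.sqrt(1.0 - reliability)

def mdc95(sem_value: float) -> float:
    """Minimal detectable change at 95 % confidence."""
    return 1.96 * math.sqrt(2.0) * sem_value

# Hypothetical movement-time scores: baseline SD = 120 ms, ICC = 0.85
sem = standard_error_of_measurement(120.0, 0.85)
print(f"SEM = {sem:.1f} ms, MDC95 = {mdc95(sem):.1f} ms")
```

Under these assumptions, an improvement smaller than roughly 129 ms would be indistinguishable from measurement noise even if an ANOVA flagged it as statistically significant, which is exactly the point the study makes.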

  19. Evaluating Living Standard Indicators

    Directory of Open Access Journals (Sweden)

    Birčiaková Naďa

    2015-09-01

    This paper evaluates selected available indicators of living standards, divided into three groups: economic, environmental, and social. Six European Union countries were selected for analysis: Bulgaria, the Czech Republic, Hungary, Luxembourg, France, and Great Britain. The aim of this paper is to evaluate indicators measuring living standards and to identify the most important factors that should enter the final measurement. We sought to determine which factors influence each indicator and which affect living standards overall, using regression analysis as the main method. From the study of these factors, their impact on living standards, and thus on the values of the indicators, can be deduced. Indicators with a high degree of reliability include the following factors: population size and density, health care, and spending on education. Carbon dioxide emissions into the atmosphere also show a certain, albeit lower, degree of reliability.

  20. Establishing a standard for assessing the appropriateness of trauma team activation: a retrospective evaluation of two outcome measures.

    Science.gov (United States)

    Bressan, Silvia; Franklin, Katherine L; Jowett, Helen E; King, Sebastian K; Oakley, Ed; Palmer, Cameron S

    2015-09-01

    Trauma team activation (TTA) is a well-recognised standard of care for providing rapid stabilisation of patients with time-critical, life-threatening injuries. TTA involves substantial use of valuable hospital resources that, if not carefully balanced, may adversely impact the care of other patients. This study aimed to determine which of two outcome measures would be the better standard for assessing the appropriateness of TTA at a paediatric centre: retrospective major trauma classification as defined within our state, or the use of emergency department high-level resources as recently published by Falcone et al (Falcone Interventions; FI). Trauma registry data and patients' charts between February 2011 and June 2013 were reviewed, and over-triage and under-triage rates for TTA, using both major trauma and FIs as outcome measures, were compared. In total, 280 patients received TTA, 243 met the major trauma definition, and 102 received one or more FIs. The rates of over-triage and under-triage were 39.7% (95% CI 35.0 to 44.6%) and 30.5% (95% CI 26.2 to 35.2%) when the major trauma definition was used as the outcome measure, and 67.5% (95% CI 62.2 to 72.5%) and 10.8% (95% CI 7.9 to 14.8%) when FI was used. Only 17.1% (95% CI 11.4% to 24.7%) of the patients under-triaged according to the major trauma definition received one or more FIs. Assessment of TTA appropriateness varied significantly with the outcome measure used. FIs better reflected the use of acute-care TTA-related resources than the major trauma definition did and should be used as the gold standard for prospectively assessing and refining TTA criteria. Published by the BMJ Publishing Group Limited.
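
Over- and under-triage in record 20 follow the usual cross-tabulation of activation against outcome: over-triage is the share of activations without the outcome, and under-triage is the share of outcome-positive patients who were not activated. A sketch (the overlap count of 169 is a hypothetical value chosen so the rates roughly reproduce the abstract's 39.7% and 30.5%; the study's actual cross-tabulation is not given):

```python
def triage_rates(activations: int, outcome_positive: int, overlap: int):
    """Over-triage: activations lacking the outcome / all activations.
    Under-triage: outcome-positive patients not activated / all positives."""
    over = (activations - overlap) / activations
    under = (outcome_positive - overlap) / outcome_positive
    return over, under

# 280 activations and 243 major-trauma patients are from the abstract;
# the overlap of 169 patients in both groups is a hypothetical placeholder.
over, under = triage_rates(280, 243, 169)
print(f"over-triage {over:.1%}, under-triage {under:.1%}")
```

Swapping in the FI outcome (102 positives) with a correspondingly smaller overlap would reproduce the much higher over-triage and lower under-triage the abstract reports for that measure.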