WorldWideScience

Sample records for practical exposure-equivalent metric

  1. A practical exposure-equivalent metric for instrumentation noise in x-ray imaging systems

    International Nuclear Information System (INIS)

    Yadava, G K; Kuhls-Gilcrist, A T; Rudin, S; Patel, V K; Hoffmann, K R; Bednarek, D R

    2008-01-01

    The performance of high-sensitivity x-ray imagers may be limited by additive instrumentation noise rather than by quantum noise when operated at the low exposure rates used in fluoroscopic procedures. Instrumentation noise measurements in terms of electrons are equipment-invasive and generally difficult to make, and they are potentially less helpful in clinical practice than a direct radiological representation of such noise that can be determined in the field. In this work, we define a clinically relevant representation of instrumentation noise in terms of noise-equivalent detector entrance exposure, termed the instrumentation noise-equivalent exposure (INEE), which can be determined through experimental measurements of noise variance or signal-to-noise ratio (SNR). The INEE was measured for various detectors, demonstrating its usefulness in providing information about each detector's effective operating range. A simulation study is presented to demonstrate the robustness of this metric against post-processing and its dependence on inherent detector blur. These studies suggest that the INEE may be a practical gauge for determining and comparing the range of quantum-limited performance of clinical x-ray detectors of different design, with the implication that detector performance at exposures below the INEE will be instrumentation-noise limited rather than quantum-noise limited.
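    One way to estimate such a noise-equivalent exposure from variance measurements can be sketched as follows. This is an illustrative assumption, not the authors' published estimator: fit total noise variance versus detector entrance exposure as an additive instrumentation term plus a quantum term proportional to exposure, and report the exposure at which the two contributions are equal.

```python
def inee_from_variance(exposures, variances):
    """Sketch of an INEE estimate: fit sigma^2 = a + b*X by least squares
    (a = instrumentation variance, b*X = quantum variance assumed linear
    in exposure X) and return a/b, the exposure where the two noise
    contributions are equal. Function name and model are assumptions."""
    n = len(exposures)
    mx = sum(exposures) / n
    my = sum(variances) / n
    b = sum((x - mx) * (y - my) for x, y in zip(exposures, variances)) / \
        sum((x - mx) ** 2 for x in exposures)
    a = my - b * mx
    return a / b  # exposure below which instrumentation noise dominates
```

Below this estimated exposure the detector would be instrumentation-noise limited; above it, quantum-noise limited.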

  2. Baby universe metric equivalent to an interior black-hole metric

    International Nuclear Information System (INIS)

    Gonzalez-Diaz, P.F.

    1991-01-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximally inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result for the black hole evaporation process are discussed. (orig.)

  3. Comparing exposure zones by different exposure metrics using statistical parameters: contrast and precision.

    Science.gov (United States)

    Park, Ji Young; Ramachandran, Gurumurthy; Raynor, Peter C; Eberly, Lynn E; Olson, Greg

    2010-10-01

    Recently, the appropriateness of using the 'mass concentration' metric for ultrafine particles has been questioned, and surface area (SA) or number concentration metrics have been proposed as alternatives. To assess the abilities of various exposure metrics to distinguish between different exposure zones in workplaces with nanoparticle aerosols, exposure concentrations were measured in preassigned 'high-' and 'low-'exposure zones in a restaurant, an aluminum die-casting factory, and a diesel engine laboratory using SA, number, and mass concentration metrics. Predetermined exposure classifications were compared by each metric using statistical parameters and concentration ratios that were calculated from the different exposure concentrations. In the restaurant, SA and fine particle number concentrations showed significant differences between the high- and low-exposure zones, and they had higher contrast (the ratio of between-zone variance to the sum of the between-zone and within-zone variances) than mass concentrations. Mass concentrations did not show significant differences. In the die-casting facility, concentrations of all metrics were significantly greater in the high zone than in the low zone. SA and fine particle number concentrations showed larger concentration ratios between the high and low zones and higher contrast than mass concentrations. None of the metrics were significantly different between the high- and low-exposure zones in the diesel engine laboratory. The SA and fine particle number concentrations appeared to be better at differentiating exposure zones and finding the particle generation sources in workplaces generating nanoparticles. Because the choice of an exposure metric has significant implications for epidemiologic studies and industrial hygiene practice, a multimetric sampling approach is recommended for nanoparticle exposure assessment.
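    The 'contrast' parameter described parenthetically above (between-zone variance over the sum of between- and within-zone variances) can be sketched as follows; the pooled-variance estimator and function name are illustrative assumptions, not necessarily the authors' exact formulas.

```python
from statistics import mean

def contrast(high, low):
    """Ratio of between-zone variance to (between + within) variance for
    two exposure zones. A value near 1 means the metric separates the
    zones well; near 0 means within-zone scatter dominates."""
    grand = mean(high + low)
    n_h, n_l = len(high), len(low)
    m_h, m_l = mean(high), mean(low)
    # between-zone variance: squared deviation of zone means, weighted by size
    between = (n_h * (m_h - grand) ** 2 + n_l * (m_l - grand) ** 2) / (n_h + n_l)
    # within-zone variance: pooled scatter around each zone mean
    within = (sum((x - m_h) ** 2 for x in high)
              + sum((x - m_l) ** 2 for x in low)) / (n_h + n_l)
    return between / (between + within)
```

For well-separated zones (e.g. high = [10, 12, 11], low = [2, 3, 2]) the contrast is close to 1.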

  4. Committed dose equivalent in the practice of radiological protection

    International Nuclear Information System (INIS)

    Nenot, J.C.; Piechowski, J.

    1985-01-01

    In the case of internal exposure, the dose is not received at the moment of exposure, as happens with external exposure, since the incorporated radionuclide irradiates the various organs and tissues during the time it is present in the body. By definition, the committed dose equivalent corresponds to the received dose integrated over 50 years from the date of intake. In order to calculate it, one has to know the intake activity and the value of the committed dose equivalent per unit of intake activity. The uncertainties of the first parameter are such that the committed dose equivalent can only be regarded as an order of magnitude and not as a very accurate quantity. Its use is justified, however, because, like the dose equivalent for external exposure, it expresses the risk of stochastic effects for the individual concerned, since these effects, should they appear, would do so only after a latent period which is generally longer than the dose integration time. Moreover, the use of the committed dose equivalent offers certain advantages for dosimetric management, especially when it is simplified. A practical problem which may arise is that the annual dose limit is apparently exceeded by virtue of the fact that one is taking account, in the first year, of doses which will actually be received only in the following years. These problems are rare enough in practice to be dealt with individually in each case. (author)
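    The 50-year integration can be sketched with a single-exponential retention model. This is an illustrative assumption only: real ICRP biokinetic models use multi-compartment retention and tabulated per-unit-intake coefficients.

```python
import math

def committed_dose_equivalent(intake_bq, initial_dose_rate_sv_per_y_per_bq,
                              lambda_eff_per_y, years=50.0):
    """Committed dose equivalent as the integral of the dose rate over
    `years` after a single intake, assuming the retained activity (and
    hence the dose rate) decays as one exponential with effective removal
    constant lambda_eff (radioactive decay + biological clearance):
        H = intake * d0 * (1 - exp(-lambda_eff * t)) / lambda_eff
    """
    return (intake_bq * initial_dose_rate_sv_per_y_per_bq
            * (1.0 - math.exp(-lambda_eff_per_y * years)) / lambda_eff_per_y)
```

With slow clearance relative to 50 years, part of the committed dose is attributed to the year of intake even though it will be delivered in later years, which is the bookkeeping problem the abstract mentions.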

  5. Temporal variability of daily personal magnetic field exposure metrics in pregnant women.

    Science.gov (United States)

    Lewis, Ryan C; Evenson, Kelly R; Savitz, David A; Meeker, John D

    2015-01-01

    Recent epidemiology studies of power-frequency magnetic fields and reproductive health have characterized exposures using data collected from personal exposure monitors over a single day, possibly resulting in exposure misclassification due to temporal variability in daily personal magnetic field exposure metrics, but relevant data in adults are limited. We assessed the temporal variability of daily central tendency (time-weighted average, median) and peak (upper percentiles, maximum) personal magnetic field exposure metrics over 7 consecutive days in 100 pregnant women. When exposure was modeled as a continuous variable, central tendency metrics had substantial reliability, whereas peak metrics had fair (maximum) to moderate (upper percentiles) reliability. The predictive ability of a single-day metric to accurately classify participants into exposure categories based on a weeklong metric depended on the selected exposure threshold, with sensitivity decreasing with increasing exposure threshold. Consistent with the continuous measures analysis, sensitivity was higher for central tendency metrics than for peak metrics. If there is interest in peak metrics, more than 1 day of measurement is needed over the window of disease susceptibility to minimize measurement error, but 1 day may be sufficient for central tendency metrics.
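    The daily metrics named above can be sketched for one day of fixed-interval personal monitor samples; the nearest-rank percentile convention, the equal-interval TWA, and the function name are assumptions rather than the study's exact definitions.

```python
from statistics import median

def exposure_metrics(samples):
    """Central-tendency (TWA, median) and peak (upper percentile, maximum)
    summary metrics for one day of equally spaced measurements."""
    ordered = sorted(samples)
    n = len(ordered)

    def percentile(p):
        # nearest-rank percentile (an assumed convention)
        return ordered[min(n - 1, int(p / 100 * n))]

    return {
        "twa": sum(samples) / n,   # time-weighted average, equal intervals
        "median": median(samples),
        "p95": percentile(95),
        "max": ordered[-1],
    }
```

A single brief spike moves the peak metrics far more than the central-tendency ones, which is consistent with peak metrics being less reliable across days.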

  6. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: this edition contains new material relevant…

  7. Experimental equivalent cluster-size distributions in nano-metric volumes of liquid water

    International Nuclear Information System (INIS)

    Grosswendt, B.; De Nardo, L.; Colautti, P.; Pszona, S.; Conte, V.; Tornielli, G.

    2004-01-01

    Ionisation cluster-size distributions in nano-metric volumes of liquid water were determined for alpha particles at 4.6 and 5.4 MeV by measuring cluster-size frequencies in small gaseous volumes of nitrogen or propane at low gas pressure as well as by applying a suitable scaling procedure. This scaling procedure was based on the mean free ionisation lengths of alpha particles in water and in the gases measured. For validation, the measurements of cluster sizes in gaseous volumes and the cluster-size formation in volumes of liquid water of equivalent size were simulated by Monte Carlo methods. The experimental water-equivalent cluster-size distributions in nitrogen and propane are compared with those in liquid water and show that cluster-size formation by alpha particles in nitrogen or propane can directly be related to those in liquid water. (authors)

  8. Analyses Of Two End-User Software Vulnerability Exposure Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Miles McQueen; Lawrence Wellman

    2012-08-01

    The risk due to software vulnerabilities will not be completely resolved in the near future. Instead, it is important to put reliable vulnerability measures into the hands of end-users so that informed decisions can be made regarding the relative security exposure incurred by choosing one software package over another. To that end, we propose two new security metrics, average active vulnerabilities (AAV) and vulnerability free days (VFD). These metrics capture both the speed with which new vulnerabilities are reported to vendors and the rate at which software vendors fix them. We then examine how the metrics are computed using currently available datasets and demonstrate their estimation in a simulation experiment using four different browsers as a case study. Finally, we discuss how the metrics may be used by the various stakeholders of software and how they may inform software usage decisions.
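    Given a per-day count of publicly known but unfixed ("active") vulnerabilities, the two metrics can be sketched as below. The interpretation of AAV as a daily mean and VFD as a count of zero-vulnerability days is a simplifying assumption; the authors' exact estimators may differ.

```python
def aav_vfd(daily_active_counts):
    """Sketch of the two end-user exposure metrics named in the abstract:
      AAV = mean number of active (disclosed, unpatched) vulnerabilities
            per day over the observation window
      VFD = number of days in the window with zero active vulnerabilities
    """
    days = len(daily_active_counts)
    aav = sum(daily_active_counts) / days
    vfd = sum(1 for c in daily_active_counts if c == 0)
    return aav, vfd
```

A vendor that patches quickly drives the daily counts back to zero sooner, lowering AAV and raising VFD.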

  9. Assessment of everyday extremely low frequency (ELF) electromagnetic field (50-60 Hz) exposure: which metrics?

    International Nuclear Information System (INIS)

    Verrier, A.; Magne, I.; Souqes, M.; Lambrozo, J.

    2006-01-01

    Because electricity is encountered at every moment of the day, at home with household appliances or in every type of transportation, people are exposed most of the time, and in various ways, to extremely low frequency (ELF) electromagnetic fields (50-60 Hz). Owing to a lack of knowledge about the biological mechanisms of 50 Hz magnetic fields, studies seeking to identify health effects of exposure use central tendency metrics. The objective of our study is to provide better information about these exposure measurements from three categories of metrics. We calculated exposure metrics from data series (79 everyday-exposed subjects) made up of approximately 20,000 recordings of magnetic fields, measured every 30 seconds for 7 days with an EMDEX II dosimeter. These indicators were divided into three categories: central tendency metrics, dispersion metrics and variability metrics. We used Principal Component Analysis (PCA), a multidimensional technique, to examine the relations between the different exposure metrics for a group of subjects. The first two principal components accounted for 71.7% of the variance: the first component (42.7%) was characterized by central tendency and the second (29.0%) by dispersion characteristics. The third component (17.2%) was composed of variability characteristics. This study confirms the need to improve exposure measurements by using at least two dimensions: intensity and dispersion. (authors)

  10. ELF Magnetic Fields, Transients and TWA Metrics (invited paper)

    International Nuclear Information System (INIS)

    Kavet, R.

    1999-01-01

    Residential measurements of ambient power frequency magnetic fields may serve as surrogates for personal exposures. There are few data available, however, to determine how far back in time this surrogacy holds. A limited amount of research on residential transients suggests that, all other factors being equivalent, larger transients may propagate within VHCC neighbourhoods than within LCC neighbourhoods. However, the presence of a conductive residential ground pathway also appears to be a potentially important factor associated with residential transient activity. The use of the TWA metric was prompted by the need for an exposure score simple enough to summarise an individual's exposure over a prior interval, yet specific to the agent of concern, namely the power frequency magnetic field. To the extent that the TWA exposure is associated with health outcomes in the absence of bias, including confounding, the TWA metric is important. (author)

  11. Top 10 metrics for life science software good practices.

    Science.gov (United States)

    Artaza, Haydee; Chue Hong, Neil; Corpas, Manuel; Corpuz, Angel; Hooft, Rob; Jimenez, Rafael C; Leskošek, Brane; Olivier, Brett G; Stourac, Jan; Svobodová Vařeková, Radka; Van Parys, Thomas; Vaughan, Daniel

    2016-01-01

    Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices but also assess their feasibility for implementation, and we publish them here.

  12. Rapporteur Report: Sources and Exposure Metrics for ELF Epidemiology (Part 1) (invited paper)

    International Nuclear Information System (INIS)

    Matthes, R.

    1999-01-01

    High quality epidemiological studies on the possible link between exposure to non-ionizing radiation and human health effects are of great importance for radiation protection in this area. The main sources of ELF fields are domestic appliances, different electrical energy distribution systems, and all kinds of electrical machinery and devices at the workplace. In general, ELF fields present in the environment show complex temporal patterns and spatial distributions, depending on the generating source. The complete characterisation of the different field sources often requires highly sophisticated instrumentation, and this is therefore not feasible within the scope of epidemiological studies. On average, individual exposure to ELF fields is low in both the working environment and in residential areas. Only at certain workplaces are people subject to significant ELF exposure with regard to biological effects. Different methods have been developed to determine levels of exposure received by study subjects, with the aim of ranking exposed and non-exposed groups in epidemiological studies. These include spot measurements, calculations, and modelling. The different methods used to estimate total exposure in epidemiological studies may result, to a differing extent, in misclassification of the study subjects. Equally important for future studies is the selection of the appropriate exposure metric. The most widely used metric so far is the time-weighted average, which thus represents a quasi-standard metric for use in epidemiological studies. Besides, wire codes have been used for a long time in residential studies, and job titles are often used in occupational studies. On the basis of the experience gained in previous studies, it would be desirable to develop standardised, state-of-the-art protocols to improve exposure assessment. New surrogates and metrics were proposed as the basis for further studies, but only a few of these have recently undergone preliminary testing…

  14. Use of different exposure metrics for understanding multi-modal travel injury risk

    Directory of Open Access Journals (Sweden)

    S. Ilgin Guler

    2016-08-01

    The objective of this work is to identify characteristics of different metrics of exposure for quantifying multi-modal travel injury risk. First, a discussion of the use of time-based and trip-based metrics for road user exposure to injury risk, considering multiple travel modes, is presented. The main difference between a time-based and a trip-based metric is argued to be that a time-based metric reflects the actual duration of time spent on the road exposed to the travel risks. This proves to be important when considering multiple modes, since different modes typically have different speeds and average travel distances. Next, the use of total number of trips, total time traveled, and mode share (time-based or trip-based) is considered to compare the injury risk of a given mode at different locations. It is argued that using mode share allows the safety concept that focuses on absolute numbers to be generalized. Quantitative results are also obtained by combining travel survey data with police collision reports for ten counties in California. The data are aggregated for five modes: (i) cars, (ii) SUVs, (iii) transit riders, (iv) bicyclists, and (v) pedestrians. These aggregated data are used to compare travel risk of different modes with time-based or trip-based exposure metrics. The quantitative results confirm the initial qualitative discussion. As the penetration of mobile probes for transportation data collection increases, the insights of this study can provide guidance on how to best utilize the added value of such data to better quantify travel injury risk and improve safety.
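    The distinction between the two exposure denominators can be sketched as follows; the per-million scaling and variable names are illustrative assumptions, not the paper's exact definitions.

```python
def injury_risk(injuries, person_hours, trips):
    """Time-based vs. trip-based injury risk for one travel mode:
    the same injury count normalized by exposure time and by trip count."""
    return {
        "per_million_person_hours": injuries / person_hours * 1e6,
        "per_million_trips": injuries / trips * 1e6,
    }
```

A slow mode (e.g. walking) accumulates more exposure hours per trip than a fast mode, so the two metrics can rank the same modes differently.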

  15. A Practical Method for Collecting Social Media Campaign Metrics

    Science.gov (United States)

    Gharis, Laurie W.; Hightower, Mary F.

    2017-01-01

    Today's Extension professionals are tasked with more work and fewer resources. Integrating social media campaigns into outreach efforts can be an efficient way to meet work demands. If resources go toward social media, a practical method for collecting metrics is needed. Collecting metrics adds one more task to the workloads of Extension…

  17. Alignment of breast cancer screening guidelines, accountability metrics, and practice patterns.

    Science.gov (United States)

    Onega, Tracy; Haas, Jennifer S; Bitton, Asaf; Brackett, Charles; Weiss, Julie; Goodrich, Martha; Harris, Kimberly; Pyle, Steve; Tosteson, Anna N A

    2017-01-01

    Breast cancer screening guidelines and metrics are inconsistent with each other and may differ from breast screening practice patterns in primary care. This study measured breast cancer screening practice patterns in relation to common evidence-based guidelines and accountability metrics. It was a cohort study using primary data collected from a regional breast cancer screening research network between 2011 and 2014. Using information on women aged 30 to 89 years within 21 primary care practices of 2 large integrated health systems in New England, we measured the proportion of women screened overall and by age using 2 screening definition categories: any mammogram and screening mammogram. Of the 81,352 women in our cohort, 54,903 (67.5%) had at least 1 mammogram during the time period and 48,314 (59.4%) had a screening mammogram. Women aged 50 to 69 years had the highest proportion screened (82.4% any mammogram, 75% screening mammogram); 72.6% of women at age 40 had a screening mammogram, with a median of 70% (range = 54.3%-84.8%) among the practices. Of women aged at least 75 years, 63.3% had a screening mammogram, with a median of 63.9% (range = 37.2%-78.3%) among the practices. Of women who had 2 or more mammograms, 79.5% were screened annually. Primary care practice patterns for breast cancer screening are not well aligned with some evidence-based guidelines and accountability metrics. Metrics and incentives should be designed with more uniformity and should also include shared decision making when the evidence does not clearly support one single conclusion.

  18. Characterising risk - aggregated metrics: radiation and noise

    International Nuclear Information System (INIS)

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation induced health effects are accepted. However, as the metric does not take into account the nature of radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The metric proposed has only significance on a population level, and can not be used as a predictor for individual risk. (author)

  19. Metric and structural equivalence of core cognitive abilities measured with the Wechsler Adult Intelligence Scale-III in the United States and Australia.

    Science.gov (United States)

    Bowden, Stephen C; Lissner, Dianne; McCarthy, Kerri A L; Weiss, Lawrence G; Holdnack, James A

    2007-10-01

    Equivalence of the psychological model underlying Wechsler Adult Intelligence Scale-Third Edition (WAIS-III) scores obtained in the United States and Australia was examined in this study. Examination of metric invariance involves testing the hypothesis that all components of the measurement model relating observed scores to latent variables are numerically equal in different samples. The assumption of metric invariance is necessary for interpretation of scores derived from research studies that seek to generalize patterns of convergent and divergent validity and patterns of deficit or disability. An Australian community volunteer sample was compared to the US standardization data. A pattern of strict metric invariance was observed across samples. In addition, when the effects of different demographic characteristics of the US and Australian samples were included, structural parameters reflecting values of the latent cognitive variables were found not to differ. These results provide important evidence for the equivalence of measurement of core cognitive abilities with the WAIS-III and suggest that latent cognitive abilities in the US and Australia do not differ.

  20. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    Energy Technology Data Exchange (ETDEWEB)

    Craig G. Rieger

    2014-08-01

    "Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and

  1. Conversion between noise exposure indicators Leq24h, LDay, LEvening, LNight, Ldn and Lden: Principles and practical guidance.

    Science.gov (United States)

    Brink, Mark; Schäffer, Beat; Pieren, Reto; Wunderli, Jean Marc

    2018-01-01

    This article presents empirically derived conversion rules between the environmental noise exposure metrics Leq24h, LDay, LEvening, LNight, Ldn, and Lden for the noise sources road, rail, and air traffic. It caters to researchers who need to estimate the value of one (unknown) noise metric from the value of another (known) metric, e.g. in the scope of epidemiological meta-analyses or systematic reviews, when results from different studies are pooled and need to be related to one common exposure metric. Conversion terms are derived using two empirical methods: a) analyzing the diurnal variation of traffic, and b) analyzing differences between calculated noise exposure metrics. For a) we collected and analyzed diurnal traffic share data from European and US airports, as well as data on the diurnal variation of traffic from roads in several European countries and from railway lines in Switzerland, derived from counting stations and official records. For b) we calculated differences between noise metrics in over 50,000 stratified randomly sampled dwellings in Switzerland. As a result of this exercise, conversion terms, including uncertainty estimates, are systematically tabulated for all variants of the target metrics. Guidance as to the practical applicability of the proposed conversions in different contexts is provided, and limitations of their use are discussed.
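    For reference, Lden itself (as distinct from the paper's empirical conversion terms) is defined as a 24-hour energy average of the day, evening, and night levels with +5 dB and +10 dB penalties on the evening and night periods; a sketch:

```python
import math

def lden(l_day, l_evening, l_night,
         hours=(12, 4, 8), penalties=(0, 5, 10)):
    """Day-evening-night level per the EU Environmental Noise Directive:
    energy average over 24 h of the three period levels (dB), with the
    evening level penalized by +5 dB and the night level by +10 dB."""
    levels = (l_day, l_evening, l_night)
    energy = sum(h * 10 ** ((l + p) / 10)
                 for h, l, p in zip(hours, levels, penalties))
    return 10 * math.log10(energy / 24)
```

With all penalties set to zero and equal period levels, Lden reduces to the common level; with the standard penalties it sits several dB above Leq24h for steady traffic, which is why empirical conversion terms are needed when sources have different diurnal patterns.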

  2. Comparison of Highly Resolved Model-Based Exposure Metrics for Traffic-Related Air Pollutants to Support Environmental Health Studies

    Directory of Open Access Journals (Sweden)

    Shih Ying Chang

    2015-12-01

    Human exposure to air pollution in many studies is represented by ambient concentrations from space-time kriging of observed values. Space-time kriging techniques based on a limited number of ambient monitors may fail to capture the concentration from local sources. Further, because people spend more time indoors, using ambient concentration to represent exposure may cause error. To quantify the associated exposure error, we computed a series of six different hourly-based exposure metrics at 16,095 Census blocks of three counties in North Carolina for CO, NOx, PM2.5, and elemental carbon (EC) during 2012. These metrics include ambient background concentration from space-time ordinary kriging (STOK), ambient on-road concentration from the Research LINE source dispersion model (R-LINE), a hybrid concentration combining STOK and R-LINE, and their associated indoor concentrations from an indoor infiltration mass balance model. Using the hybrid-based indoor concentration as the standard, the comparison showed that outdoor STOK metrics yielded large error at both the population level (67% to 93%) and the individual level (average bias between −10% and 95%). For pollutants with significant contributions from on-road emissions (EC and NOx), the on-road-based indoor metric performs best at the population level (error less than 52%). At the individual level, however, the STOK-based indoor concentration performs best (average bias below 30%). For PM2.5, due to the relatively low contribution from on-road emissions (7%), the STOK-based indoor metric performs best at both the population level (error below 40%) and the individual level (error below 25%). The results of the study will help future epidemiology studies to select an appropriate exposure metric and reduce potential bias in exposure characterization.
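    A single-compartment steady-state version of the kind of infiltration mass balance model the abstract mentions can be sketched as below; the parameter values and function name are illustrative assumptions, not the study's.

```python
def indoor_steady_state(c_out, penetration=1.0, air_exchange=0.5,
                        deposition=0.2):
    """Steady-state indoor concentration from a one-box infiltration
    mass balance:  dC_in/dt = P*a*C_out - (a + k)*C_in = 0, giving
        C_in = P * a / (a + k) * C_out
    with penetration efficiency P, air-exchange rate a (1/h), and
    particle deposition rate k (1/h)."""
    return penetration * air_exchange / (air_exchange + deposition) * c_out
```

The ratio P*a/(a+k) is the infiltration factor; for gases (k near 0) indoor approaches outdoor concentration, while for particles deposition pulls the indoor level below ambient.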

  3. Frequency of Extreme Heat Event as a Surrogate Exposure Metric for Examining the Human Health Effects of Climate Change.

    Directory of Open Access Journals (Sweden)

    Crystal Romeo Upperman

    Epidemiological investigation of the impact of climate change on human health, particularly chronic diseases, is hindered by the lack of exposure metrics that can serve as markers of climate change and are compatible with health data. Here, we present a surrogate exposure metric created using a 30-year baseline (1960-1989) that allows users to quantify long-term changes in exposure to the frequency of extreme heat events, with near-unabridged spatial coverage, at a scale that is compatible with national/state health outcome data. We evaluate the exposure metric by decade, seasonality, and area of the country, and assess its ability to capture long-term changes in weather (climate), including natural climate modes. Our findings show that this generic exposure metric is potentially useful for monitoring trends in the frequency of extreme heat events across varying regions because it captures long-term changes; is sensitive to natural climate modes (ENSO events); responds well to spatial variability; and is amenable to spatial/temporal aggregation, making it useful for epidemiological studies.
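    A baseline-percentile event count of the kind described above can be sketched as follows; the nearest-rank threshold and strict-exceedance conventions are assumptions, not the authors' exact definition.

```python
def extreme_heat_days(daily_tmax, baseline_tmax, pct=95):
    """Count days whose maximum temperature exceeds a percentile
    threshold computed from a fixed baseline period (e.g. 1960-1989),
    so later decades can be compared against the same reference."""
    ordered = sorted(baseline_tmax)
    n = len(ordered)
    threshold = ordered[min(n - 1, int(pct / 100 * n))]  # nearest rank
    return sum(1 for t in daily_tmax if t > threshold)
```

Because the threshold is frozen at the baseline climate, an upward trend in the count over decades directly reflects long-term change rather than a moving reference.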

  4. Editorial: New operational dose equivalent quantities

    International Nuclear Information System (INIS)

    Harvey, J.R.

    1985-01-01

    The ICRU Report 39 entitled ''Determination of Dose Equivalents Resulting from External Radiation Sources'' is briefly discussed. Four new operational dose equivalent quantities have been recommended in ICRU 39. The 'ambient dose equivalent' and the 'directional dose equivalent' are applicable to environmental monitoring and the 'individual dose equivalent, penetrating' and the 'individual dose equivalent, superficial' are applicable to individual monitoring. The quantities should meet the needs of day-to-day operational practice, while being acceptable to those concerned with metrological precision, and at the same time be used to give effective control consistent with current perceptions of the risks associated with exposure to ionizing radiations. (U.K.)

  5. Greenroads : a sustainability performance metric for roadway design and construction.

    Science.gov (United States)

    2009-11-01

    Greenroads is a performance metric for quantifying sustainable practices associated with roadway design and construction. Sustainability is defined as having seven key components: ecology, equity, economy, extent, expectations, experience and exposur...

  6. Harmonizing exposure metrics and methods for sustainability assessments of food contact materials

    DEFF Research Database (Denmark)

    Ernstoff, Alexi; Jolliet, Olivier; Niero, Monia

    2016-01-01

    ) and Cradle to Cradle to support packaging design. Each assessment has distinct context and goals, but can help manage exposure to toxic chemicals and other environmental impacts. Metrics and methods to quantify and characterize exposure to potentially toxic chemicals specifically in food packaging are, however, notably lacking from such assessments. Furthermore, previous case studies demonstrated that sustainable packaging design priorities, such as decreasing greenhouse gas emissions or resource consumption, can increase exposure to toxic chemicals through packaging. Thereby, developing harmonized methods for quantifying exposure to chemicals in food packaging is critical to ensure ‘sustainable packages’ do not increase exposure to toxic chemicals. Therefore we developed modelling methods suitable for first-tier risk screening and environmental assessments. The modelling framework was based on the new product…

  7. The relationships between short-term exposure to particulate matter and mortality in Korea: impact of particulate matter exposure metrics for sub-daily exposures

    International Nuclear Information System (INIS)

    Son, Ji-Young; Bell, Michelle L

    2013-01-01

    Most studies of short-term particulate matter (PM) exposure use 24 h averages. However, other pollutants have stronger effects in shorter timeframes, which has influenced policy (e.g., the ozone 8 h maximum). The selection of appropriate exposure timeframes is important for effective regulation. The US EPA identified health effects of sub-daily PM exposures as a critical research need. Unlike most areas, Seoul, Korea has hourly measurements of PM10, although not PM2.5. We investigated PM10 and mortality (total, cardiovascular, respiratory) in Seoul (1999–2009) considering sub-daily exposures: 24 h, daytime (7 am–8 pm), morning (7–10 am), nighttime (8 pm–7 am), and 1 h daily maximum. We applied Poisson generalized linear modeling adjusting for temporal trends and meteorology. All PM10 metrics were significantly associated with total mortality. Compared to other exposure timeframes, morning exposure had the most certain effect on total mortality (based on statistical significance). Increases of 10 μg m−3 in 24 h, daytime, morning, nighttime, and 1 h maximum PM10 were associated with 0.15% (95% confidence interval 0.02–0.28%), 0.14% (0.01–0.27%), 0.10% (0.03–0.18%), 0.12% (0.03–0.22%), and 0.10% (0.00–0.21%) increases in total mortality, respectively. PM10 was significantly associated with cardiovascular mortality for 24 h, morning, and nighttime exposures. We did not identify significant associations with respiratory mortality. The results support the use of a 24 h averaging time as an appropriate metric for health studies and regulation, particularly for PM10 and mortality. (letter)
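
The reported percentages follow from the log-linear Poisson model: for a fitted coefficient β per μg m−3, a 10 μg m−3 increase corresponds to 100·(e^(10β) − 1) percent. A small sketch:

```python
import math


def percent_increase(beta_per_unit: float, delta: float = 10.0) -> float:
    """Percent change in the outcome rate for a `delta`-unit increase in
    exposure under a log-linear (Poisson) model: 100 * (exp(beta*delta) - 1)."""
    return 100.0 * (math.exp(beta_per_unit * delta) - 1.0)
```

For small coefficients the result is close to 1000·β, which is why such estimates are nearly linear in β.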

  8. Dynamic equivalence relation on the fuzzy measure algebras

    Directory of Open Access Journals (Sweden)

    Roya Ghasemkhani

    2017-04-01

    Full Text Available The main goal of the present paper is to extend classical results from measure theory and dynamical systems to the fuzzy subset setting. In this paper, the notion of a dynamic equivalence relation is introduced and it is proved that this relation is indeed an equivalence relation. Also, a new metric on the collection of all equivalence classes is introduced and it is proved that the resulting metric space is complete.

  9. New exposure-based metric approach for evaluating O3 risk to North American aspen forests

    International Nuclear Information System (INIS)

    Percy, K.E.; Nosal, M.; Heilman, W.; Dann, T.; Sober, J.; Legge, A.H.; Karnosky, D.F.

    2007-01-01

    The United States and Canada currently use exposure-based metrics to protect vegetation from O3. Using 5 years (1999–2003) of co-measured O3, meteorology and growth response, we have developed exposure-based regression models that predict Populus tremuloides growth change within the North American ambient air quality context. The models comprised the growing season fourth-highest daily maximum 8-h average O3 concentration, growing degree days, and wind speed. They had high statistical significance, high goodness of fit, include 95% confidence intervals for tree growth change, and are simple to use. Averaged across a wide range of clonal sensitivity, historical 2001–2003 growth change over most of the 26 Mha P. tremuloides distribution was estimated to have ranged from no impact (0%) to strong negative impacts (−31%). With four aspen clones responding negatively (one responded positively) to O3, the growing season fourth-highest daily maximum 8-h average O3 concentration performed much better than the growing season SUM06, AOT40 or maximum 1 h average O3 concentration metrics as a single indicator of aspen stem cross-sectional area growth. - A new exposure-based metric approach to predict O3 risk to North American aspen forests has been developed
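
The headline metric, the growing-season fourth-highest daily maximum 8-h average O3 concentration, can be computed as sketched below (synthetic hourly inputs; not the authors' code):

```python
def daily_max_8h(hourly):
    """Maximum 8-hour running mean of one day's 24 hourly O3 values."""
    means = [sum(hourly[i:i + 8]) / 8.0 for i in range(len(hourly) - 7)]
    return max(means)


def fourth_highest_daily_max_8h(days):
    """Season metric: 4th-highest of the daily maximum 8-h averages."""
    daily = sorted((daily_max_8h(d) for d in days), reverse=True)
    return daily[3]
```

A production version would also apply data-completeness rules for missing hours, which are omitted here.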

  10. Rapporteur Report: Sources and Exposure Metrics for RF Epidemiology (Part 1) (invited paper)

    Energy Technology Data Exchange (ETDEWEB)

    Allen, S

    1999-07-01

    A variety of sources was considered illustrating the predominantly uniform exposure at a distance and highly non-uniform exposures close to sources. The measurement of both electric and magnetic fields was considered for near field situations but where possible the use of induced body currents was judged to be the preferred metric. The salient features affecting exposure to fields from mobile telephones and their base stations were discussed for both existing and third generation systems. As an aid to future cancer studies, high resolution numerical modelling was used to illustrate left/right exposure discrimination of bilateral organs in the head. Factors influencing both numerical and experimental dosimetry were discussed and studies to investigate the ability to appropriately rank exposure were considered important areas for research. (author)

  11. Rapporteur Report: Sources and Exposure Metrics for RF Epidemiology (Part 1) (invited paper)

    International Nuclear Information System (INIS)

    Allen, S.

    1999-01-01

    A variety of sources was considered illustrating the predominantly uniform exposure at a distance and highly non-uniform exposures close to sources. The measurement of both electric and magnetic fields was considered for near field situations but where possible the use of induced body currents was judged to be the preferred metric. The salient features affecting exposure to fields from mobile telephones and their base stations were discussed for both existing and third generation systems. As an aid to future cancer studies, high resolution numerical modelling was used to illustrate left/right exposure discrimination of bilateral organs in the head. Factors influencing both numerical and experimental dosimetry were discussed and studies to investigate the ability to appropriately rank exposure were considered important areas for research. (author)

  12. Verification of Equivalence of the Axial Gauge to the Coulomb Gauge in QED by Embedding in the Indefinite Metric Hilbert Space : Particles and Fields

    OpenAIRE

    Yuji, NAKAWAKI; Azuma, TANAKA; Kazuhiko, OZAKI; Division of Physics and Mathematics, Faculty of Engineering Setsunan University; Junior College of Osaka Institute of Technology; Faculty of General Education, Osaka Institute of Technology

    1994-01-01

    Gauge Equivalence of the A_3=0 (axial) gauge to the Coulomb gauge is directly verified in QED. For that purpose a pair of conjugate zero-norm fields are introduced. This enables us to construct a canonical formulation in the axial gauge embedded in the indefinite metric Hilbert space in such a way that the Feynman rules are not altered. In the indefinite metric Hilbert space we can implement a gauge transformation, which otherwise has to be carried out only by hand, as main parts of a canonic...

  13. Top 10 metrics for life science software good practices [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Haydee Artaza

    2016-08-01

    Full Text Available Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices, we assess their feasibility for implementation and publish them here.

  14. The Use of the Kurtosis-Adjusted Cumulative Noise Exposure Metric in Evaluating the Hearing Loss Risk for Complex Noise.

    Science.gov (United States)

    Xie, Hong-Wei; Qiu, Wei; Heyer, Nicholas J; Zhang, Mei-Bian; Zhang, Peng; Zhao, Yi-Ming; Hamernik, Roger P

    2016-01-01

    To test a kurtosis-adjusted cumulative noise exposure (CNE) metric for use in evaluating the risk of hearing loss among workers exposed to industrial noises. Specifically, to evaluate whether the kurtosis-adjusted CNE (1) provides a better association with observed industrial noise-induced hearing loss, and (2) provides a single metric applicable to both complex (non-Gaussian [non-G]) and continuous or steady-state (Gaussian [G]) noise exposures for predicting noise-induced hearing loss (dose-response curves). Audiometric and noise exposure data were acquired on a population of screened workers (N = 341) from two steel manufacturing plants located in Zhejiang province and a textile manufacturing plant located in Henan province, China. All the subjects from the two steel manufacturing plants (N = 178) were exposed to complex noise, whereas the subjects from the textile manufacturing plant (N = 163) were exposed to a G continuous noise. Each subject was given an otologic examination to determine their pure-tone hearing threshold levels (HTLs) and had their personal 8-hr equivalent A-weighted noise exposure (LAeq) and full-shift noise kurtosis statistic (which is sensitive to the peaks and temporal characteristics of noise exposures) measured. For each subject, an unadjusted and a kurtosis-adjusted CNE index for the years worked was created. Multiple linear regression analysis controlling for age was used to determine the relationship between CNE (unadjusted and kurtosis-adjusted) and the mean HTL at 3, 4, and 6 kHz (HTL346) among the complex noise-exposed group. In addition, each subject's HTLs from 0.5 to 8.0 kHz were age and sex adjusted using Annex A (ISO-1999) to determine whether they had adjusted high-frequency noise-induced hearing loss (AHFNIHL), defined as an adjusted HTL shift of 30 dB or greater at 3.0, 4.0, or 6.0 kHz in either ear. Dose-response curves for AHFNIHL were developed separately for workers exposed to G and non-G noise using both unadjusted and adjusted CNE as the exposure
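
A sketch of one published form of the kurtosis adjustment; the adjustment coefficient λ = 6.5 and the Gaussian reference kurtosis of 3 are assumed values taken from the related literature, not from this abstract.

```python
import math


def kurtosis_adjusted_cne(laeq_8h: float, years: float, kurtosis: float,
                          lam: float = 6.5,
                          kurtosis_gaussian: float = 3.0) -> float:
    """Kurtosis-adjusted cumulative noise exposure (dB(A)-year scale).

    The 8-h A-weighted level is raised by lam * log10(kurtosis / 3) for
    non-Gaussian noise before accumulating over the years worked:
        CNE = LAeq' + 10 * log10(years).
    lam and the Gaussian reference kurtosis are illustrative assumptions.
    """
    adjusted_level = laeq_8h + lam * math.log10(
        max(kurtosis, kurtosis_gaussian) / kurtosis_gaussian)
    return adjusted_level + 10.0 * math.log10(years)
```

For Gaussian noise (kurtosis near 3) the adjustment vanishes and the metric reduces to the conventional CNE.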

  15. Reliability, Validity, Comparability and Practical Utility of Cybercrime-Related Data, Metrics, and Information

    OpenAIRE

    Nir Kshetri

    2013-01-01

    With an increasing pervasiveness, prevalence and severity of cybercrimes, various metrics, measures and statistics have been developed and used to measure various aspects of this phenomenon. Cybercrime-related data, metrics, and information, however, pose important and difficult dilemmas regarding the issues of reliability, validity, comparability and practical utility. While many of the issues of the cybercrime economy are similar to other underground and underworld industries, this economy ...

  16. Proposed method for assigning metric tons of heavy metal values to defense high-level waste forms to be disposed of in a geologic repository

    International Nuclear Information System (INIS)

    1987-08-01

    A proposed method is described for assigning an equivalent metric ton heavy metal (eMTHM) value to defense high-level waste forms to be disposed of in a geologic repository. This method for establishing a curie equivalency between defense high-level waste and irradiated commercial fuel is based on the ratio of defense fuel exposure to the typical commercial fuel exposure, MWd/MTHM. Application of this technique to defense high-level wastes is described. Additionally, this proposed technique is compared to several alternate calculations for eMTHM. 15 refs., 2 figs., 10 tabs.
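
The ratio-based assignment can be sketched as follows; the 33,000 MWd/MTHM reference burnup for typical commercial fuel is an illustrative assumption, not a figure from this report.

```python
def equivalent_mthm(defense_exposure_mwd: float,
                    commercial_burnup_mwd_per_mthm: float = 33000.0) -> float:
    """Assign an equivalent MTHM (eMTHM) value to a defense high-level
    waste form as the ratio of its fuel exposure (MWd) to a typical
    commercial fuel burnup (MWd/MTHM). Reference burnup is assumed."""
    return defense_exposure_mwd / commercial_burnup_mwd_per_mthm
```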

  17. Investigation of 1-cm dose equivalent for photons behind shielding materials

    International Nuclear Information System (INIS)

    Hirayama, Hideo; Tanaka, Shun-ichi

    1991-03-01

    The ambient dose equivalent at 1-cm depth, assumed equivalent to the 1-cm dose equivalent in practical dose estimations behind shielding slabs of water, concrete, iron or lead for normally incident photons having various energies, was calculated by using conversion factors for a slab phantom. It was compared with the 1-cm depth dose calculated with the Monte Carlo code EGS4. It was concluded from this comparison that the ambient dose equivalent calculated by using the conversion factors for the ICRU sphere could be used for the evaluation of the 1-cm dose equivalent for the sphere phantom within 20% errors. Average and practical conversion factors are defined as the conversion factors from exposure to ambient dose equivalent in a finite slab or an infinite one, respectively. They were calculated with EGS4 and the discrete ordinates code PALLAS. The exposure calculated with simple estimation procedures such as point kernel methods can be easily converted to ambient dose equivalent by using these conversion factors. The maximum value between 1 and 30 mfp can be adopted as the conversion factor, which depends only on material and incident photon energy. This gives the ambient dose equivalent on the safe side. 13 refs., 7 figs., 2 tabs.
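
The safe-side recipe described above (take the maximum conversion factor over depths between 1 and 30 mean free paths, then apply it to a point-kernel exposure estimate) can be sketched as; the tabulated values are hypothetical:

```python
def safe_conversion_factor(cf_by_depth: dict) -> float:
    """Pick the exposure-to-ambient-dose-equivalent conversion factor as
    the maximum over penetration depths between 1 and 30 mean free paths,
    which errs on the safe side as the abstract describes.

    cf_by_depth maps depth in mfp -> conversion factor (Sv/R scale assumed).
    """
    return max(v for depth, v in cf_by_depth.items() if 1.0 <= depth <= 30.0)


def ambient_dose_equivalent(exposure: float, cf: float) -> float:
    """Convert a point-kernel exposure estimate using the chosen factor."""
    return exposure * cf
```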

  18. Extremely low-frequency magnetic fields and childhood acute lymphoblastic leukemia: an exploratory analysis of alternative exposure metrics.

    Science.gov (United States)

    Auvinen, A; Linet, M S; Hatch, E E; Kleinerman, R A; Robison, L L; Kaune, W T; Misakian, M; Niwa, S; Wacholder, S; Tarone, R E

    2000-07-01

    Data collected by the National Cancer Institute-Children's Cancer Group were utilized to explore various metrics of magnetic field levels and risk of acute lymphoblastic leukemia (ALL) in children. Cases were aged 0-14 years, were diagnosed with ALL during 1989-1993, were registered with the Children's Cancer Group, and resided in one home for at least 70 percent of the 5 years immediately prior to diagnosis. Controls were identified by using random digit dialing and met the same residential requirements. With 30-second ("spot") measurements and components of the 24-hour measurement obtained in the subject's bedroom, metrics evaluated included measures of central tendency, peak exposures, threshold values, and measures of short-term temporal variability. Measures of central tendency and the threshold measures showed good-to-high correlation, but these metrics correlated less well with the others. Small increases in risk (ranging from 1.02 to 1.69 for subjects in the highest exposure category) were associated with some measures of central tendency, but peak exposures, threshold values, measures of short-term variability, and spot measurements demonstrated little association with risk of childhood ALL. In general, risk estimates were slightly higher for the nighttime (10 p.m.-6 a.m.) interval than for the corresponding 24-hour period.

  19. AN EVALUATION OF OZONE EXPOSURE METRICS FOR A SEASONALLY DROUGHT STRESSED PONDEROSA PINE ECOSYSTEM. (R826601)

    Science.gov (United States)

    Ozone stress has become an increasingly significant factor in cases of forest decline reported throughout the world. Current metrics to estimate ozone exposure for forest trees are derived from atmospheric concentrations and assume that the forest is physiologically active at ...

  20. Wildfire spread, hazard and exposure metric raster grids for central Catalonia

    Directory of Open Access Journals (Sweden)

    Fermín J. Alcasena

    2018-04-01

    Full Text Available We provide 40 m resolution wildfire spread, hazard and exposure metric raster grids for the 0.13 million ha fire-prone Bages County in central Catalonia (northeastern Spain), corresponding to the node influence grid (NIG), crown fraction burned (CFB) and fire transmission to residential houses (TR). Fire spread and behavior data (NIG, CFB and fire perimeters) were generated with fire simulation modeling considering wildfire season extreme fire weather conditions (97th percentile). Moreover, CFB was also generated for prescribed fire (Rx) mild weather conditions. The TR smoothed grid was obtained with a geospatial analysis considering large fire perimeters and individual residential structures located within the study area. We made these raster grids available to assist in the optimization of wildfire risk management plans within the study area and to help mitigate potential losses from catastrophic events. Keywords: Catalonia, Wildfire exposure, Fire transmission, Crown fire activity, Prescribed fires

  1. Equivalent magnetic vector potential model for low-frequency magnetic exposure assessment

    Science.gov (United States)

    Diao, Y. L.; Sun, W. N.; He, Y. Q.; Leung, S. W.; Siu, Y. M.

    2017-10-01

    In this paper, a novel source model based on a magnetic vector potential for the assessment of induced electric field strength in a human body exposed to the low-frequency (LF) magnetic field of an electrical appliance is presented. The construction of the vector potential model requires only a single-component magnetic field to be measured close to the appliance under test, hence relieving considerable practical measurement effort. Radial basis functions (RBFs) are adopted for the interpolation of the discrete measurements, and the magnetic vector potential model can then be constructed directly by summing a set of simple algebraic functions of the RBF parameters. The vector potentials are then incorporated into numerical calculations as the equivalent source for evaluations of the induced electric field in the human body model. The accuracy and effectiveness of the proposed model are demonstrated by comparing the induced electric field in a human model to that of the full-wave simulation. This study presents a simple and effective approach for modelling the LF magnetic source. The result of this study could simplify the compliance test procedure for assessing an electrical appliance regarding LF magnetic exposure.
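
The RBF interpolation step can be sketched with a plain Gaussian-kernel fit; this is a minimal numpy sketch under an assumed shape parameter `eps`, not the authors' implementation.

```python
import numpy as np


def rbf_fit(points, values, eps=1.0):
    """Fit Gaussian RBF weights to scattered single-component field
    measurements: f(x) ~= sum_j w_j * exp(-(eps*|x - x_j|)^2)."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)  # symmetric, positive-definite kernel
    w = np.linalg.solve(phi, np.asarray(values, dtype=float))
    return pts, w


def rbf_eval(x, centers, w, eps=1.0):
    """Evaluate the RBF interpolant at a query point x."""
    d = np.linalg.norm(centers - np.asarray(x, dtype=float), axis=-1)
    return float(np.exp(-(eps * d) ** 2) @ w)
```

Because the Gaussian kernel matrix is positive definite for distinct points, the interpolant reproduces the measured values exactly at the measurement locations.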

  2. Temporary threshold shifts from exposures to equal equivalent continuous A-weighted sound pressure level

    DEFF Research Database (Denmark)

    Ordoñez, Rodrigo Pizarro; Hammershøi, Dorte

    2014-01-01

    According to existing methods for the assessment of hearing damage, signals with the same A-weighted equivalent level should pose the same hazard to the auditory system. As a measure of hazard, it is assumed that Temporary Threshold Shifts (TTS) reflect the onset of alterations to the hearing… the assumptions made using the A-weighting curve for the assessment of hearing damage. By modifying exposure ratings to compensate for the build-up of energy at mid and high frequencies (above 1 kHz) due to the presence of the listener in the sound field, and for levels below an effect threshold that does not induce changes in hearing (equivalent quiet levels), ratings of the sound exposure that reflect the observed temporary changes in auditory function can be obtained.

  3. Individual monitoring of external exposure in terms of personal dose equivalent, Hp(d)

    International Nuclear Information System (INIS)

    Fantuzzi, E.

    2001-01-01

    The Institute for Radiation Protection of ENEA - Bologna organised a one-day workshop on the subject: individual monitoring of external exposure in terms of personal dose equivalent, Hp(d). The aim of the workshop was to discuss the new implications and modifications to be expected in the routine individual monitoring of external radiation, due to the issue of Decree 241/00 (G.U. 31/8/2000), in force since 01/01/2001. The decree transposes into Italian law the standards contained in the European Directive EURATOM 96/29, Basic Standards for the Protection of the Health of Workers and the General Public against Dangers arising from Ionizing Radiation. Among others, the definition of the operational quantities for external radiation for personal and environmental monitoring, Hp(d) and H*(d) respectively, as defined by the ICRU (International Commission on Radiation Units and Measurements), requires updating the methods of measurement and calibration of personal dosemeters and environmental monitors. This report collects the papers presented at the workshop dealing with the personal dose equivalent Hp(d); the conversion coefficients Hp(d)/Ka and Hp(d)/…, obtained through Monte Carlo calculations published by the ICRU and the ICRP (International Commission on Radiological Protection); the new calibration procedures; and the practical implications for routine individual monitoring in terms of Hp(d). Eventually, in the last chapter, the answers to Frequently Asked Questions (FAQ) are briefly reported [it]

  4. A new method for generating distributions of biomonitoring equivalents to support exposure assessment and prioritization.

    Science.gov (United States)

    Phillips, Martin B; Sobus, Jon R; George, Barbara J; Isaacs, Kristin; Conolly, Rory; Tan, Yu-Mei

    2014-08-01

    Biomonitoring data are now available for hundreds of chemicals through state and national health surveys. Exposure guidance values also exist for many of these chemicals. Several methods are frequently used to evaluate biomarker data with respect to a guidance value. The "biomonitoring equivalent" (BE) approach estimates a single biomarker concentration (called the BE) that corresponds to a guidance value (e.g., Maximum Contaminant Level, Reference Dose, etc.), which can then be compared with measured biomarker data. The resulting "hazard quotient" estimates (HQ = biomarker concentration/BE) can then be used to prioritize chemicals for follow-up examinations. This approach is used exclusively for population-level assessments, and works best when the central tendency of measurement data is considered. Complementary approaches are therefore needed for assessing individual biomarker levels, particularly those that fall within the upper percentiles of measurement distributions. In this case study, probabilistic models were first used to generate distributions of BEs for perchlorate based on the point-of-departure (POD) of 7 μg/kg/day. These distributions reflect possible biomarker concentrations in a hypothetical population where all individuals are exposed at the POD. A statistical analysis was then performed to evaluate urinary perchlorate measurements from adults in the 2001 to 2002 National Health and Nutrition Examination Survey (NHANES). Each NHANES adult was assumed to have experienced repeated exposure at the POD, and their biomarker concentration was interpreted probabilistically with respect to a BE distribution. The HQ based on the geometric mean (GM) urinary perchlorate concentration was estimated to be much lower than unity (HQ≈0.07). This result suggests that the average NHANES adult was exposed to perchlorate at a level well below the POD. Regarding individuals, at least a 99.8% probability was calculated for all but two NHANES adults that a higher
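
The hazard-quotient screening step (HQ = biomarker concentration / BE), evaluated here at the geometric mean of the measured concentrations, can be sketched as:

```python
import math


def hazard_quotient(biomarker_conc: float, be_conc: float) -> float:
    """Hazard quotient: measured biomarker concentration divided by the
    biomonitoring equivalent (BE) for the guidance value."""
    return biomarker_conc / be_conc


def geometric_mean(values):
    """Geometric mean of positive concentrations (log-average)."""
    return math.exp(sum(math.log(v) for v in values) / len(values))
```

HQ values well below one flag chemicals as low priority for follow-up; values near or above one flag them for closer examination.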

  5. Path integral measure for first-order and metric gravities

    International Nuclear Information System (INIS)

    Aros, Rodrigo; Contreras, Mauricio; Zanelli, Jorge

    2003-01-01

    The equivalence between the path integrals for first-order gravity and the standard torsion-free, metric gravity in 3 + 1 dimensions is analysed. Starting with the path integral for first-order gravity, the correct measure for the path integral of the metric theory is obtained

  6. Energy conservation and the principle of equivalence

    International Nuclear Information System (INIS)

    Haugan, M.P.

    1979-01-01

    If the equivalence principle is violated, then observers performing local experiments can detect effects due to their position in an external gravitational environment (preferred-location effects) or can detect effects due to their velocity through some preferred frame (preferred-frame effects). We show that the principle of energy conservation implies a quantitative connection between such effects and structure-dependence of the gravitational acceleration of test bodies (violation of the Weak Equivalence Principle). We analyze this connection within a general theoretical framework that encompasses both non-gravitational local experiments and test bodies as well as gravitational experiments and test bodies, and we use it to discuss specific experimental tests of the equivalence principle, including non-gravitational tests such as gravitational redshift experiments, Eötvös experiments, the Hughes-Drever experiment, and the Turner-Hill experiment, and gravitational tests such as the lunar-laser-ranging ''Eötvös'' experiment, and measurements of anisotropies and variations in the gravitational constant. This framework is illustrated by analyses within two theoretical formalisms for studying gravitational theories: the PPN formalism, which deals with the motion of gravitating bodies within metric theories of gravity, and the THεμ formalism that deals with the motion of charged particles within all metric theories and a broad class of non-metric theories of gravity

  7. Noise annoyance from stationary sources: Relationships with exposure metric day-evening-night level (DENL) and their confidence intervals

    NARCIS (Netherlands)

    Miedema, H.M.E.; Vos, H.

    2004-01-01

    Relationships between exposure to noise [metric: day-evening-night levels (DENL)] from stationary sources (shunting yards, a seasonal industry, and other industries) and annoyance are presented. Curves are presented for expected annoyance score, the percentage "highly annoyed" (%HA, cutoff at 72 on

  8. Development of new VOC exposure metrics and their relationship to ''Sick Building Syndrome'' symptoms

    Energy Technology Data Exchange (ETDEWEB)

    Ten Brinke, JoAnn [Univ. of California, Berkeley, CA (United States); Lawrence Berkeley National Lab., Berkeley, CA (United States)

    1995-08-01

    Volatile organic compounds (VOCs) are suspected to contribute significantly to ''Sick Building Syndrome'' (SBS), a complex of subchronic symptoms that occurs during and in general decreases away from occupancy of the building in question. A new approach takes into account individual VOC potencies, as well as the highly correlated nature of the complex VOC mixtures found indoors. The new VOC metrics are statistically significant predictors of symptom outcomes from the California Healthy Buildings Study data. Multivariate logistic regression analyses were used to test the hypothesis that a summary measure of the VOC mixture, other risk factors, and covariates for each worker will lead to better prediction of symptom outcome. VOC metrics based on animal irritancy measures and principal component analysis had the most influence in the prediction of eye, dermal, and nasal symptoms. After adjustment, a water-based paints and solvents source was found to be associated with dermal and eye irritation. The more typical VOC exposure metrics used in prior analyses were not useful in symptom prediction in the adjusted model (total VOC (TVOC), or sum of individually identified VOCs (ΣVOCi)). Also not useful were three other VOC metrics that took into account potency, but did not adjust for the highly correlated nature of the data set, or the presence of VOCs that were not measured. High TVOC values (2–7 mg m−3) due to the presence of liquid-process photocopiers observed in several study spaces significantly influenced symptoms. Analyses without the high TVOC values reduced, but did not eliminate, the ability of the VOC exposure metric based on irritancy and principal component analysis to explain symptom outcome.
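
A potency-weighted VOC metric of the kind described (each compound's concentration scaled by an irritancy benchmark such as the mouse RD50) can be sketched as below; the example values are hypothetical:

```python
def irritancy_weighted_sum(concentrations, rd50s):
    """Potency-weighted VOC exposure metric: the sum of each compound's
    concentration divided by its irritancy benchmark (e.g. mouse RD50),
    so that more irritating compounds contribute more per unit mass.
    This weighting scheme is an illustrative assumption."""
    return sum(c / rd50 for c, rd50 in zip(concentrations, rd50s))
```

A plain TVOC metric, by contrast, would simply sum the concentrations, treating all compounds as equally potent.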

  9. Metrics in Keplerian orbits quotient spaces

    Science.gov (United States)

    Milanov, Danila V.

    2018-03-01

    Quotient spaces of Keplerian orbits are important instruments for the modelling of orbit samples of celestial bodies on a large time span. We suppose that variations of the orbital eccentricities, inclinations and semi-major axes remain sufficiently small, while arbitrary perturbations are allowed for the arguments of pericentres or longitudes of the nodes, or both. The distance between orbits or their images in quotient spaces serves as a numerical criterion for such problems of Celestial Mechanics as the search for a common origin of meteoroid streams, comets, and asteroids, asteroid family identification, and others. In this paper, we consider quotient sets of the non-rectilinear Keplerian orbits space H. Their elements are identified irrespective of the values of pericentre arguments or node longitudes. We prove that distance functions on the quotient sets, introduced in Kholshevnikov et al. (Mon Not R Astron Soc 462:2275-2283, 2016), satisfy the metric space axioms and discuss the theoretical and practical importance of this result. Isometric embeddings of the quotient spaces into R^n, and a space of compact subsets of H with the Hausdorff metric, are constructed. The Euclidean representations of the orbit spaces find their applications in a problem of orbit averaging and in computational algorithms specific to Euclidean space. We also explore completions of H and its quotient spaces with respect to the corresponding metrics and establish a relation between elements of the extended spaces and rectilinear trajectories. The distance between an orbit and subsets of elliptic and hyperbolic orbits is calculated. This quantity provides an upper bound for the metric value in a problem of close orbit identification. Finally the invariance of the equivalence relations in H under coordinate change is discussed.

  10. Reliability, Validity, Comparability and Practical Utility of Cybercrime-Related Data, Metrics, and Information

    Directory of Open Access Journals (Sweden)

    Nir Kshetri

    2013-02-01

    Full Text Available With an increasing pervasiveness, prevalence and severity of cybercrimes, various metrics, measures and statistics have been developed and used to measure various aspects of this phenomenon. Cybercrime-related data, metrics, and information, however, pose important and difficult dilemmas regarding the issues of reliability, validity, comparability and practical utility. While many of the issues of the cybercrime economy are similar to other underground and underworld industries, this economy also has various unique aspects. For one thing, this industry also suffers from a problem partly rooted in the incredibly broad definition of the term “cybercrime”. This article seeks to provide insights and analysis into this phenomenon, which is expected to advance our understanding into cybercrime-related information.

  11. Neutron Damage Metrics and the Quantification of the Associated Uncertainty

    International Nuclear Information System (INIS)

    Griffin, P.J.

    2012-01-01

    The motivation for this work is the determination of a methodology for deriving and validating a reference metric that can be used to correlate radiation damage from neutrons of various energies and from charged particles with observed damage modes. Exposure functions for some damage modes are being used by the radiation effects community, e.g. 1-MeV-Equivalent damage in Si and in GaAs semiconductors as well as displacements per atom (dpa) and subsequent material embrittlement in iron. The limitations with the current treatment of these energy-dependent metrics include a lack of an associated covariance matrix and incomplete validation. In addition, the analytical approaches used to derive the current metrics fail to properly treat damage in compound/poly-atomic materials, the evolution and recombination of defects as a function of time since exposure, as well as the influence of dopant materials and impurities in the material of interest. The current metrics only provide a crude correlation with the damage modes of interest. They do not, typically, even distinguish between the damage effectiveness of different types of neutron-induced lattice defects, e.g. they fail to distinguish between a vacancy-oxygen defect and a divacancy with respect to the minority carrier lifetime and the decrease in gain in a Si bipolar transistor. The goal of this work is to facilitate the generation of more advanced radiation metrics that will provide an easier intercomparison of radiation damage as delivered from various types of test facilities and with various real-world nuclear applications. One first needs to properly define the scope of the radiation damage application that is a concern before an appropriate damage metric is selected. The fidelity of the metric selected and the range of environmental parameters under which the metric can be correlated with the damage should match the intended application. 
It should address the scope of real-world conditions where the metric will be applied.

  12. Prevalence of hazardous exposures in veterinary practice

    Energy Technology Data Exchange (ETDEWEB)

    Wiggins, P.; Schenker, M.B.; Green, R.; Samuels, S.

    1989-01-01

    All female graduates of a major U.S. veterinary school were surveyed by mailed questionnaire to obtain details of work practice and hazard exposure during the most recent year worked and during all pregnancies. Exposure questions were based on previously implicated occupational hazards which included anesthetic gases, radiation, zoonoses, prostaglandins, vaccines, physical trauma, and pesticides. The response rate was 86% (462/537). We found that practice type and pregnancy status were major determinants of hazard exposure within the veterinary profession. Small-animal practitioners reported the highest rates of exposure to anesthetic gas (94%), X-ray (90%), and pesticides (57%). Large-animal practitioners reported greater rates of trauma (64%) and potential exposure to prostaglandins (92%), Brucella abortus vaccine (23%), and carbon monoxide (18%). Potentially hazardous workplace practices or equipment were common. Forty-one percent of respondents who reported taking X-rays did not wear film badges, and 76% reported physically restraining animals for X-ray procedures. Twenty-seven percent of the respondents exposed to anesthetic gases worked at facilities which did not have waste anesthetic gas scavenging systems. Women who worked as veterinarians during a pregnancy attempted to reduce exposures to X-rays, insecticides, and other potentially hazardous exposures. Some potentially hazardous workplace exposures are common in veterinary practice, and measures to educate workers and to reduce these exposures should not await demonstration of adverse health effects.

  13. Prevalence of hazardous exposures in veterinary practice

    International Nuclear Information System (INIS)

    Wiggins, P.; Schenker, M.B.; Green, R.; Samuels, S.

    1989-01-01

    All female graduates of a major U.S. veterinary school were surveyed by mailed questionnaire to obtain details of work practice and hazard exposure during the most recent year worked and during all pregnancies. Exposure questions were based on previously implicated occupational hazards which included anesthetic gases, radiation, zoonoses, prostaglandins, vaccines, physical trauma, and pesticides. The response rate was 86% (462/537). We found that practice type and pregnancy status were major determinants of hazard exposure within the veterinary profession. Small-animal practitioners reported the highest rates of exposure to anesthetic gas (94%), X-ray (90%), and pesticides (57%). Large-animal practitioners reported greater rates of trauma (64%) and potential exposure to prostaglandins (92%), Brucella abortus vaccine (23%), and carbon monoxide (18%). Potentially hazardous workplace practices or equipment were common. Forty-one percent of respondents who reported taking X-rays did not wear film badges, and 76% reported physically restraining animals for X-ray procedures. Twenty-seven percent of the respondents exposed to anesthetic gases worked at facilities which did not have waste anesthetic gas scavenging systems. Women who worked as veterinarians during a pregnancy attempted to reduce exposures to X-rays, insecticides, and other potentially hazardous exposures. Some potentially hazardous workplace exposures are common in veterinary practice, and measures to educate workers and to reduce these exposures should not await demonstration of adverse health effects

  14. A method, using ICRP 26 weighting factors, to determine effective dose equivalent due to nonuniform external exposures

    International Nuclear Information System (INIS)

    Dyer, S.G.

    1993-01-01

    Westinghouse Savannah River Company (WSRC) has recently implemented a methodology and supporting procedures to calculate effective dose equivalent for external exposures. The calculations are based on ICRP 26 methodology and are used to evaluate exposures when multibadging is used. The methodology is based upon the concept of "whole body" compartmentalization (i.e., the whole body is separated into seven specific regions of radiological concern, each weighted accordingly). The highest dose measured in each compartment is used to determine the weighted dose to that compartment. Benefits of determining effective dose equivalent are compliance with DOE Orders, more accurate dose assessments, and improved worker protection through new ALARA opportunities.
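
    The compartmentalized calculation described above can be sketched directly. The tissue weighting factors are those of ICRP Publication 26; the badge-to-compartment assignments and the fallback rule for unbadged compartments are illustrative assumptions, not WSRC's actual procedure.

```python
# ICRP 26 tissue weighting factors; the seven-compartment partition used
# here is illustrative and the factors sum to 1.0.
WEIGHTS = {
    "gonads": 0.25,
    "breast": 0.15,
    "red_marrow": 0.12,
    "lung": 0.12,
    "thyroid": 0.03,
    "bone_surfaces": 0.03,
    "remainder": 0.30,
}

def effective_dose_equivalent(badge_doses_mrem):
    """Weight the highest dose measured in each compartment and sum.

    badge_doses_mrem: dict mapping compartment -> list of badge readings.
    Compartments with no badge are assigned the highest reading anywhere,
    a conservative illustrative choice rather than a stated WSRC rule.
    """
    all_readings = [d for doses in badge_doses_mrem.values() for d in doses]
    fallback = max(all_readings)
    ede = 0.0
    for compartment, w in WEIGHTS.items():
        doses = badge_doses_mrem.get(compartment)
        ede += w * (max(doses) if doses else fallback)
    return ede

# Example: three badge locations on a worker in a nonuniform field.
readings = {"lung": [120.0, 95.0], "gonads": [40.0], "thyroid": [150.0]}
print(f"EDE = {effective_dose_equivalent(readings):.1f} mrem")  # EDE = 118.9 mrem
```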

  15. On the calibration of photon dosemeters in the equivalent dose units

    International Nuclear Information System (INIS)

    Bregadze, Yu.I.; Isaev, B.M.; Maslyaev, P.F.

    1980-01-01

    General aspects of the transition from the exposure dose of photon radiation to the equivalent dose are considered. By definition, the equivalent dose is a function of the location of a point in an irradiated object; a uniform description of the degree of risk therefore requires knowledge of the equivalent dose distribution in the human body. The International Electrotechnical Commission recommends measuring equivalent doses at depths of 7 and 800 mg/cm² in a tissue-equivalent sphere of 30 cm diameter, calling them the skin equivalent dose and the depth equivalent dose, respectively, and comparing them with the permissible 500 mSv and 50 mSv per year, respectively. A practical transition to using the equivalent dose for evaluating the radiation hazard of exposure in low-energy photon radiation fields should include measures for regraduating dosemeters already produced, graduating dosemeters under production, and developing a system for their metrological support.

  16. Quantifying risk over the life course - latency, age-related susceptibility, and other time-varying exposure metrics.

    Science.gov (United States)

    Wang, Molin; Liao, Xiaomei; Laden, Francine; Spiegelman, Donna

    2016-06-15

    Identification of the latency period and age-related susceptibility, if any, is an important aspect of assessing risks of environmental, nutritional, and occupational exposures. We consider estimation and inference for latency and age-related susceptibility in relative risk and excess risk models. We focus on likelihood-based methods for point and interval estimation of the latency period and age-related windows of susceptibility coupled with several commonly considered exposure metrics. The method is illustrated in a study of the timing of the effects of constituents of air pollution on mortality in the Nurses' Health Study. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Health effects assessment of staff involved in medical practices of radiation exposures

    Energy Technology Data Exchange (ETDEWEB)

    Popescu, I.A.; Lacob, O. [Institute of Public Health Iasi, Radiation Hygiene Lab. (Romania); Roman, I.; Havarneanu, D. [Institute of Public Health Iasi, Occupational Medicine Dept. (Romania)

    2006-07-01

    Starting from the appearance of new national recommendations, this study aimed to detect health effects among medical staff involved in radiation practices in six counties of the Moldavia region and to create a national data register for radiation-induced cancer. Staff involved in medical uses of ionizing radiation in Romania (health care level I) have been monitored under the recent recommendations for three years. High micronucleus levels and morphological lymphocyte changes, considered against the clinical diagnosis, can be regarded as possible early signs of malignancy. The micronucleus test, although unspecific, as a new examination in our legislation can bring useful information on staff exposure and provides guidance to the occupational physician in making medical recommendations. This cytogenetic test does not appear to correlate with smoking habit or length of exposure. The micronucleus test, in both oral mucosal epithelial cells and cultured peripheral lymphocytes, can be considered quite specific and correlates with a recent acute exposure level. The conclusions of individual health status surveillance and the assessment of personal dose equivalent are very useful data for recording in the radiation-induced cancer register.

  18. Health effects assessment of staff involved in medical practices of radiation exposures

    International Nuclear Information System (INIS)

    Popescu, I.A.; Lacob, O.; Roman, I.; Havarneanu, D.

    2006-01-01

    Starting from the appearance of new national recommendations, this study aimed to detect health effects among medical staff involved in radiation practices in six counties of the Moldavia region and to create a national data register for radiation-induced cancer. Staff involved in medical uses of ionizing radiation in Romania (health care level I) have been monitored under the recent recommendations for three years. High micronucleus levels and morphological lymphocyte changes, considered against the clinical diagnosis, can be regarded as possible early signs of malignancy. The micronucleus test, although unspecific, as a new examination in our legislation can bring useful information on staff exposure and provides guidance to the occupational physician in making medical recommendations. This cytogenetic test does not appear to correlate with smoking habit or length of exposure. The micronucleus test, in both oral mucosal epithelial cells and cultured peripheral lymphocytes, can be considered quite specific and correlates with a recent acute exposure level. The conclusions of individual health status surveillance and the assessment of personal dose equivalent are very useful data for recording in the radiation-induced cancer register.

  19. Practical application of equivalent linearization approaches to nonlinear piping systems

    International Nuclear Information System (INIS)

    Park, Y.J.; Hofmayer, C.H.

    1995-01-01

    The use of mechanical energy absorbers as an alternative to conventional hydraulic and mechanical snubbers for piping supports has attracted wide interest among researchers and practitioners in the nuclear industry. The basic design concept of energy absorbers (EAs) is to dissipate the vibration energy of piping systems through the nonlinear hysteretic action of the EAs under design seismic loads. Therefore, some type of nonlinear analysis needs to be performed in the seismic design of piping systems with EA supports. The equivalent linearization approach (ELA) can be a practical analysis tool for this purpose, particularly when the response spectrum approach (RSA) is also incorporated in the analysis formulations. In this paper, the following ELA/RSA methods are presented and compared to each other regarding their practicality and numerical accuracy: the response spectrum approach using the square root of the sum of squares (SRSS) approximation (denoted RS in this paper); classical ELA based on modal combinations and linear random vibration theory (denoted CELA in this paper); and stochastic ELA based on direct solution of the response covariance matrix (denoted SELA in this paper). New algorithms to convert response spectra to equivalent power spectral density (PSD) functions are presented for both the CELA and SELA methods. The numerical accuracy of the three ELA methods is studied through a parametric error analysis. Finally, the practicality of the presented analysis is demonstrated in two application examples for piping systems with EA supports.

  20. A practical approach to determine dose metrics for nanomaterials.

    Science.gov (United States)

    Delmaar, Christiaan J E; Peijnenburg, Willie J G M; Oomen, Agnes G; Chen, Jingwen; de Jong, Wim H; Sips, Adriënne J A M; Wang, Zhuang; Park, Margriet V D Z

    2015-05-01

    Traditionally, administered mass is used to describe doses of conventional chemical substances in toxicity studies. For deriving toxic doses of nanomaterials, mass and chemical composition alone may not adequately describe the dose, because particles with the same chemical composition can have completely different toxic mass doses depending on properties such as particle size. Other dose metrics such as particle number, volume, or surface area have been suggested, but consensus is lacking. The discussion clearly needs a systematic, unbiased approach to determining the most appropriate dose metric for nanomaterials. In the present study, the authors propose such an approach and apply it to results from in vitro and in vivo experiments with silver and silica nanomaterials. The proposed approach is shown to provide a convenient tool to systematically investigate and interpret dose metrics of nanomaterials. Recommendations for study designs aimed at investigating dose metrics are provided. © 2015 SETAC.
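
    For the idealized case of monodisperse spherical particles, the candidate dose metrics mentioned above (mass, particle number, surface area) are related by simple geometry, which is why the same mass dose can represent very different number or surface-area doses at different particle sizes. A minimal sketch under that spherical-particle assumption:

```python
import math

def dose_metrics_for_spheres(mass_ug, diameter_nm, density_g_cm3):
    """Convert an administered mass dose into number and surface-area doses,
    assuming monodisperse spheres (an idealization; real nanomaterials are
    polydisperse and often agglomerated)."""
    d_cm = diameter_nm * 1e-7
    particle_mass_g = density_g_cm3 * math.pi / 6.0 * d_cm**3
    number = (mass_ug * 1e-6) / particle_mass_g
    surface_cm2 = number * math.pi * d_cm**2
    return number, surface_cm2

# Same 1 ug mass dose of silver (bulk density ~10.5 g/cm^3) at two sizes:
for d in (20, 100):
    n_p, s = dose_metrics_for_spheres(1.0, d, 10.5)
    print(f"{d:3d} nm: {n_p:.2e} particles, {s:.2e} cm^2 surface area")
```

    At a fixed 1 µg mass dose, the 20 nm particles carry 125 times the particle number and 5 times the surface area of the 100 nm particles, which is why the choice of metric matters.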

  1. Metrics for aggregating the climate effects of different emissions: a unifying framework

    NARCIS (Netherlands)

    Tol, R.S.J.; Berntsen, T.K.; O'Neill, B.C.; Fuglestvedt, J.S.; Shine, K.P.

    2012-01-01

    Multi-gas approaches to climate change policies require a metric establishing equivalences among emissions of various species. Climate scientists and economists have proposed four kinds of such metrics and debated their relative merits. We present a unifying framework that clarifies the

  2. Quantifying Risk Over the Life Course – Latency, Age-Related Susceptibility, and Other Time-Varying Exposure Metrics

    Science.gov (United States)

    Wang, Molin; Liao, Xiaomei; Laden, Francine; Spiegelman, Donna

    2016-01-01

    Identification of the latency period and age-related susceptibility, if any, is an important aspect of assessing risks of environmental, nutritional and occupational exposures. We consider estimation and inference for latency and age-related susceptibility in relative risk and excess risk models. We focus on likelihood-based methods for point and interval estimation of the latency period and age-related windows of susceptibility coupled with several commonly considered exposure metrics. The method is illustrated in a study of the timing of the effects of constituents of air pollution on mortality in the Nurses’ Health Study. PMID:26750582

  3. MUTZ-3 derived Langerhans cells in human skin equivalents show differential migration and phenotypic plasticity after allergen or irritant exposure

    Energy Technology Data Exchange (ETDEWEB)

    Kosten, Ilona J.; Spiekstra, Sander W. [Department of Dermatology, VU University Medical Center, Amsterdam (Netherlands); Gruijl, Tanja D. de [Department of Dermatology Medical Oncology, VU University Medical Center, Amsterdam (Netherlands); Gibbs, Susan, E-mail: s.gibbs@acta.nl [Department of Dermatology, VU University Medical Center, Amsterdam (Netherlands); Department of Oral Cell Biology, Academic Center for Dentistry (ACTA), Amsterdam (Netherlands)

    2015-08-15

    After allergen or irritant exposure, Langerhans cells (LC) undergo phenotypic changes and exit the epidermis. In this study we describe the unique ability of MUTZ-3 derived Langerhans cells (MUTZ-LC) to display similar phenotypic plasticity as their primary counterparts when incorporated into a physiologically relevant full-thickness skin equivalent model (SE-LC). We describe differences and similarities in the mechanisms regulating LC migration and plasticity upon allergen or irritant exposure. The skin equivalent consisted of a reconstructed epidermis containing primary differentiated keratinocytes and CD1a+ MUTZ-LC on a primary fibroblast-populated dermis. Skin equivalents were exposed to a panel of allergens and irritants. Topical exposure to sub-toxic concentrations of allergens (nickel sulfate, resorcinol, cinnamaldehyde) and irritants (Triton X-100, SDS, Tween 80) resulted in LC migration out of the epidermis and into the dermis. Neutralizing antibody to CXCL12 blocked allergen-induced migration, whereas anti-CCL5 blocked irritant-induced migration. In contrast to allergen exposure, irritant exposure resulted in cells within the dermis becoming CD1a−/CD14+/CD68+, which is characteristic of a phenotypic switch of MUTZ-LC to a macrophage-like cell in the dermis. This phenotypic switch was blocked with anti-IL-10. Mechanisms previously identified as being involved in LC activation and migration in native human skin could thus be reproduced in the in vitro constructed skin equivalent model containing functional LC. This model therefore provides a unique and relevant research tool to study human LC biology in situ under controlled in vitro conditions, and will provide a powerful tool for hazard identification, testing novel therapeutics and identifying new drug targets. - Highlights: • MUTZ-3 derived Langerhans cells integrated into skin equivalents are fully functional. • Anti-CXCL12 blocks allergen-induced MUTZ-LC migration.

  4. Antipsychotic dose equivalents and dose-years: a standardized method for comparing exposure to different drugs.

    Science.gov (United States)

    Andreasen, Nancy C; Pressler, Marcus; Nopoulos, Peg; Miller, Del; Ho, Beng-Choon

    2010-02-01

    A standardized quantitative method for comparing dosages of different drugs is a useful tool for designing clinical trials and for examining the effects of long-term medication side effects such as tardive dyskinesia. Such a method requires establishing dose equivalents. An expert consensus group has published charts of equivalent doses for various antipsychotic medications for first- and second-generation medications. These charts were used in this study. Regression was used to compare each drug in the experts' charts to chlorpromazine and haloperidol and to create formulas for each relationship. The formulas were solved for chlorpromazine 100 mg and haloperidol 2 mg to derive new chlorpromazine and haloperidol equivalents. The formulas were incorporated into our definition of dose-years such that 100 mg/day of chlorpromazine equivalent or 2 mg/day of haloperidol equivalent taken for 1 year is equal to one dose-year. All comparisons to chlorpromazine and haloperidol were highly linear with R² values greater than .9. A power transformation further improved linearity. By deriving a unique formula that converts doses to chlorpromazine or haloperidol equivalents, we can compare otherwise dissimilar drugs. These equivalents can be multiplied by the time an individual has been on a given dose to derive a cumulative value measured in dose-years in the form of (chlorpromazine equivalent in mg) × (time on dose measured in years). After each dose has been converted to dose-years, the results can be summed to provide a cumulative quantitative measure of lifetime exposure. Copyright 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
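
    The bookkeeping in this abstract reduces to: convert each daily dose to chlorpromazine equivalents, divide by 100 mg, multiply by the years on that dose, and sum. A sketch of that calculation; the per-drug conversion factors below are simplified linear placeholders (the study derives drug-specific regression formulas, with a power transformation, from the expert consensus charts).

```python
# Hypothetical linear chlorpromazine-equivalent factors (mg CPZ per mg drug).
# These are placeholders, except that 2 mg haloperidol = 100 mg chlorpromazine
# matches the dose-year definition in the abstract.
CPZ_FACTORS = {"chlorpromazine": 1.0, "haloperidol": 50.0, "risperidone": 50.0}

def dose_years(history):
    """history: list of (drug, daily_dose_mg, years_on_dose) tuples.

    One dose-year = 100 mg/day chlorpromazine equivalent taken for one year;
    per-period dose-years are summed into a cumulative lifetime exposure."""
    total = 0.0
    for drug, daily_dose_mg, years in history:
        cpz_equiv_mg = daily_dose_mg * CPZ_FACTORS[drug]
        total += (cpz_equiv_mg / 100.0) * years
    return total

# 2 mg/day haloperidol for 1 year is one dose-year by definition:
print(dose_years([("haloperidol", 2.0, 1.0)]))  # 1.0

# A cumulative history across two drugs (5.0 + 2.0 dose-years):
print(dose_years([("chlorpromazine", 200.0, 2.5),
                  ("risperidone", 4.0, 1.0)]))  # 7.0
```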

  5. Sun Protection Practices and Sun Exposure among Children with a Parental History of Melanoma

    Science.gov (United States)

    Glenn, Beth A.; Lin, Tiffany; Chang, L. Cindy; Okada, Ashley; Wong, Weng Kee; Glanz, Karen; Bastani, Roshan

    2014-01-01

    Background: First-degree relatives of melanoma survivors have a substantially higher lifetime risk for melanoma than individuals with no family history. Exposure to ultraviolet radiation is the primary modifiable risk factor for the disease. Reducing UV exposure through sun protection may be particularly important for children with a parental history of melanoma. Nonetheless, limited prior research has investigated sun protection practices and sun exposure among these children. Methods: The California Cancer Registry was used to identify melanoma survivors eligible to participate in a survey to assess their children's sun protection practices and sun exposure. The survey was administered by mail, telephone, or web to Latino and non-Latino white melanoma survivors with at least one child (0–17 years; N = 324). Results: Sun exposure was high and the rate of sunburn was equivalent to or higher than estimates from average-risk populations. Use of sun protection was suboptimal. Latino children were less likely to wear sunscreen and hats and more likely to wear sunglasses, although these differences disappeared in adjusted analyses. Increasing age of the child was associated with lower sun protection and higher risk for sunburn, whereas higher objective risk for melanoma predicted improved sun protection and a higher risk for sunburns. Perception of high barriers to sun protection was the strongest modifiable correlate of sun protection. Conclusions: Interventions to improve sun protection and reduce sun exposure and sunburns in high-risk children are needed. Impact: Intervening in high-risk populations may help reduce the burden of melanoma in the U.S. PMID:25587110

  6. Relaxed metrics and indistinguishability operators: the relationship

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.

    2017-07-01

    In 1982, the notion of indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation (Trillas, 1982). In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors, who introduced a few techniques to generate metrics from indistinguishability operators and vice versa (see, for instance, De Baets and Mesiar). In recent years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence (Matthews; Ma). The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)
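
    The duality the abstract refers to can be made concrete for the Łukasiewicz t-norm T_L(a, b) = max(a + b - 1, 0): E is a T_L-indistinguishability operator exactly when d = 1 - E is a pseudometric bounded by 1. A small numerical check of that correspondence on a toy example:

```python
import itertools

# Points on the half-line; E is built so that it is an indistinguishability
# operator for the Lukasiewicz t-norm T_L(a, b) = max(a + b - 1, 0).
points = [0.0, 0.2, 0.5, 0.9, 1.7]

def E(x, y):
    """Indistinguishability: 1 minus the metric |x - y| truncated at 1."""
    return 1.0 - min(abs(x - y), 1.0)

def d(x, y):
    """Dual bounded metric recovered from E."""
    return 1.0 - E(x, y)

# Reflexivity, symmetry, and T_L-transitivity of E are exactly the
# pseudometric axioms for d = 1 - E (T_L-transitivity <=> triangle inequality).
for x, y, z in itertools.product(points, repeat=3):
    assert E(x, x) == 1.0
    assert abs(E(x, y) - E(y, x)) < 1e-12
    assert E(x, z) >= max(E(x, y) + E(y, z) - 1.0, 0.0) - 1e-12
    assert d(x, z) <= d(x, y) + d(y, z) + 1e-12
print("duality verified on the sample")
```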

  7. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student models.

  8. Equivalence of the AdS-metric and the QCD running coupling

    Science.gov (United States)

    Pirner, H. J.; Galow, B.

    2009-08-01

    We use the functional form of the QCD running coupling to modify the conformal metric in AdS/CFT, mapping the fifth-dimensional z-coordinate to the energy scale in four-dimensional QCD. The resulting type-0 string theory in five dimensions is solved with the Nambu-Goto action, giving good agreement with the Coulombic and confining quark-antiquark potential.

  9. Determination of the dose equivalent Hp(0.07) in hands of occupationally exposed personnel in the practice of positron emission tomography (PET/CT)

    International Nuclear Information System (INIS)

    Lea, D.; Ruiz, N.; Esteves, L.

    2006-01-01

    In Venezuela, the positron emission tomography (PET) technique was recently introduced, with the prospect of implementing it at the national level. Even though nuclear medicine has been practiced in our country since the early 1970s, there is no experience in determining occupational doses to the hands from external radiation exposure. For this reason, concern exists among the workers of the nuclear medicine centers where the PET technique is practiced. In the absence of TLD dosimetry for measuring hand doses in our country, measurements of the dose equivalent were made for the workers of the national PET reference center, using a diode-type hand detector. The dose to the hands was determined in terms of the dose equivalent Hp(0.07) for two work positions: the first corresponds to the transfer of the receiving vial of (18F)FDG to the shield, quality control, and unit-dose division; the second corresponds to the person in charge of administering the (18F)FDG intravenously. This work reports the dose equivalent to the hands, Hp(0.07), measured per daily production in each of the work positions described above. The reported doses correspond to a total average produced activity of 20.4 GBq (550 mCi). The results in terms of dose equivalent to the hands Hp(0.07) are 2.1 mSv ± 20% for the division work position and 0.4 mSv ± 10% for the injection of the radioactive material. In the short term, up to 4 productions per week are foreseen, which means an annual dose equivalent Hp(0.07) to the hands of approximately 400 mSv, without taking into account abnormal situations such as spills of (18F)FDG in the workplace. This work is the starting point for the regulatory authority to establish dose restrictions for PET practices in Venezuela and for the nuclear medicine centers to implement an optimization policy for this practice in conformity with ALARA.
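
    The approximate annual figure quoted above follows from simple scaling: about 2.5 mSv of Hp(0.07) to the hands per production (2.1 mSv for dispensing plus 0.4 mSv for injection, if a single worker performed both tasks) at four productions per week. A back-of-the-envelope check, where the number of working weeks per year is an assumption not stated in the abstract:

```python
# Measured Hp(0.07) to the hands per production (values from the abstract):
DISPENSING_MSV = 2.1  # vial transfer to shield, quality control, unit-dose division
INJECTION_MSV = 0.4   # intravenous administration of the (18F)FDG

PRODUCTIONS_PER_WEEK = 4     # short-term projection stated in the abstract
WORKING_WEEKS_PER_YEAR = 40  # assumption; the abstract does not state this figure

per_production_msv = DISPENSING_MSV + INJECTION_MSV  # if one worker does both tasks
annual_msv = per_production_msv * PRODUCTIONS_PER_WEEK * WORKING_WEEKS_PER_YEAR
print(f"~{annual_msv:.0f} mSv/year Hp(0.07) to the hands")  # ~400 mSv/year
```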

  10. Attenuation-based size metric for estimating organ dose to patients undergoing tube current modulated CT exams

    Energy Technology Data Exchange (ETDEWEB)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Lu, Peiyun; Kim, Hyun J.; Cagnon, Chris H.; McNitt-Gray, Michael F. [Departments of Biomedical Physics and Radiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States); DeMarco, John J. [Department of Radiation Oncology, University of California, Los Angeles, Los Angeles, California 90095 (United States)

    2015-02-15

    Purpose: Task Group 204 introduced effective diameter (ED) as the patient size metric used to correlate size-specific dose estimates. However, this size metric fails to account for patient attenuation properties and has been suggested to be replaced by an attenuation-based size metric, water equivalent diameter (D_W). The purpose of this study is to investigate different size metrics, effective diameter and water equivalent diameter, in combination with regional descriptions of scanner output, to establish the most appropriate size metric to be used as a predictor for organ dose in tube current modulated CT exams. Methods: 101 thoracic and 82 abdomen/pelvis scans from clinically indicated CT exams were collected retrospectively from a multidetector row CT (Sensation 64, Siemens Healthcare) with Institutional Review Board approval to generate voxelized patient models. Fully irradiated organs (lung and breasts in thoracic scans and liver, kidneys, and spleen in abdominal scans) were segmented and used as tally regions in Monte Carlo simulations for reporting organ dose. Along with image data, raw projection data were collected to obtain tube current information for simulating tube current modulation scans using Monte Carlo methods. Additionally, previously described patient size metrics [ED, D_W, and approximated water equivalent diameter (D_Wa)] were calculated for each patient and reported in three different ways: a single value averaged over the entire scan, a single value averaged over the region of interest, and a single value from a location in the middle of the scan volume. Organ doses were normalized by an appropriate mAs-weighted CTDI_vol to reflect regional variation of tube current. Linear regression analysis was used to evaluate the correlations between normalized organ doses and each size metric.
Results: For the abdominal organs, the correlations between normalized organ dose and size metric were overall slightly higher for all three size metrics.
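
    The water equivalent diameter D_W discussed above has a standard attenuation-based definition (AAPM Report 220): the segmented patient area is scaled by the mean CT number to an equivalent area of water, A_W = (mean HU/1000 + 1) · A, and D_W = 2·sqrt(A_W/π). A minimal sketch on a synthetic slice; the -700 HU body-segmentation threshold is an illustrative choice:

```python
import numpy as np

def water_equivalent_diameter(slice_hu, pixel_area_mm2):
    """Compute D_W from one axial CT slice (AAPM Report 220 formulation).

    A_W = (mean_HU / 1000 + 1) * A_patient, and D_W = 2 * sqrt(A_W / pi).
    Air is excluded by a simple -700 HU threshold (illustrative segmentation).
    """
    body = slice_hu > -700.0
    mean_hu = slice_hu[body].mean()
    area_mm2 = body.sum() * pixel_area_mm2
    a_w = (mean_hu / 1000.0 + 1.0) * area_mm2
    return 2.0 * np.sqrt(a_w / np.pi)

# Synthetic slice: a 150 mm radius water cylinder (0 HU) in air (-1000 HU),
# on a 512 x 512 grid of 1 mm x 1 mm pixels.
yy, xx = np.mgrid[-256:256, -256:256]
slice_hu = np.where(xx**2 + yy**2 <= 150**2, 0.0, -1000.0)
dw = water_equivalent_diameter(slice_hu, pixel_area_mm2=1.0)
print(f"D_W = {dw:.1f} mm")  # ~300 mm: the water cylinder's own diameter
```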

  11. Using the Aerasense NanoTracer for simultaneously obtaining several ultrafine particle exposure metrics

    International Nuclear Information System (INIS)

    Marra, J

    2011-01-01

    The expanding production and use of nanomaterials increases the chance of human exposure to engineered nanoparticles (NP), also referred to as ultrafine particles (UFP; ≲ 100–300 nm). This is particularly true in workplaces, where they can become airborne and thereafter be inhaled by workers during nanopowder processing. Considering the suspected hazard of many engineered UFPs, the general recommendation is to take measures for minimizing personal exposure while monitoring the UFP pollution for assessment and control purposes. The portable Aerasense NanoTracer accomplishes this UFP monitoring, either intermittently or in real time. This paper reviews its design and operational characteristics and elaborates on a number of application extensions and constraints. The NanoTracer's output signals enable several UFP exposure metrics to be inferred simultaneously. These include the airborne UFP number concentration and the number-averaged particle size, serving as characteristics of the pertaining UFP pollution. When non-hygroscopic particles are involved, the NanoTracer's output signals also allow an estimation of the lung-deposited UFP surface area concentration and the lung-deposited UFP mass concentration. It is thereby possible to distinguish between UFP depositions in the alveolar region, the tracheobronchial region and the head airway region, respectively, by making use of the ICRP particle deposition model.

  12. The metric-affine gravitational theory as the gauge theory of the affine group

    International Nuclear Information System (INIS)

    Lord, E.A.

    1978-01-01

    The metric-affine gravitational theory is shown to be the gauge theory of the affine group, or equivalently, the gauge theory of the group GL(4,R) of tetrad deformations in a space-time with a locally Minkowskian metric. The identities of the metric-affine theory, and the relationship between them and those of general relativity and Sciama-Kibble theory, are derived. (Auth.)

  13. An investigation of automatic exposure control calibration for chest imaging with a computed radiography system

    International Nuclear Information System (INIS)

    Moore, C S; Wood, T J; Beavis, A W; Saunderson, J R; Avery, G; Balcam, S; Needler, L

    2014-01-01

    The purpose of this study was to examine the use of three physical image quality metrics in the calibration of an automatic exposure control (AEC) device for chest radiography with a computed radiography (CR) imaging system. The metrics assessed were signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and mean effective noise equivalent quanta (eNEQm), all measured using a uniform chest phantom. Subsequent calibration curves were derived to ensure each metric was held constant across the tube voltage range. Each curve was assessed for its clinical appropriateness by generating computer simulated chest images with correct detector air kermas for each tube voltage, and grading these against reference images which were reconstructed at detector air kermas correct for the constant detector dose indicator (DDI) curve currently programmed into the AEC device. All simulated chest images contained clinically realistic projected anatomy and anatomical noise and were scored by experienced image evaluators. Constant DDI and CNR curves do not appear to provide optimized performance across the diagnostic energy range. Conversely, constant eNEQm and SNR do appear to provide optimized performance, with the latter being the preferred calibration metric as it is easier to measure in practice. Medical physicists may use the SNR image quality metric described here when setting up and optimizing AEC devices for chest radiography CR systems with a degree of confidence that resulting clinical image quality will be adequate for the required clinical task. However, this must be done with close cooperation of expert image evaluators, to ensure appropriate levels of detector air kerma. (paper)
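
The two simpler metrics assessed in this study can be estimated from uniform-phantom regions of interest as below. The ROIs here are synthetic Gaussian-noise patches with invented mean signals and noise level, standing in for measured detector data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated detector ROIs from a uniform phantom: mean signal plus additive noise.
background = rng.normal(loc=1000.0, scale=20.0, size=(64, 64))
insert_roi = rng.normal(loc=1100.0, scale=20.0, size=(64, 64))

def snr(roi):
    """Signal-to-noise ratio of a uniform region."""
    return roi.mean() / roi.std(ddof=1)

def cnr(roi_a, roi_b):
    """Contrast-to-noise ratio between two regions, using background noise."""
    return (roi_a.mean() - roi_b.mean()) / roi_b.std(ddof=1)

print(round(float(snr(background)), 1))            # ~50 (1000/20)
print(round(float(cnr(insert_roi, background)), 1))  # ~5 (100/20)
```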

  15. Assessment of Industrial Exposure to Magnetic Fields (invited paper)

    International Nuclear Information System (INIS)

    Chadwick, P.

    1999-01-01

    Magnetic field strengths produced by industrial processes can be very large, but they often exhibit a marked spatial variation. Whilst there may be the potential for exposures of workers to be high, actual exposure will be determined to a great extent by working practices. Possible metrics for epidemiological studies might be based on the temporal variability of exposure as well as maximum operator exposure or time-weighted average exposure and, whilst it might be possible to estimate these quantities from spot magnetic field strength measurements and observed working practices, this might be very difficult to achieve in practice. An alternative would be the use of a logging dosemeter: this paper describes some of the results of exposure assessments carried out in industrial environments with a modified EMDEX II magnetic field dosemeter. Magnetic fields in industrial environments often have waveforms which are not purely sinusoidal. Distortion can be introduced by the magnetic saturation of transformer and motor cores, by rectification, by poor matching between oscillator circuits and loads and when thyristors are used to control power. The resulting repetitive but non-sinusoidal magnetic field waveforms can be recorded and analysed; the spectral data may be incorporated into possible exposure metrics. It is also important to ensure that measurement instrumentation is responding appropriately in a non-sinusoidal field and this can only be done if the spectral content of the field is characterised fully. Some non-sinusoidal magnetic field waveforms cannot be expressed as a harmonic series. Specialist instrumentation and techniques are needed to assess exposure to such fields. Examples of approaches to the assessment of exposure to repetitive and non-repetitive magnetic fields are also discussed. (author)
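
The harmonic analysis described above for repetitive non-sinusoidal waveforms can be sketched with an FFT. The 50 Hz fundamental and the third- and fifth-harmonic amplitudes below are invented to mimic saturation/rectification distortion; the total harmonic distortion (THD) is one simple spectral summary that could feed an exposure metric.

```python
import numpy as np

fs = 10_000.0            # sample rate, Hz (assumed)
t = np.arange(0, 0.2, 1 / fs)
f0 = 50.0                # fundamental, Hz

# Synthetic distorted field: fundamental plus 3rd and 5th harmonics.
b = (100.0 * np.sin(2 * np.pi * f0 * t)
     + 30.0 * np.sin(2 * np.pi * 3 * f0 * t)
     + 10.0 * np.sin(2 * np.pi * 5 * f0 * t))

# One-sided amplitude spectrum; exact here because 0.2 s holds whole cycles.
spectrum = np.abs(np.fft.rfft(b)) * 2 / len(b)
freqs = np.fft.rfftfreq(len(b), 1 / fs)

def harmonic_amplitude(n):
    """Amplitude at the bin nearest the n-th harmonic of f0."""
    return spectrum[np.argmin(np.abs(freqs - n * f0))]

fund = harmonic_amplitude(1)
thd = np.sqrt(sum(harmonic_amplitude(n) ** 2 for n in range(2, 10))) / fund
print(round(float(fund), 1), round(float(thd), 3))
```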

  16. Determination of equivalent breast phantoms for different age groups of Taiwanese women: An experimental approach

    International Nuclear Information System (INIS)

    Dong, Shang-Lung; Chu, Tieh-Chi; Lin, Yung-Chien; Lan, Gong-Yau; Yeh, Yu-Hsiu; Chen, Sharon; Chuang, Keh-Shih

    2011-01-01

    Purpose: Polymethylmethacrylate (PMMA) slabs are among the most widely used phantoms for studying breast dosimetry in mammography. The purpose of this study was to evaluate the equivalence between exposure factors acquired from PMMA slabs and from patient cases in different age groups of Taiwanese women in mammography. Methods: This study included 3910 craniocaudal screen/film mammograms of Taiwanese women acquired on one mammographic unit. The tube loading, compressed breast thickness (CBT), compression force, tube voltage, and target/filter combination for each mammogram were collected for all patients. The glandularity and the equivalent thickness of PMMA were determined for each breast using the exposure factors of the breast in combination with experimental measurements from breast-tissue-equivalent attenuation slabs. Equivalent thicknesses of PMMA to the breasts of Taiwanese women were then estimated. Results: The average (± standard deviation) CBT and breast glandularity in this study were 4.2 ± 1.0 cm and 54% ± 23%, respectively. The average equivalent PMMA thickness was 4.0 ± 0.7 cm. PMMA slabs producing the same exposure factors as the breasts of Taiwanese women were determined for the age groups 30-49 yr and 50-69 yr. For the 4-cm PMMA slab, the CBT and glandularity values of the equivalent breast were 4.1 cm and 65%, respectively, for the age group 30-49 yr and 4.4 cm and 44%, respectively, for the age group 50-69 yr. Conclusions: The average thickness of PMMA slabs producing the same exposure factors as observed in a large group of Taiwanese women is less than that reported for American women. The results from this study can provide useful information for determining a suitable thickness of PMMA for mammographic dose surveys in Taiwan. The equivalence of PMMA slabs and the breasts of Taiwanese women is provided to allow average glandular dose assessment in clinical practice.

  18. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.
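
One temporal metric of the kind the paper argues for, daylight autonomy, reduces to an occupancy-weighted threshold count over an hourly illuminance series. The illuminance values, occupancy schedule and 300 lux threshold below are invented for illustration.

```python
def daylight_autonomy(illuminance_lux, occupied, threshold_lux=300.0):
    """Fraction of occupied hours in which daylight alone meets the
    illuminance threshold (a common climate-based daylight metric)."""
    hits = sum(1 for lux, occ in zip(illuminance_lux, occupied)
               if occ and lux >= threshold_lux)
    total = sum(1 for occ in occupied if occ)
    return hits / total if total else 0.0

# Hypothetical 12-hour illuminance series at one sensor point (lux),
# with occupancy during the last five hours.
lux = [0, 0, 0, 0, 0, 50, 150, 280, 420, 600, 750, 640]
occ = [False] * 7 + [True] * 5
da = daylight_autonomy(lux, occ)
print(round(da, 2))  # 0.8: four of the five occupied hours meet 300 lux
```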

  19. Light Water Reactor Sustainability Program Operator Performance Metrics for Control Room Modernization: A Practical Guide for Early Design Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Boring; Roger Lew; Thomas Ulrich; Jeffrey Joe

    2014-03-01

    As control rooms are modernized with new digital systems at nuclear power plants, it is necessary to evaluate the operator performance using these systems as part of a verification and validation process. There are no standard, predefined metrics available for assessing what is satisfactory operator interaction with new systems, especially during the early design stages of a new system. This report identifies the process and metrics for evaluating human system interfaces as part of control room modernization. The report includes background information on design and evaluation, a thorough discussion of human performance measures, and a practical example of how the process and metrics have been used as part of a turbine control system upgrade during the formative stages of design. The process and metrics are geared toward generalizability to other applications and serve as a template for utilities undertaking their own control room modernization activities.

  20. Justification of novel practices involving radiation exposure

    International Nuclear Information System (INIS)

    Webb, G.; Boal, T.; Mason, C.; Wrixon, T.

    2006-01-01

    The concept of 'justification' of practices has been one of the three basic principles of radiation protection for many decades. The principle is simple in essence - that any practice involving radiation exposure should do more good than harm. There is no doubt that the many uses of radiation in the medical field and in industry generally satisfy this principle, yielding benefits that could not be achieved using other techniques; examples include CT scanning and industrial radiography. However, even in the early period after the introduction of the justification principle, there were practices for which the decision on justification was not clear and for which different decisions were made by the authorities in different countries. Many of these involved consumer products such as luminous clocks and watches, telephone dials, smoke detectors, lightning preventers and gas mantles. In most cases, these practices were relatively small scale and did not involve large exposures of either individual workers or members of the public. Decisions on justification were therefore often made by the regulator without extensive national debate. Over recent years, several practices have been proposed and undertaken that involve exposure to radiation for purposes that were generally not envisaged when the current system of radiation protection was created. Some of these practices were reviewed during a recent symposium held in Dublin, Ireland and involve, for example, the x-raying of people for theft detection purposes, for detection of weapons or contraband, for the prediction of physical development of young athletes or dancers, for age determination, for insurance purposes and in cases of suspected child abuse. It is particularly in the context of such novel practices that the need has emerged for clearer international guidance on the application of the justification principle. This paper reviews recent activities of the IAEA with respect to these issues, including the

  2. Basis for calculating body equivalent doses after external radiation exposure. 3. rev. and enl. ed.; Berechnungsgrundlage fuer die Ermittlung von Koerper-Aequivalentdosen bei aeusserer Strahlenexposition

    Energy Technology Data Exchange (ETDEWEB)

    Sarenio, O. (comp.) [Geschaeftsstelle der Strahlenschutzkommission beim Bundesamt fuer Strahlenschutz, Bonn (Germany)

    2017-07-01

    The book on the basis for calculating body equivalent doses after external radiation exposure includes the following issues: introduction covering the scope of coverage and body equivalent doses for radiation protection, terminology, photon radiation, neutron radiation, electron radiation, mixed radiation fields and the estimation of body equivalent doses for skin surface contamination.

  3. What is correct: equivalent dose or dose equivalent

    International Nuclear Information System (INIS)

    Franic, Z.

    1994-01-01

    In the Croatian language, some physical quantities in radiation protection dosimetry have no precise names. Consequently, in practice either English terms or mathematical formulas are used. The situation is made worse by the fact that in Croatian only a limited number of textbooks, reference books and other papers are available. This paper compares the concept of 'dose equivalent' as outlined in International Commission on Radiological Protection (ICRP) recommendations No. 26 with the newer, conceptually different concept of 'equivalent dose' introduced in ICRP 60. Croatian terminology was found to be neither uniform nor precise. Under the influence of the Russian and Serbian languages, the term 'equivalent dose' was often used for 'dose equivalent', which was not justified even from the point of view of the ICRP 26 recommendations. Unfortunately, even now the legal quantity in Croatia is still 'dose equivalent' as defined in ICRP 26, but the term used for it is 'equivalent dose'. Therefore, the modified set of quantities introduced in ICRP 60 should be incorporated into Croatian legislation as soon as possible
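
The conceptual difference between the two quantities can be made concrete in a few lines. The absorbed doses below are invented; the w_R values used are the ICRP 60 radiation weighting factors for photons (1) and alpha particles (20).

```python
# ICRP 26 "dose equivalent": absorbed dose at a point times a quality factor Q.
# ICRP 60 "equivalent dose": mean absorbed dose in a tissue, weighted by
# radiation weighting factors w_R and summed over radiation types.

def dose_equivalent(absorbed_dose_gy, quality_factor):
    """ICRP 26: H = D * Q (point quantity, in Sv)."""
    return absorbed_dose_gy * quality_factor

def equivalent_dose(doses_by_radiation):
    """ICRP 60: H_T = sum_R w_R * D_{T,R} (tissue-averaged, in Sv).
    Takes (w_R, mean absorbed dose in Gy) pairs."""
    return sum(w_r * d for w_r, d in doses_by_radiation)

# Photons (w_R = 1) plus alpha particles (w_R = 20) in the same tissue:
h_t = equivalent_dose([(1.0, 0.002), (20.0, 0.0001)])
print(h_t)  # 0.004 Sv: the alpha component dominates despite the tiny dose
```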

  4. Collective effective dose equivalent, population doses and risk estimates from occupational exposures in Japan

    International Nuclear Information System (INIS)

    Maruyama, Takashi; Nishizawa, Kanae; Kumamoto, Yoshikazu; Iwai, Kazuo; Mase, Naomichi.

    1993-01-01

    Collective dose equivalent and population dose from occupational exposures in Japan in 1988 were estimated on the basis of a nationwide survey. The survey covered annual collective dose equivalents by sex, age group and type of radiation work for about 0.21 million workers, excluding workers in nuclear power stations. The data on workers in nuclear power stations were obtained from the official report of the Japan Nuclear Safety Commission. The total number of workers, including nuclear power stations, was estimated to be about 0.26 million. Radiation work was subdivided as follows: medical work including dental; non-atomic energy industry; research and education; atomic energy industry; and nuclear power stations. For the determination of effective dose equivalent and population dose, organ and tissue doses were measured in a phantom experiment. The resultant doses were compared with the doses previously calculated using a chord length technique and with data from ICRP publications. The annual collective effective dose equivalent was estimated to be about 21.94 person·Sv for medical workers, 7.73 person·Sv for industrial workers, 0.75 person·Sv for research and educational workers, 2.48 person·Sv for the atomic energy industry and 84.4 person·Sv for workers in nuclear power stations. The population doses were calculated to be about 1.07 Sv for genetically significant dose, 0.89 Sv for leukemia significant dose and 0.42 Sv for malignant significant dose. The population risks were estimated using these population doses. (author)
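
The aggregation step behind a survey like this is simply a sum of individual effective doses into person-sieverts per work category. The per-worker doses below are invented illustrative numbers, not survey data.

```python
# (category, annual individual effective dose in Sv) records, invented.
workers = [
    ("medical", 1.2e-3), ("medical", 0.8e-3), ("medical", 2.0e-3),
    ("industry", 0.5e-3), ("industry", 1.5e-3),
    ("nuclear_power", 3.0e-3),
]

def collective_dose(records):
    """Sum individual effective doses (Sv) into person-Sv per category."""
    totals = {}
    for category, dose_sv in records:
        totals[category] = totals.get(category, 0.0) + dose_sv
    return totals

totals = collective_dose(workers)
print(totals)
```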

  5. Physiologically based pharmacokinetic rat model for methyl tertiary-butyl ether; comparison of selected dose metrics following various MTBE exposure scenarios used for toxicity and carcinogenicity evaluation

    International Nuclear Information System (INIS)

    Borghoff, Susan J.; Parkinson, Horace; Leavens, Teresa L.

    2010-01-01

    There are a number of cancer and toxicity studies that have been carried out to assess the hazard from methyl tertiary-butyl ether (MTBE) exposure via inhalation and oral administration. MTBE has been detected in surface as well as ground water supplies, which emphasizes the need to assess the risk from exposure via drinking water contamination. Recently an updated rat physiologically based pharmacokinetic (PBPK) model was published that relied on a description of MTBE and its metabolite tertiary-butyl alcohol (TBA) binding to α2u-globulin, a male rat-specific protein. This model was used to predict concentrations of MTBE and TBA in the kidney, a target tissue in the male rat, and it can now be used not only to evaluate route-to-route extrapolation issues concerning MTBE exposures but also as a means of comparing potential dose metrics that may provide insight into differences in biological responses observed in rats following different routes of MTBE exposure. The objective of this study was to use this model to evaluate the dosimetry of MTBE and TBA in rats following the different exposure scenarios used to evaluate the toxicity and carcinogenicity of MTBE, and to compare various dose metrics under these different conditions. Model simulations suggested that although inhalation and drinking water exposures show a similar pattern of MTBE and TBA exposure in the blood and kidney (i.e., concentration-time profiles), the total blood and kidney levels following exposure to 7.5 mg/ml MTBE in the drinking water for 90 days are in the same range as those following an oral dose of 1000 mg/kg MTBE. Evaluation of the dose metrics also supports that a high oral bolus dose (i.e., 1000 mg/kg MTBE) results in a greater percentage of the dose exhaled as MTBE, with a lower percent metabolized to TBA, as compared to a dose of MTBE delivered over a longer period of time, as in the case of drinking water.
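
The bolus-versus-drinking-water contrast the simulations describe can be illustrated with a toy one-compartment model with first-order elimination. This is not the published PBPK model; the dose, elimination rate constant and time step are invented, and the point is only that the same administered amount per day yields a far higher peak amount when given as a single bolus.

```python
def simulate(dose_rate, k_elim=0.5, hours=48, dt=0.01, bolus=0.0):
    """Euler integration of dA/dt = input - k_elim * A.
    Returns (peak amount, final amount)."""
    a = bolus
    peak = a
    for _ in range(int(hours / dt)):
        a += (dose_rate - k_elim * a) * dt
        peak = max(peak, a)
    return peak, a

daily_dose = 100.0  # mg/day, assumed

# Single oral bolus versus the same amount delivered continuously over 24 h.
peak_bolus, _ = simulate(dose_rate=0.0, bolus=daily_dose)
peak_cont, _ = simulate(dose_rate=daily_dose / 24.0)

print(round(peak_bolus, 1), round(peak_cont, 2))
```

The continuous-delivery peak settles at the steady state dose_rate/k_elim (about 8.3 mg here), an order of magnitude below the bolus peak, the kind of dose-metric difference the abstract discusses.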

  6. The definition of the individual dose equivalent

    International Nuclear Information System (INIS)

    Ehrlich, Margarete

    1986-01-01

    A brief note examines the choice of the present definition of the individual dose equivalent, the new operational dosimetry quantity for external exposure. The consequences of the use of the individual dose equivalent, and the dangers facing it as currently defined, are briefly discussed. (UK)

  7. DEEP code to calculate dose equivalents in human phantom for external photon exposure by Monte Carlo method

    International Nuclear Information System (INIS)

    Yamaguchi, Yasuhiro

    1991-01-01

    The present report describes a computer code, DEEP, which calculates the organ dose equivalents and the effective dose equivalent for external photon exposure by the Monte Carlo method. MORSE-CG, a Monte Carlo radiation transport code, is incorporated into the DEEP code to simulate photon transport phenomena in and around a human body. The code treats an anthropomorphic phantom represented by mathematical formulae, and the user can choose the phantom sex: male, female or unisex. Personal dosimeters can be worn on the phantom, and the user can specify their location and dimensions. This document includes instructions and a sample problem for the code as well as a general description of the dose calculation, the human phantom and the computer code. (author)
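
A minimal sketch of the kind of sampling a Monte Carlo photon transport code builds on: draw exponential free path lengths and count photons crossing a slab without interacting. This is absorption-only with no scattering, so it reproduces the analytic exp(-μx); the attenuation coefficient is an assumed round value near that of water for ~500 keV photons.

```python
import math
import random

def transmitted_fraction(mu_per_cm, thickness_cm, n=200_000, seed=1):
    """Monte Carlo estimate of uncollided transmission through a slab."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n):
        # Free path length sampled from p(s) = mu * exp(-mu * s)
        s = -math.log(1.0 - rng.random()) / mu_per_cm
        if s > thickness_cm:
            passed += 1
    return passed / n

mu = 0.096   # assumed linear attenuation coefficient, cm^-1
frac = transmitted_fraction(mu, 10.0)
print(round(frac, 3), round(math.exp(-mu * 10.0), 3))  # MC vs analytic
```

A full code like DEEP additionally samples interaction types, scattering angles and energy deposition in each phantom organ; this fragment shows only the path-length sampling step.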

  8. 16 CFR 1511.8 - Metric references.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Metric references. 1511.8 Section 1511.8 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FEDERAL HAZARDOUS SUBSTANCES ACT REGULATIONS... parentheses for convenience and information only. ...

  9. A metric model of lambda calculus with guarded recursion

    DEFF Research Database (Denmark)

    Birkedal, Lars; Schwinghammer, Jan; Støvring, Kristian

    2010-01-01

    We give a model for Nakano’s typed lambda calculus with guarded recursive definitions in a category of metric spaces. By proving a computational adequacy result that relates the interpretation with the operational semantics, we show that the model can be used to reason about contextual equivalence....

  10. Estimation of collective effective dose equivalent from environmental radiation and radioactive materials in Japan. A preliminary study

    International Nuclear Information System (INIS)

    Maruyama, Takashi; Noda, Yutaka; Takeshita, Mitsue; Iwai, Kazuo.

    1994-01-01

    The peaceful uses of nuclear power and radiation have developed to a stage of practical application in human life. Radiation causes harmful effects to human beings, although human beings receive a number of invaluable benefits from nuclear energy and the uses of radiation. In order to examine the optimization of radiation protection in these practices, the collective effective dose equivalent from environmental exposures due to natural and artificial radiation has been preliminarily evaluated using the most recent data. The resultant collective doses were compared with those from medical and occupational exposures. It is noted that, in Japan, the collective effective dose from environmental radiation sources can be approximately the same as that from medical exposure. (author)

  11. 21 CFR 26.9 - Equivalence determination.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Equivalence determination. 26.9 Section 26.9 Food... Specific Sector Provisions for Pharmaceutical Good Manufacturing Practices § 26.9 Equivalence determination... document insufficient evidence of equivalence, lack of opportunity to assess equivalence or a determination...

  12. Some Remarks on Space-Time Decompositions, and Degenerate Metrics, in General Relativity

    Science.gov (United States)

    Bengtsson, Ingemar

    Space-time decomposition of the Hilbert-Palatini action, written in a form which admits degenerate metrics, is considered. Simple numerology shows why D = 3 and 4 are singled out as admitting a simple phase space. The canonical structure of the degenerate sector turns out to be awkward. However, the real degenerate metrics obtained as solutions are the same as those that occur in Ashtekar's formulation of complex general relativity. An exact solution of Ashtekar's equations, with degenerate metric, shows that the manifestly four-dimensional form of the action, and its 3 + 1 form, are not quite equivalent.

  13. Group covariance and metrical theory

    International Nuclear Information System (INIS)

    Halpern, L.

    1983-01-01

    The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidian group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references

  14. The study of practices in planned diagnostic medical exposure

    International Nuclear Information System (INIS)

    Popescu, Irina-Anca; Perju, Nicoleta Ana-Maria; Cobzeanu, Camelia

    2011-01-01

    The exposure of the population to ionizing radiation for medical diagnostic purposes represents a planned, medically justified exposure procedure with a direct impact on the patient's health. A justified exposure, with a result that can confirm a clinical diagnosis, informs further important steps in treatment decisions. Optimization of patients' radiological protection results from observing the reference-level recommendations, which maintain a reasonable individual exposure to ionizing radiation for medical purposes. In this paper we investigated the justification of 4189 exposures of patients who underwent planned diagnostic medical investigation over 36 months in a radiological unit. The most frequent investigation concerned the spinal column, in 38.3% of total exposures, mainly at the lumbar level (63.0% and 24.1%, respectively, of the total number of exposures), followed by limb bones (20.6%) and thorax (26.9%). Justification of practices included: rheumatic pains in 45.8% of exposures, followed by traumatic injuries (20.6%), pleural and pulmonary pathology (19.3%), malignant processes (12.3%), ear-nose-throat investigations (1.1%) and car accidents (0.9%). Females over 40 years old were the group with the highest number of medical exposures, with 54.5% of total practices. This study revealed that the number of justified medical exposures is almost equal to the number of non-justified examinations, indicating a rather poor correlation between the clinical diagnosis and the required radiological investigation. The percentages of justified versus non-justified practices indicated by specialist physicians and general practitioners were similar: 59.3% vs. 40.7% and 56.9% vs. 43.1%, respectively. The analysis of the data concluded that both specialist and general physicians must evaluate patients and all clinical signs more rigorously in order to reduce, as far as reasonably possible, non-justified medical exposures to ionizing radiation, and thus to avoid financial and

  15. Effective dose equivalent

    International Nuclear Information System (INIS)

    Huyskens, C.J.; Passchier, W.F.

    1988-01-01

    The effective dose equivalent is a quantity used in the daily practice of radiation protection, as well as in radiation hygiene rules, as a measure of health risk. This contribution works out the assumptions upon which this quantity is based and the cases in which the effective dose equivalent can be used more or less well. (H.W.)

  16. Specific requirements for public exposure in medical practice

    International Nuclear Information System (INIS)

    Fernandez Gomez, Isis Maria

    2012-01-01

    Public exposure to radiation sources excludes all medical and occupational exposures and exposure to the normal natural background radiation in the area. The main sources of public exposure identified are: practices; discharges or spills; contaminated food or merchandise; chronic exposure scenarios (radon, NORM); and waste management (predisposal management, storage, disposal). Public exposure can occur in two forms. The first is by procedure: transport, storage and handling of sources, radioactive waste, and radioactive patients. The second is per incident: transportation accidents, loss of sources, spread of contamination, and uncontrolled contamination. (author)

  17. Defining quality metrics and improving safety and outcome in allergy care.

    Science.gov (United States)

    Lee, Stella; Stachler, Robert J; Ferguson, Berrylin J

    2014-04-01

    The delivery of allergy immunotherapy in the otolaryngology office is variable and lacks standardization. Quality metrics encompass the measurement of factors associated with good patient-centered care. These factors have yet to be defined in the delivery of allergy immunotherapy. We developed and applied quality metrics to 6 allergy practices affiliated with an academic otolaryngic allergy center. This work was conducted at a tertiary academic center providing care to over 1500 patients. We evaluated methods and variability between the 6 sites. Tracking of errors and anaphylaxis was initiated across all sites. A nationwide survey of academic and private allergists was used to collect data on current practice and use of quality metrics. The most common types of errors recorded were patient identification errors (n = 4), followed by vial mixing errors (n = 3) and dosing errors (n = 2). There were 7 episodes of anaphylaxis, of which 2 were secondary to dosing errors, for a rate of 0.01% or 1 in every 10,000 injection visits/year. Site visits showed that 86% of key safety measures were followed. Analysis of nationwide survey responses revealed that quality metrics are still not well defined by either medical or otolaryngic allergy practices. Academic practices were statistically more likely to use quality metrics (p = 0.021) and to perform systems reviews and audits than private practices (p = 0.005). Quality metrics in allergy delivery can help improve safety and quality of care. These metrics need to be further defined by otolaryngic allergists in the changing health care environment. © 2014 ARS-AAOA, LLC.

  18. Annual dose equivalents estimation received by Cienfuegos population due medical practice

    International Nuclear Information System (INIS)

    Usagaua R, Z.; Santander I, E.

    1996-01-01

    This study represents the first evaluation of the effective dose equivalent received by the population of the Cienfuegos province in Cuba as a result of medical practice. The evaluation is based on tables of doses as functions of the several parameters that influence them, and on statistics of a large number of diagnostic examinations from all medical institutions over a 9-year period. Values of examination frequency, contributions to total dose from radiography, fluoroscopy, dental radiography and nuclear medicine, and other characteristics of the latter are given. A comparative discussion of the doses received from radiography and fluoroscopy techniques is also included. (authors). 4 refs

  19. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional. Th...

  20. Teacher ratings of ODD symptoms: measurement equivalence across Malaysian Malay, Chinese and Indian children.

    Science.gov (United States)

    Gomez, Rapson

    2014-04-01

    The study examined the measurement equivalence of teacher ratings of ODD symptoms across Malaysian Malay, Chinese and Indian children. Malaysian teachers completed ratings of the ODD symptoms for 574 Malay, 247 Chinese and 98 Indian children. The results supported equivalence for the configural, metric, and error variance models, as well as equivalence of the ODD latent variances and mean scores. Together, these findings provide good support for the measurement and structural equivalence of the ODD symptoms across these ethnic groups. The theoretical and clinical implications of the findings for cross-cultural equivalence of the ODD symptoms are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. New recommendations for dose equivalent

    International Nuclear Information System (INIS)

    Bengtsson, G.

    1985-01-01

    In its report 39, the International Commission on Radiation Units and Measurements (ICRU), has defined four new quantities for the determination of dose equivalents from external sources: the ambient dose equivalent, the directional dose equivalent, the individual dose equivalent, penetrating and the individual dose equivalent, superficial. The rationale behind these concepts and their practical application are discussed. Reference is made to numerical values of these quantities which will be the subject of a coming publication from the International Commission on Radiological Protection, ICRP. (Author)

  2. Significance and principles of the calculation of the effective dose equivalent for radiological protection of personnel and patients

    International Nuclear Information System (INIS)

    Drexler, G.; Williams, G.

    1985-01-01

    The application of the effective dose equivalent concept, H_E, for radiological protection assessments of occupationally exposed persons is justified by the practicability it achieves with regard to the limiting principles. Nevertheless, it would be more logical to retain the real physical dose equivalent as the basic limiting quantity for homogeneous whole-body exposure, and to use the H_E value, calculated by means of the effective dose equivalent concept, for inhomogeneous whole-body irradiation. The required concepts, models and calculations would then not be tied to a basic radiation protection quantity. Applying the effective dose equivalent to radiation protection assessments for patients is misleading and is not practical for assessing an individual or collective radiation risk of patients. The quantity of expected harm would be better suited to this purpose. There is no need to express the radiation risk by a dose quantity, which amounts to careless handling of good information. (orig./WU)

  3. Challenges and opportunities in establishing scientific and regulatory standards for determining therapeutic equivalence of modified-release products: Workshop summary report.

    Science.gov (United States)

    Chen, Mei-Ling; Shah, Vinod P; Ganes, Derek; Midha, Kamal K; Caro, James; Nambiar, Prabu; Rocci, Mario L; Thombre, Avinash G; Abrahamsson, Bertil; Conner, Dale; Davit, Barbara; Fackler, Paul; Farrell, Colm; Gupta, Suneel; Katz, Russell; Mehta, Mehul; Preskorn, Sheldon H; Sanderink, Gerard; Stavchansky, Salomon; Temple, Robert; Wang, Yaning; Winkle, Helen; Yu, Lawrence

    2010-09-01

    Modified-release (MR) products are complex dosage forms designed to release drug in a controlled manner to achieve the desired efficacy and safety profiles. Inappropriate control of drug release from such products may result in reduced efficacy or increased toxicity. This paper is a summary report of the American Association of Pharmaceutical Scientists, International Pharmaceutical Federation, and Product Quality Research Institute workshop titled "Challenges and Opportunities in Establishing Scientific and Regulatory Standards for Assuring Therapeutic Equivalence of Modified Release Products", held October 1-2, 2009, in Baltimore, Maryland. The workshop provided an opportunity for pharmaceutical scientists from academia, industry, and regulatory agencies to discuss current regulatory expectations and industry practices for evaluating the pharmaceutical equivalence and bioequivalence of oral MR products. In the case of conventional monophasic MR formulations, the current regulatory approaches and criteria for bioequivalence evaluation were considered adequate for the assessment of therapeutic equivalence and interchangeability of drug products. Additional measures may occasionally be needed to determine the bioequivalence of multiphasic MR products. The metric of partial AUC proposed by the US Food and Drug Administration received broad support as an additional measure for evaluating bioequivalence of multiphasic MR products designed to have a rapid onset of drug action followed by sustained response. The cutoff for partial AUCs may be based on the pharmacokinetic/pharmacodynamic or pharmacokinetic/response characteristics of the products under examination. If the new metric is highly variable, the bioequivalence limits may be set based on the known within-subject variability for the reference product.
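As an illustration of the partial-AUC idea discussed above, the sketch below computes a pAUC over a time window with the linear trapezoidal rule. The concentration-time profile and the cutoff times are hypothetical, and for simplicity the sketch assumes cutoffs fall on sampled time points (real analyses interpolate when they do not).

```python
def partial_auc(times, conc, t_start, t_end):
    """Partial AUC over [t_start, t_end] by the linear trapezoidal rule.

    Assumes `times` is sorted and that t_start/t_end coincide with
    sampled time points (no interpolation in this sketch).
    """
    auc = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times, conc), zip(times[1:], conc[1:])):
        if t0 >= t_start and t1 <= t_end:
            auc += (t1 - t0) * (c0 + c1) / 2.0
    return auc

# Hypothetical concentration-time profile (hours, ng/mL)
t = [0, 0.5, 1, 2, 4, 8, 12, 24]
c = [0, 10, 18, 22, 15, 8, 4, 1]

early = partial_auc(t, c, 0, 2)    # early-exposure window (onset)
late = partial_auc(t, c, 2, 24)    # sustained-exposure window
```

A pair of windows like this is how a rapid-onset/sustained-response product might be compared against a reference: bioequivalence would be assessed on each partial AUC separately rather than only on the total AUC.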

  4. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
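As a toy illustration of measuring partial coverage, the sketch below (the transition system and the coverage notion are hypothetical, not the metrics developed by the NASA Ames group) runs a bounded breadth-first exploration of a small state space and reports the fraction of program locations reached.

```python
from collections import deque

def explore(initial, successors, max_states):
    """Bounded explicit-state BFS; returns the set of visited states."""
    seen = {initial}
    queue = deque([initial])
    while queue and len(seen) < max_states:
        state = queue.popleft()
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# States are (program_location, variable_value) pairs; 4 locations 0..3.
def succ(state):
    loc, x = state
    if loc < 3:
        return [(loc + 1, x), (loc + 1, x + 1)]
    return []

visited = explore((0, 0), succ, max_states=6)   # exploration cut short
locations = {loc for loc, _ in visited}
coverage = len(locations) / 4                   # location-coverage metric
```

Because the state budget is exhausted before location 3 is reached, the run reports 75% location coverage; a heuristic search could then be steered toward the uncovered location.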

  5. Dose equivalent distributions in the AAEC total body nitrogen facility

    International Nuclear Information System (INIS)

    Allen, B.J.; Bailey, G.M.; McGregor, B.J.

    1985-01-01

    The incident neutron dose equivalent in the AAEC total body nitrogen facility is measured by a calibrated remmeter. Dose equivalent rates and distributions are calculated by Monte Carlo techniques which take account of the secondary neutron flux from the collimator. Experiment and calculation are found to be in satisfactory agreement. The effective dose equivalent per exposure is determined by weighting organ doses, and the potential detriment per exposure is calculated from ICRP risk factors
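The weighting step described above follows the ICRP definition H_E = Σ_T w_T H_T. A minimal sketch, using the ICRP 26 tissue weighting factors in force at the time; the organ dose values are purely illustrative:

```python
# ICRP 26 tissue weighting factors w_T (they sum to 1.0).
W_T = {
    "gonads": 0.25, "breast": 0.15, "red_marrow": 0.12,
    "lung": 0.12, "thyroid": 0.03, "bone_surface": 0.03,
    "remainder": 0.30,
}

def effective_dose_equivalent(organ_doses_msv):
    """H_E = sum over tissues of w_T * H_T (doses in mSv)."""
    return sum(W_T[tissue] * h for tissue, h in organ_doses_msv.items())

# Hypothetical organ dose equivalents per exposure, mSv
doses = {"gonads": 0.02, "breast": 0.05, "red_marrow": 0.10,
         "lung": 0.12, "thyroid": 0.01, "bone_surface": 0.08,
         "remainder": 0.06}

h_e = effective_dose_equivalent(doses)
```

Multiplying H_E by an overall risk factor then gives the kind of potential detriment per exposure reported in the abstract.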

  6. f(R) gravity, torsion and non-metricity

    International Nuclear Information System (INIS)

    Sotiriou, Thomas P

    2009-01-01

    For f(R) theories of gravity with an independent symmetric connection (no torsion), usually referred to as Palatini f(R) gravity theories, and for f(R) theories of gravity with torsion but no non-metricity, called U4 theories, it has been shown that the independent connection can actually be eliminated algebraically, as long as this connection does not couple to matter. Remarkably, the outcome in both cases is the same theory, which is dynamically equivalent to an ω_0 = -3/2 Brans-Dicke theory. It is shown here that even in the most general case of an independent connection with both non-metricity and torsion, one arrives at exactly the same theory as in the more restricted cases. This generalizes the previous results and explains why assuming that either the torsion or the non-metricity vanishes ultimately leads to the same theory. It also demonstrates that f(R) actions cannot support an independent connection which carries dynamical degrees of freedom, irrespective of how general this connection is, at least as long as there is no connection-matter coupling. (fast track communication)

  7. Evaluating use stage exposure to food contact materials in a LCA framework

    DEFF Research Database (Denmark)

    Ernstoff, Alexi; Jolliet, Olivier; Fantke, Peter

    2015-01-01

    We present novel methods to incorporate exposure to chemicals within food contact materials (FCM) (e.g. packaging) into life cycle impact assessment (LCIA). Chemical migration into food is modeled as a function of contact temperature, time, and various chemical, FCM, and food properties. In order...... in a way compatible with intake fraction, iF, a metric traditionally used in LCIA. The model predicts PiF increases with temperature and for compounds with lower octanol-water partition coefficients within more permeable materials which are in contact with foods with high ethanol equivalencies (fatty foods)....

  8. Practical protective tools for occupational exposure: 1) double focus spectacles for the aged with highly refracted glass lens 2) remodeled barrier for radiation protection.

    Science.gov (United States)

    Kurokawa, S; Yabe, S; Takamura, A; Ishizaki, H; Aizawa, S

    2000-11-30

    Two practical protective tools against occupational exposure for neurointerventional radiologists are presented. The first purpose of this study was to investigate the effectiveness of double-focus spectacles for the aged with a highly refractive glass lens (special spectacles for the aged) for radiation protection of the crystalline lens of the eye, in comparison with other spectacles on the market, based on measurement of the film density obtained by exposing X-ray film through those spectacles. The film densitometry showed that the special spectacles for the aged were nearly as effective for radiation protection as a goggle-type shield made with a 0.07 mm lead-equivalent plastic lens. The second purpose of this study was to investigate the effectiveness of the protective barrier, which we remodeled for cerebral angiography and neuroendovascular therapy, against radiation exposure, based on measurements in a simulated study with a head phantom and on measurements of radiation exposure to operators during clinical procedures. In the experimental study, radiation exposure at the supposed position of the crystalline lens was reduced to about one third, and at the supposed position of the gonadal glands to about one seventh, compared to exposure without the barrier. Radiation exposure was monitored at the left breast of three radiologists in 215 cases of cerebral angiography. With the barrier employed in cerebral angiography, the average equivalent dose at the left breast measured 1.49 μSv during 10 min of fluoroscopy. In three kinds of neuroendovascular therapy in 40 cases, radiation exposure to the operator was monitored in the same fashion, and the dose recorded was less than the results reported in previous papers in which no protective barrier had been employed (1,2). As a result, the two above-mentioned protective tools are

  9. National Metrical Types in Nineteenth Century Art Song

    Directory of Open Access Journals (Sweden)

    Leigh VanHandel

    2010-01-01

    William Rothstein’s article “National metrical types in music of the eighteenth and early nineteenth centuries” (2008 proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein’s characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into changing metrical conceptions of 19th century theorists and composers, as well as to the metrical habits and compositional style of individual 19th century French and German art song composers.

  10. Measuring Information Security: Guidelines to Build Metrics

    Science.gov (United States)

    von Faber, Eberhard

    Measuring information security is a genuine interest of security managers. With metrics they can develop their security organization's visibility and standing within the enterprise or public authority as a whole. Organizations using information technology need to use security metrics. Despite the clear demands and advantages, security metrics are often poorly developed, or ineffective parameters are collected and analysed. This paper describes best practices for the development of security metrics. Attention is first drawn to motivation, showing both requirements and benefits. The main body of this paper lists things which need to be observed (characteristics of metrics), things which can be measured (how measurements can be conducted) and steps for the development and implementation of metrics (procedures and planning). Analysis and communication are also key when using security metrics. Examples are given in order to develop a better understanding. The author aims to resume, continue and develop the discussion of a topic which is, or increasingly will be, a critical success factor for any security manager in larger organizations.

  11. Knowledge, Attitude and Practices toward Post Exposure ...

    African Journals Online (AJOL)

    Knowledge, Attitude and Practices toward Post Exposure Prophylaxis for Human Immunodeficiency ... Annals of Medical and Health Sciences Research ... Data related to HIV PEP was collected by pre‑designed, pre‑tested, self‑administered ...

  12. Evaluation metrics for biostatistical and epidemiological collaborations.

    Science.gov (United States)

    Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave

    2011-10-15

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.

  13. Assessing the metrics of climate change. Current methods and future possibilities

    International Nuclear Information System (INIS)

    Fuglestveit, Jan S.; Berntsen, Terje K.; Godal, Odd; Sausen, Robert; Shine, Keith P.; Skodvin, Tora

    2001-01-01

    With the principle of comprehensiveness embedded in the UN Framework Convention on Climate Change (Art. 3), a multi-gas abatement strategy with emphasis also on non-CO2 greenhouse gases as targets for reduction and control measures has been adopted in the international climate regime. In the Kyoto Protocol, the comprehensive approach is made operative as the aggregate anthropogenic carbon dioxide equivalent emissions of six specified greenhouse gases or groups of gases (Art. 3). With this operationalisation, the emissions of a set of greenhouse gases with very different atmospheric lifetimes and radiative properties are transformed into one common unit - CO2 equivalents. This transformation is based on the Global Warming Potential (GWP) index, which in turn is based on the concept of radiative forcing. The GWP metric and its application in policy making has been debated, and several other alternative concepts have been suggested. In this paper, we review existing and alternative metrics of climate change, with particular emphasis on radiative forcing and GWPs, in terms of their scientific performance. This assessment focuses on questions such as the climate impact (end point) against which gases are weighted; the extent to which and how temporality is included, both with regard to emission control and with regard to climate impact; how cost issues are dealt with; and the sensitivity of the metrics to various assumptions. It is concluded that the radiative forcing concept is a robust and useful metric of the potential climatic impact of various agents and that there are prospects for improvement by weighing different forcings according to their effectiveness. We also find that although the GWP concept is associated with serious shortcomings, it retains advantages over any of the proposed alternatives in terms of political feasibility. 
Alternative metrics, however, make a significant contribution to addressing important issues, and this contribution should be taken
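The GWP-based aggregation described above is, operationally, a weighted sum over the gas inventory. A minimal sketch, using illustrative 100-year GWP values (the AR4 figures; the exact numbers differ between IPCC assessment reports):

```python
# 100-year Global Warming Potentials (illustrative AR4 values).
GWP100 = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2_equivalent(emissions_t):
    """emissions_t: tonnes emitted per gas -> total tonnes CO2-equivalent."""
    return sum(GWP100[gas] * tonnes for gas, tonnes in emissions_t.items())

# Hypothetical annual inventory, tonnes
inventory = {"CO2": 1000.0, "CH4": 10.0, "N2O": 1.0}
total = co2_equivalent(inventory)
```

The choice of the 100-year horizon, and of radiative forcing as the underlying impact measure, is exactly the kind of assumption the paper's assessment focuses on: a different horizon or end point yields different weights and hence a different ranking of abatement options.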


  15. Logically automorphically equivalent knowledge bases

    OpenAIRE

    Aladova, Elena; Plotkin, Tatjana

    2017-01-01

    Knowledge base theory provides an important example of a field where applications of universal algebra and algebraic logic look very natural, and their interaction with practical problems arising in computer science can be very productive. In this paper we study the equivalence problem for knowledge bases. Our interest is in finding out how informational equivalence is related to the logical description of knowledge. Studying various equivalences of knowledge bases allows us to compare d...

  16. SOILD: A computer model for calculating the effective dose equivalent from external exposure to distributed gamma sources in soil

    International Nuclear Information System (INIS)

    Chen, S.Y.; LePoire, D.; Yu, C.; Schafetz, S.; Mehta, P.

    1991-01-01

    The SOILD computer model was developed for calculating the effective dose equivalent from external exposure to distributed gamma sources in soil. It is designed to assess external doses under various exposure scenarios that may be encountered in environmental restoration programs. The model's four major functional features address (1) dose versus source depth in soil, (2) shielding by clean cover soil, (3) area of contamination, and (4) nonuniform distribution of sources. The model is also capable of adjusting doses when there are variations in soil density for both source and cover soils. The model is supported by a database of approximately 500 radionuclides. 4 refs
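The shielding effect of clean cover soil, feature (2), can be illustrated with a bare exponential attenuation model. The attenuation coefficient and dose rate below are hypothetical, and a full model like the one described would also account for buildup, geometry, and the source distribution:

```python
import math

# Effective linear attenuation coefficient of soil for a mid-energy
# gamma line (hypothetical value, 1/cm; real values depend on soil
# density and photon energy).
MU_SOIL = 0.12

def covered_dose_rate(uncovered_rate, cover_cm, mu=MU_SOIL):
    """Dose rate after attenuation by `cover_cm` of clean cover soil."""
    return uncovered_rate * math.exp(-mu * cover_cm)

# 15 cm of cover reduces a unit dose rate to roughly 17% in this sketch.
d = covered_dose_rate(1.0, 15.0)
```

Even this crude sketch shows why cover depth is a primary remediation variable: the dose falls off exponentially rather than linearly with added cover.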

  17. Dose to population as a metric in the design of optimised exposure control in digital mammography

    International Nuclear Information System (INIS)

    Klausz, R.; Shramchenko, N.

    2005-01-01

    This paper describes a method for automatic optimisation of parameters (AOP) in digital mammography systems. Using a model of the imaging chain, the contrast-to-noise ratio (CNR) and average glandular dose (AGD) are computed for the possible X-ray parameters and breast types. The optimisation process consists of determining the operating points providing the lowest possible AGD for each CNR level and breast type. The proposed dose metric used in the design of an AOP mode is the resulting dose to the population, computed by averaging the AGD values over the distribution of breast types in the population. This method has been applied to the automatic exposure control of new digital mammography equipment. Breast thickness and composition are estimated from a low-dose pre-exposure and used to index tables containing sets of optimised operating points. The resulting average dose to the population ranges from a level comparable to state-of-the-art screen/film mammography down to a reduction by a factor of two. Using this method, both CNR and dose are kept under control for all breast types, taking into consideration both individual and collective risk. (authors)
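The operating-point selection step can be sketched as follows for a single breast type; the candidate techniques and their CNR/AGD values are entirely hypothetical:

```python
# Candidate X-ray techniques for one breast type:
# (technique label, achieved CNR, AGD in mGy) -- illustrative numbers.
candidates = [
    ("28kV_Mo", 5.0, 1.2),
    ("30kV_Rh", 5.0, 1.0),
    ("32kV_Rh", 7.0, 1.6),
    ("30kV_Mo", 7.0, 1.9),
]

def optimise(cands):
    """For each CNR level, keep the operating point with the lowest AGD."""
    best = {}
    for label, cnr, agd in cands:
        if cnr not in best or agd < best[cnr][1]:
            best[cnr] = (label, agd)
    return best

operating_points = optimise(candidates)
```

Tables of such operating points, indexed by estimated breast thickness and composition, are what the automatic exposure control would consult after the low-dose pre-exposure; averaging the selected AGD values over the breast-type distribution then yields the population-dose metric.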

  18. Measurement Equivalence of the Empowerment Scale for White and Black Persons with Severe Mental Illness

    Science.gov (United States)

    Morris, Scott B.; Huang, Jialin; Zhao, Lei; Sergent, Jessica D.; Neuhengen, Jonas

    2014-01-01

    Objective The current study examined the measurement equivalence of a measure of personal empowerment for African American and White consumers of mental health services. Methods Confirmatory factor analysis was used to assess the measurement equivalence of the 28-item Empowerment Scale (Rogers, Chamberlin, Ellison & Crean, 1997), using data from 1,035 White and 301 African American persons with severe mental illness. Results Metric invariance of the Empowerment Scale was supported, in that the factor structure and loadings were equivalent across groups. Scalar invariance was violated on three items; however, the impact of these items on scale scores was quite small. Finally, the subscales of empowerment tended to be more highly inter-correlated for African American than for White respondents. Conclusions and Implications for Practice The results generally support the use of the Empowerment Scale for ethnic group comparisons. However, subtle differences in the psychometric properties of this measure suggest that African American and White individuals may conceptualize the construct of empowerment in different ways. Specifically, African American respondents had a lower threshold for endorsing some items on the self-esteem and powerlessness dimensions. Further, White respondents viewed the three dimensions of empowerment (self-esteem, powerlessness and activism) as more distinct, whereas these three traits were more strongly interrelated for African Americans. PMID:24884300

  19. The Equivalence Principle and Anomalous Magnetic Moment Experiments

    OpenAIRE

    Alvarez, C.; Mann, R. B.

    1995-01-01

    We investigate the possibility of testing the Einstein Equivalence Principle (EEP) using measurements of the anomalous magnetic moments of elementary particles. We compute the one-loop correction to the $g-2$ anomaly within the class of non-metric theories of gravity described by the THεμ formalism. We find several novel mechanisms for breaking the EEP whose origin is due purely to radiative corrections. We discuss the possibilities of setting new empirical constraints on these effects.

  20. Investigating the protective properties of milk phospholipids against ultraviolet light exposure in a skin equivalent model

    Science.gov (United States)

    Russell, Ashley; Laubscher, Andrea; Jimenez-Flores, Rafael; Laiho, Lily H.

    2010-02-01

    Current research on bioactive molecules in milk has documented health advantages of bovine milk and its components. Milk Phospholipids, selected for this study, represent molecules with great potential benefit in human health and nutrition. In this study we used confocal reflectance and multiphoton microscopy to monitor changes in skin morphology upon skin exposure to ultraviolet light and evaluate the potential of milk phospholipids in preventing photodamage to skin equivalent models. The results suggest that milk phospholipids act upon skin cells in a protective manner against the effect of ultraviolet (UV) radiation. Similar results were obtained from MTT tissue viability assay and histology.

  1. Development of quality metrics for ambulatory pediatric cardiology: Infection prevention.

    Science.gov (United States)

    Johnson, Jonathan N; Barrett, Cindy S; Franklin, Wayne H; Graham, Eric M; Halnon, Nancy J; Hattendorf, Brandy A; Krawczeski, Catherine D; McGovern, James J; O'Connor, Matthew J; Schultz, Amy H; Vinocur, Jeffrey M; Chowdhury, Devyani; Anderson, Jeffrey B

    2017-12-01

    In 2012, the American College of Cardiology's (ACC) Adult Congenital and Pediatric Cardiology Council established a program to develop quality metrics to guide ambulatory practices for pediatric cardiology. The council chose five areas on which to focus their efforts: chest pain, Kawasaki Disease, tetralogy of Fallot, transposition of the great arteries after arterial switch, and infection prevention. Here, we sought to describe the process, evaluation, and results of the Infection Prevention Committee's metric design process. The infection prevention metrics team consisted of 12 members from 11 institutions in North America. The group agreed to work on specific infection prevention topics including antibiotic prophylaxis for endocarditis, rheumatic fever, and asplenia/hyposplenism; influenza vaccination and respiratory syncytial virus prophylaxis (palivizumab); preoperative methods to reduce intraoperative infections; vaccinations after cardiopulmonary bypass; hand hygiene; and testing to identify splenic function in patients with heterotaxy. An extensive literature review was performed. When available, previously published guidelines were used fully in determining metrics. The committee chose eight metrics to submit to the ACC Quality Metric Expert Panel for review. Ultimately, metrics regarding hand hygiene and influenza vaccination recommendation for patients did not pass the RAND analysis. Both endocarditis prophylaxis metrics and the RSV/palivizumab metric passed the RAND analysis but fell out during the open comment period. Three metrics passed all analyses, including those for antibiotic prophylaxis in patients with heterotaxy/asplenia, for influenza vaccination compliance in healthcare personnel, and for adherence to recommended regimens of secondary prevention of rheumatic fever. The lack of convincing data to guide quality improvement initiatives in pediatric cardiology is widespread, particularly in infection prevention. Despite this, three metrics were successfully developed for infection prevention in ambulatory pediatric cardiology.

  2. Analysis of specification of an electrode type sensor equivalent circuit on the base of impedance spectroscopy simulation

    International Nuclear Information System (INIS)

    Ogurtsov, V I; Mathewson, A; Sheehan, M M

    2005-01-01

    Simulation of electrochemical impedance spectroscopy (EIS) based on a LabVIEW model of a complex impedance measuring system in the frequency domain has been investigated to specify the parameters of the Randles equivalent circuit, which is ordinarily used for electrode sensors. The model was based on a standard system for EIS instrumentation and consisted of a sensor modelled by the Randles equivalent circuit, a source of harmonic frequency-sweep voltage applied to the sensor and a transimpedance amplifier, which transformed the sensor current to voltage. It provided impedance spectroscopy data for different levels of noise, modelled by current and voltage equivalent noise sources applied to the amplifier input. The noise influence on the specification of the Randles equivalent circuit was analysed by considering the behaviour of the approximation error. Different metrics, including absolute, relative, semilogarithmic and logarithmic distances between complex numbers on the complex plane, were considered and compared with one another for evaluating this error. It was shown that the relative and logarithmic metrics provide more reliable results for the determination of circuit parameters.
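The comparison of error metrics described above can be sketched in a few lines; the component values and the simplified Randles circuit (no Warburg element) are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

# Simplified Randles equivalent circuit: series resistance Rs in series with
# (charge-transfer resistance Rct in parallel with double-layer capacitance
# Cdl). The Warburg element is omitted in this sketch.
def randles_impedance(f, Rs=100.0, Rct=10e3, Cdl=1e-6):
    w = 2 * np.pi * f
    return Rs + Rct / (1 + 1j * w * Rct * Cdl)

f = np.logspace(0, 5, 50)            # 1 Hz .. 100 kHz sweep
z_true = randles_impedance(f)        # "measured" spectrum
z_fit = randles_impedance(f, Rs=105.0, Rct=9.5e3, Cdl=1.1e-6)  # candidate fit

# Candidate distance metrics between the spectra on the complex plane
abs_err = np.mean(np.abs(z_fit - z_true))                    # absolute
rel_err = np.mean(np.abs(z_fit - z_true) / np.abs(z_true))   # relative
log_err = np.mean(np.abs(np.log(z_fit) - np.log(z_true)))    # logarithmic
print(abs_err, rel_err, log_err)
```

The relative and logarithmic metrics weight low- and high-impedance frequency regions comparably, which is why they tend to give more reliable parameter estimates than the absolute distance.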

  3. Analysis of specification of an electrode type sensor equivalent circuit on the base of impedance spectroscopy simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ogurtsov, V I; Mathewson, A; Sheehan, M M [Tyndall National Institute, Lee Maltings, Prospect Row, Cork (Ireland)

    2005-01-01

    Simulation of electrochemical impedance spectroscopy (EIS) based on a LabVIEW model of a complex impedance measuring system in the frequency domain has been investigated to specify the parameters of the Randles equivalent circuit, which is ordinarily used for electrode sensors. The model was based on a standard system for EIS instrumentation and consisted of a sensor modelled by the Randles equivalent circuit, a source of harmonic frequency-sweep voltage applied to the sensor and a transimpedance amplifier, which transformed the sensor current to voltage. It provided impedance spectroscopy data for different levels of noise, modelled by current and voltage equivalent noise sources applied to the amplifier input. The noise influence on the specification of the Randles equivalent circuit was analysed by considering the behaviour of the approximation error. Different metrics, including absolute, relative, semilogarithmic and logarithmic distances between complex numbers on the complex plane, were considered and compared with one another for evaluating this error. It was shown that the relative and logarithmic metrics provide more reliable results for the determination of circuit parameters.

  4. Practical methods for exposure control/management at nuclear facilities

    International Nuclear Information System (INIS)

    Twiggs, J.A.

    1991-01-01

    Exposure management/reduction is very important to Duke Power Company. Practical exposure control/reduction techniques applied to their reactor vessel head disassembly outage activity have consistently reduced personnel exposure for this task. The following exposure control methods have worked for us and will be the industry's direction for the 1990s. A summary of these methods includes: (a) move the responsibility of exposure management from the Radiation Protection group to the Maintenance group; (b) reduce area source term by removal of source; (c) improve working environments in radiation areas by minimizing protective clothing usage; and (d) maximize the use of electronic instruments to allow remote monitoring.

  5. Development of retrospective quantitative and qualitative job-exposure matrices for exposures at a beryllium processing facility.

    Science.gov (United States)

    Couch, James R; Petersen, Martin; Rice, Carol; Schubauer-Berigan, Mary K

    2011-05-01

    To construct a job-exposure matrix (JEM) for an Ohio beryllium processing facility between 1953 and 2006 and to evaluate temporal changes in airborne beryllium exposures. Quantitative area- and breathing-zone-based exposure measurements of airborne beryllium were made between 1953 and 2006 and used by plant personnel to estimate daily weighted average (DWA) exposure concentrations for sampled departments and operations. These DWA measurements were used to create a JEM with 18 exposure metrics, which was linked to the plant cohort consisting of 18,568 unique job, department and year combinations. The exposure metrics ranged from quantitative metrics (annual arithmetic/geometric average DWA exposures, maximum DWA and peak exposures) to descriptive qualitative metrics (chemical beryllium species and physical form) to qualitative assignment of exposure to other risk factors (yes/no). Twelve collapsed job titles with long-term consistent industrial hygiene samples were evaluated using regression analysis for time trends in DWA estimates. Annual arithmetic mean DWA estimates (overall plant-wide exposures including administration, non-production, and production estimates) for the data by decade ranged from a high of 1.39 μg/m³ in the 1950s to a low of 0.33 μg/m³ in the 2000s. Of the 12 jobs evaluated for temporal trend, the average arithmetic DWA mean was 2.46 μg/m³ and the average geometric mean DWA was 1.53 μg/m³. After the DWA calculations were log-transformed, 11 of the 12 had a statistically significant (p < 0.05) decrease in reported exposure over time. The constructed JEM successfully differentiated beryllium exposures across jobs and over time. This is the only quantitative JEM containing exposure estimates (average and peak) for the entire plant history.
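The temporal-trend analysis described above (regression on log-transformed DWA estimates) can be sketched as follows, with hypothetical DWA values standing in for the plant data:

```python
import numpy as np
from scipy import stats

# Hypothetical DWA series (ug/m^3) for one job title, sampled every 5 years;
# the study regresses log-transformed DWA on calendar year.
years = np.arange(1955, 2005, 5)
dwa = np.array([1.4, 1.2, 1.1, 0.9, 0.8, 0.7, 0.55, 0.5, 0.4, 0.35])

res = stats.linregress(years, np.log(dwa))
print(f"slope = {res.slope:.4f} per year, p = {res.pvalue:.2e}")
# A negative slope with p < 0.05 indicates a statistically significant
# decline in exposure over time, as found for 11 of the 12 jobs.
```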

  6. SU-F-I-05: Dose Symmetry for CTDI Equivalent Measurements with Limited Angle CBCT

    Energy Technology Data Exchange (ETDEWEB)

    Singh, V [Henry Ford Hospital, Detroit, MI (United States); McKenney, S [Children’s National Medical Center, Washington, DC (United States); Sunde, P [Radcal, Inc, Monrovia, CA (United States); Feng, W [New York Presbyterian Hospital, Tenafly, NJ (United States); Bakalyar, D [Henry Ford Health System, Detroit, MI (United States)

    2016-06-15

    Purpose: CTDI measurements, useful for characterizing the x-ray output for multi-detector CT (MDCT), require a 360° rotation of the gantry; this presents a problem for cone beam CT (CBCT) due to its limited angular rotation. The purpose of this work is to demonstrate a methodology for overcoming this limited angular rotation so that CTDI measurements can also be made on CBCT systems, making it possible to compare the radiation output from both types of system with a common metric. Methods: The symmetry of the CTDI phantom allows a 360° CTDI measurement to be replaced with two 180° measurements. A pencil chamber with a real-time digitizer was placed at the center of the head phantom (16 cm, PMMA) and the resulting exposure measurement from a 180° acquisition was doubled. A pair of edge measurements, each obtained with the gantry passing through the same 180° arc, was obtained with the pencil chamber at opposite edges of the diameter of the phantom and then summed. The method was demonstrated on a clinical CT scanner (Philips, Brilliance6) and then implemented on an interventional system (Siemens, Axiom Artis). Results: The equivalent CTDI measurement agreed with the conventional CTDI measurement within 8%. The discrepancy in the two measurements is largely attributed to uncertainties in cropping the waveform to a 180° acquisition. (Note: Because of the reduced fan angle in the CBCT, CTDI is not directly comparable to MDCT values when a 32 cm phantom is used.) Conclusion: The symmetry-based CTDI measurement is equivalent to the conventional CTDI measurement when the fan angle is large enough to encompass the phantom diameter. This allows a familiar metric of radiation output to be employed on systems with a limited angular rotation.
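The symmetry argument can be illustrated numerically; the chamber readings below are hypothetical, and the center/edge combination follows the conventional weighted-CTDI formula:

```python
# Exposure readings (arbitrary units) from 180-degree acquisitions with a
# pencil chamber; values are hypothetical, for illustration only.
center_180 = 10.2          # chamber at phantom center
edge_180_a = 14.1          # chamber at one edge of the diameter
edge_180_b = 13.5          # chamber at the diametrically opposite edge

# Symmetry argument: doubling the center reading and summing the two
# opposed edge readings reproduces the full 360-degree measurements.
ctdi_center = 2 * center_180
ctdi_edge = edge_180_a + edge_180_b

# Conventional weighted CTDI combines center and periphery measurements
# as (1/3) center + (2/3) periphery.
ctdi_w = ctdi_center / 3 + 2 * ctdi_edge / 3
print(ctdi_center, ctdi_edge, ctdi_w)
```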

  7. The margin of internal exposure (MOIE) concept for dermal risk assessment based on oral toxicity data - A case study with caffeine.

    Science.gov (United States)

    Bessems, Jos G M; Paini, Alicia; Gajewska, Monika; Worth, Andrew

    2017-12-01

    Route-to-route extrapolation is a common part of human risk assessment. Data from oral animal toxicity studies are commonly used to assess the safety of various but specific human dermal exposure scenarios. Using theoretical examples of various user scenarios, it was concluded that delineation of a generally applicable human dermal limit value is not a practicable approach, due to the wide variety of possible human exposure scenarios, including its consequences for internal exposure. This paper uses physiologically based kinetic (PBK) modelling approaches to predict animal as well as human internal exposure dose metrics and for the first time, introduces the concept of Margin of Internal Exposure (MOIE) based on these internal dose metrics. Caffeine was chosen to illustrate this approach. It is a substance that is often found in cosmetics and for which oral repeated dose toxicity data were available. A rat PBK model was constructed in order to convert the oral NOAEL to rat internal exposure dose metrics, i.e. the area under the curve (AUC) and the maximum concentration (Cmax), both in plasma. A human oral PBK model was constructed and calibrated using human volunteer data and adapted to accommodate dermal absorption following human dermal exposure. Use of the MOIE approach based on internal dose metrics predictions provides excellent opportunities to investigate the consequences of variations in human dermal exposure scenarios. It can accommodate within-day variation in plasma concentrations and is scientifically more robust than assuming just an exposure in mg/kg bw/day. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
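A minimal sketch of deriving internal dose metrics and an MOIE from a kinetic model is shown below; the one-compartment model and all parameter values are illustrative assumptions, far simpler than the paper's PBK models:

```python
import numpy as np

# Minimal one-compartment model with first-order absorption (ka) and
# elimination (ke); parameter values are illustrative, not caffeine's.
def plasma_conc(t, dose, ka=1.5, ke=0.2, vd=40.0):
    return dose * ka / (vd * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def auc(t, c):
    """Area under the concentration-time curve (trapezoidal rule)."""
    return float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t)))

t = np.linspace(0, 24, 2401)
animal = plasma_conc(t, dose=300.0)   # internal dose metrics at the NOAEL
human = plasma_conc(t, dose=30.0)     # internal dose for a dermal scenario

auc_a, auc_h = auc(t, animal), auc(t, human)
cmax_a, cmax_h = animal.max(), human.max()

# Margin of internal exposure: ratio of animal to human internal dose metric
print("MOIE (AUC):", auc_a / auc_h, "MOIE (Cmax):", cmax_a / cmax_h)
```

Because this toy model is linear in dose, both MOIEs reduce to the dose ratio; a real PBK model with saturable kinetics would not have that property, which is one motivation for working with internal dose metrics.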

  8. Extreme Precipitation and Flooding: Exposure Characterization and the Association Between Exposure and Mortality in 108 United States Communities, 1987-2005

    Science.gov (United States)

    Severson, R. L.; Peng, R. D.; Anderson, G. B.

    2017-12-01

    There is substantial evidence that extreme precipitation and flooding are serious threats to public health and safety. These threats are predicted to increase with climate change. Epidemiological studies investigating the health effects of these events vary in the methods used to characterize exposure. Here, we compare two sources of precipitation data (National Oceanic and Atmospheric Administration (NOAA) station-based and North American Land Data Assimilation Systems (NLDAS-2) Reanalysis data-based) for estimating exposure to extreme precipitation and two sources of flooding data, based on United States Geological Survey (USGS) streamflow gages and the NOAA Storm Events database. We investigate associations between each of the four exposure metrics and short-term risk of four causes of mortality (accidental, respiratory-related, cardiovascular-related, and all-cause) in the United States from 1987 through 2005. Average daily precipitation values from the two precipitation data sources were moderately correlated (Spearman's rho = 0.74); however, values from the two data sources were less correlated when comparing binary metrics of exposure to extreme precipitation days (Jaccard index (J) = 0.35). Binary metrics of daily flood exposure were poorly correlated between the two flood data sources (Spearman's rho = 0.07; J = 0.05). There was little correlation between extreme precipitation exposure and flood exposure in study communities. We did not observe evidence of a positive association between any of the four exposure metrics and risk of any of the four mortality outcomes considered. Our results suggest, due to the observed lack of agreement between different extreme precipitation and flood metrics, that exposure to extreme precipitation may not serve as an effective surrogate for exposures related to flooding. Furthermore, it is possible that extreme precipitation and flood exposures may often be too localized to allow accurate exposure assessment at the
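The two agreement measures used above can be computed directly; the daily series below are hypothetical stand-ins for the exposure data:

```python
import numpy as np
from scipy import stats

# Hypothetical daily binary exposure flags from two data sources
a = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0], dtype=bool)
b = np.array([1, 0, 0, 1, 0, 1, 1, 0, 0, 0], dtype=bool)

# Jaccard index: agreement restricted to days either source flags as exposed
jaccard = (a & b).sum() / (a | b).sum()

# Spearman correlation on continuous daily precipitation values (mm)
p1 = np.array([0.0, 2.1, 15.3, 8.7, 0.5, 1.2, 22.0, 0.0, 9.9, 3.3])
p2 = np.array([0.1, 1.8, 10.2, 9.5, 0.7, 4.0, 19.5, 0.0, 5.1, 2.9])
rho, _ = stats.spearmanr(p1, p2)
print(f"Jaccard = {jaccard:.2f}, Spearman rho = {rho:.2f}")
```

As in the study, continuous values can correlate well while binary extreme-day flags agree poorly, because the Jaccard index ignores the (many) days both sources classify as unexposed.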

  9. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are estimated: a theoretical metric based on the assumption that images are drawn from a Brownian Image Model (BIM); a normalized metric based on the variance of the data; an empirical metric based on the empirical covariance matrix of the unlabeled data; and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also lends itself to Principal Component Analysis (PCA), which yields subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for abdominal aortic calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
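A minimal sketch of k-NN with an adaptive metric derived from the empirical covariance (one of the four metric families discussed) might look like this, on synthetic 2-D data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated 2-D classes; the empirical covariance of the pooled data
# defines an adaptive Mahalanobis-style metric for k-NN.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
x0 = rng.multivariate_normal([0.0, 0.0], cov, 100)
x1 = rng.multivariate_normal([1.5, 1.5], cov, 100)
X = np.vstack([x0, x1])
y = np.array([0] * 100 + [1] * 100)

M = np.linalg.inv(np.cov(X, rowvar=False))  # adaptive metric matrix

def knn_predict(q, X, y, M, k=5):
    # Squared adaptive distance d(x, q) = (x - q)^T M (x - q) for each row
    d = np.einsum('ij,jk,ik->i', X - q, M, X - q)
    return np.bincount(y[np.argsort(d)[:k]]).argmax()

print(knn_predict(np.array([0.2, 0.1]), X, y, M))
```

Setting `M` to the identity recovers plain Euclidean k-NN, so the effect of the adaptive metric can be isolated by swapping that one matrix.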

  10. Dose equivalent distribution during occupational exposure in oncology

    International Nuclear Information System (INIS)

    Marco H, J.

    1996-01-01

    This work presents the results of 26 years of radiological surveillance of occupationally exposed workers at the National Institute of Oncology and Radiology. The distribution of equivalent dose among personnel working with radiant sources and radioactive substances in the areas of diagnostic x rays, teletherapy, brachytherapy, nuclear medicine and biomedical research is shown. The dosimetric system employed makes use of ORWO RD3/RD4 monitoring film with copper and lead filters inside a plastic cassette manufactured in Cuba. The experimental method is supported by optical densitometric analysis of the films together with a set of standard films calibrated in standard X and gamma photon beams by means of a secondary standard dosimeter, type NPL. Statistics show that, except for those working with radium-226, manual brachytherapy or Mo-99/Tc-99 generator elution, the equivalent dose distribution among the workers has remained well below the annual permissible limit. (authors). 6 refs., 3 tabs

  11. Resilience-based performance metrics for water resources management under uncertainty

    Science.gov (United States)

    Roach, Tom; Kapelan, Zoran; Ledbetter, Ralph

    2018-06-01

    This paper aims to develop new resilience-type metrics for long-term water resources management under uncertain climate change and population growth. Resilience is defined here as the ability of a water resources management system to 'bounce back', i.e. absorb and then recover from a water deficit event, restoring normal system operation. Ten alternative metrics are proposed and analysed, addressing a range of different resilience aspects including duration, magnitude, frequency and volume of related water deficit events. The metrics were analysed on a real-world case study of the Bristol Water supply system in the UK and compared with current practice. The analyses included an examination of the metrics' sensitivity and correlation, as well as a detailed examination of the behaviour of the metrics during water deficit periods. The results obtained suggest that multiple metrics covering different aspects of resilience should be used simultaneously when assessing the resilience of a water resources management system, leading to a more complete understanding of resilience compared with current practice approaches. It was also observed that calculating the total duration of a water deficit period provided a clearer and more consistent indication of system performance than splitting the deficit periods into the time to reach and the time to recover from the worst deficit events.
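Event-based resilience metrics of the kind listed above (frequency, duration, volume and magnitude of deficit events) can be sketched from a daily supply-demand balance series; the numbers are illustrative:

```python
import numpy as np

# Daily supply minus demand (Ml/day); negative values are deficit days.
balance = np.array([5, 3, -2, -4, -1, 6, 2, -3, -2, 4, 5, -1, 7, 8])

deficit = balance < 0

# Segment consecutive deficit days into discrete events via edge detection
padded = np.concatenate(([0], deficit.astype(int), [0]))
edges = np.flatnonzero(np.diff(padded))
events = list(zip(edges[::2], edges[1::2]))  # (start, end) index pairs

frequency = len(events)                                   # number of events
durations = [end - start for start, end in events]        # days per event
volumes = [-balance[start:end].sum() for start, end in events]  # Ml per event
magnitude = max(-balance[deficit])                        # worst daily deficit

print(frequency, durations, volumes, magnitude)
```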

  12. On the conformal equivalence of harmonic maps and exponentially harmonic maps

    International Nuclear Information System (INIS)

    Hong Minchun.

    1991-06-01

    Suppose that (M,g) and (N,h) are compact smooth Riemannian manifolds without boundary. If m = dim M ≥ 3 and Φ: (M,g) → (N,h) is exponentially harmonic, then there exists a smooth metric g̃, conformally equivalent to g, such that Φ: (M,g̃) → (N,h) is harmonic. (author). 7 refs
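In outline, the result fits the standard conformal-change construction for F-harmonic maps; the precise normalisation below is an assumption of this sketch, not quoted from the paper:

```latex
% Exponential energy functional whose critical points are the
% exponentially harmonic maps (normalisation conventions vary):
\[
  E_{\exp}(\Phi) = \int_M \exp\!\Big(\tfrac{1}{2}\,|d\Phi|^2\Big)\, dv_g .
\]
% Assumed form of the conformal change for m = \dim M \ge 3, with the
% integrand's derivative entering as the conformal factor:
\[
  \tilde g = \Big(\exp\tfrac{1}{2}|d\Phi|^2\Big)^{\frac{2}{m-2}}\, g ,
\]
% with respect to which \Phi\colon (M,\tilde g) \to (N,h) is harmonic.
```

The exponent 2/(m-2) is what makes the conformal weight cancel in the m-dimensional harmonic map equation, which is why the construction requires m ≥ 3.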

  13. Metrix Matrix: A Cloud-Based System for Tracking Non-Relative Value Unit Value-Added Work Metrics.

    Science.gov (United States)

    Kovacs, Mark D; Sheafor, Douglas H; Thacker, Paul G; Hardie, Andrew D; Costello, Philip

    2018-03-01

    In the era of value-based medicine, it will become increasingly important for radiologists to provide metrics that demonstrate their value beyond clinical productivity. In this article the authors describe their institution's development of an easy-to-use system for tracking value-added but non-relative value unit (RVU)-based activities. Metrix Matrix is an efficient cloud-based system for tracking value-added work. A password-protected home page contains links to web-based forms created using Google Forms, with collected data populating Google Sheets spreadsheets. Value-added work metrics selected for tracking included interdisciplinary conferences, hospital committee meetings, consulting on nonbilled outside studies, and practice-based quality improvement. Over a period of 4 months, value-added work data were collected for all clinical attending faculty members in a university-based radiology department (n = 39). Time required for data entry was analyzed for 2 faculty members over the same time period. Thirty-nine faculty members (equivalent to 36.4 full-time equivalents) reported a total of 1,223.5 hours of value-added work time (VAWT). A formula was used to calculate "value-added RVUs" (vRVUs) from VAWT. VAWT amounted to 5,793.6 vRVUs or 6.0% of total work performed (vRVUs plus work RVUs [wRVUs]). Were vRVUs considered equivalent to wRVUs for staffing purposes, this would require an additional 2.3 full-time equivalents, on the basis of average wRVU calculations. Mean data entry time was 56.1 seconds per day per faculty member. As health care reimbursement evolves with an emphasis on value-based medicine, it is imperative that radiologists demonstrate the value they add to patient care beyond wRVUs. This free and easy-to-use cloud-based system allows the efficient quantification of value-added work activities. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  14. Validation of Metrics as Error Predictors

    Science.gov (United States)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
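A minimal stand-in for such a logistic regression of error occurrence on a model metric, using synthetic data and a plain gradient-descent fit rather than the chapter's statistical tooling:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the EPC sample: one size-like metric per model,
# with error probability increasing in the metric (illustrative only).
n = 500
size = rng.uniform(5, 120, n)                     # e.g. number of nodes
p_true = 1 / (1 + np.exp(-(0.05 * size - 3.0)))   # true logistic relation
y = (rng.random(n) < p_true).astype(float)        # 1 = model contains an error

# Plain gradient-descent fit of P(error) = sigmoid(b0 + b1 * size)
x = (size - size.mean()) / size.std()             # standardise for stability
b0 = b1 = 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(b0 + b1 * x)))
    b0 -= 0.1 * np.mean(p - y)                    # gradient of log-loss
    b1 -= 0.1 * np.mean((p - y) * x)

print("slope on standardised size:", b1)          # positive: bigger = riskier
```

A significantly positive slope is the kind of evidence the chapter's validation seeks: the metric predicts error probability on held-out models.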

  15. Problems of Equivalence in Shona- English Bilingual Dictionaries

    African Journals Online (AJOL)

    rbr

    ... translation equivalents in Shona-English dictionaries where lexicographers will be dealing with divergent languages and cultures, traditional practices of lexicography and the absence of reliable ... ideal in translation is to achieve structural and semantic equivalence. Absolute equivalence between any two ...

  16. Development of Quality Metrics in Ambulatory Pediatric Cardiology.

    Science.gov (United States)

    Chowdhury, Devyani; Gurvitz, Michelle; Marelli, Ariane; Anderson, Jeffrey; Baker-Smith, Carissa; Diab, Karim A; Edwards, Thomas C; Hougen, Tom; Jedeikin, Roy; Johnson, Jonathan N; Karpawich, Peter; Lai, Wyman; Lu, Jimmy C; Mitchell, Stephanie; Newburger, Jane W; Penny, Daniel J; Portman, Michael A; Satou, Gary; Teitel, David; Villafane, Juan; Williams, Roberta; Jenkins, Kathy

    2017-02-07

    The American College of Cardiology Adult Congenital and Pediatric Cardiology (ACPC) Section had attempted to create quality metrics (QM) for ambulatory pediatric practice, but limited evidence made the process difficult. The ACPC sought to develop QMs for ambulatory pediatric cardiology practice. Five areas of interest were identified, and QMs were developed in a 2-step review process. In the first step, an expert panel, using the modified RAND-UCLA methodology, rated each QM for feasibility and validity. The second step sought input from ACPC Section members; final approval was by a vote of the ACPC Council. Work groups proposed a total of 44 QMs. Thirty-one metrics passed the RAND process and, after the open comment period, the ACPC council approved 18 metrics. The project resulted in successful development of QMs in ambulatory pediatric cardiology for a range of ambulatory domains. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  17. Using Publication Metrics to Highlight Academic Productivity and Research Impact

    Science.gov (United States)

    Carpenter, Christopher R.; Cone, David C.; Sarli, Cathy C.

    2016-01-01

    This article provides a broad overview of widely available measures of academic productivity and impact using publication data and highlights uses of these metrics for various purposes. Metrics based on publication data include measures such as number of publications, number of citations, the journal impact factor score, and the h-index, as well as emerging document-level metrics. Publication metrics can be used for a variety of purposes, including tenure and promotion, grant applications and renewal reports, benchmarking, recruiting efforts, and administrative purposes such as departmental or university performance reports. The authors also highlight practical applications of measuring and reporting academic productivity and impact to emphasize and promote individual investigators, grant applications, or department output. PMID:25308141

  18. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    Science.gov (United States)

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption of the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product to the reference product when both possess multimodal PSDs.
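For one-dimensional distributions the EMD coincides with the first Wasserstein distance, which SciPy computes directly; the bimodal PSDs below are synthetic illustrations, not product data:

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(2)

# Bimodal "reference" PSD vs. a test PSD whose first mode is shifted
# (diameters in um); D50/SPAN summaries can miss such multimodal shifts.
ref = np.concatenate([rng.normal(1.0, 0.1, 5000), rng.normal(5.0, 0.5, 5000)])
test = np.concatenate([rng.normal(1.2, 0.1, 5000), rng.normal(5.0, 0.5, 5000)])
same = np.concatenate([rng.normal(1.0, 0.1, 5000), rng.normal(5.0, 0.5, 5000)])

d_test = wasserstein_distance(ref, test)   # EMD: ref vs. shifted test
d_same = wasserstein_distance(ref, same)   # EMD: ref vs. an equivalent batch
print("EMD(ref, test):", d_test, "EMD(ref, same):", d_same)
```

The EMD distances would then feed into the PBE framework as the comparison statistic; note how the shifted first mode produces a clearly larger distance than batch-to-batch sampling noise.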

  19. Effects of language of assessment on the measurement of acculturation: measurement equivalence and cultural frame switching.

    Science.gov (United States)

    Schwartz, Seth J; Benet-Martínez, Verónica; Knight, George P; Unger, Jennifer B; Zamboanga, Byron L; Des Rosiers, Sabrina E; Stephens, Dionne P; Huang, Shi; Szapocznik, José

    2014-03-01

    The present study used a randomized design, with fully bilingual Hispanic participants from the Miami area, to investigate 2 sets of research questions. First, we sought to ascertain the extent to which measures of acculturation (Hispanic and U.S. practices, values, and identifications) satisfied criteria for linguistic measurement equivalence. Second, we sought to examine whether cultural frame switching would emerge--that is, whether latent acculturation mean scores for U.S. acculturation would be higher among participants randomized to complete measures in English and whether latent acculturation mean scores for Hispanic acculturation would be higher among participants randomized to complete measures in Spanish. A sample of 722 Hispanic students from a Hispanic-serving university participated in the study. Participants were first asked to complete translation tasks to verify that they were fully bilingual. Based on ratings from 2 independent coders, 574 participants (79.5% of the sample) qualified as fully bilingual and were randomized to complete the acculturation measures in either English or Spanish. Theoretically relevant criterion measures--self-esteem, depressive symptoms, and personal identity--were also administered in the randomized language. Measurement equivalence analyses indicated that all of the acculturation measures--Hispanic and U.S. practices, values, and identifications--met criteria for configural, weak/metric, strong/scalar, and convergent validity equivalence. These findings indicate that data generated using acculturation measures can, at least under some conditions, be combined or compared across languages of administration. Few latent mean differences emerged. These results are discussed in terms of the measurement of acculturation in linguistically diverse populations. © 2014 APA, all rights reserved.

  20. 10 CFR 20.1208 - Dose equivalent to an embryo/fetus.

    Science.gov (United States)

    2010-01-01

    10 CFR Part 20, Occupational Dose Limits, § 20.1208 Dose equivalent to an embryo/fetus. (a) The licensee shall ensure that the dose equivalent to the embryo/fetus during the entire pregnancy, due to the occupational exposure of a declared...

  1. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

    Twenty specific quantitative measures to assist in evaluating the effectiveness of as low as reasonably achievable (ALARA) programs are described along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, these five generally apply to most programs: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole-body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location. Evaluation of other programs may require other specific dose-equivalent-based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and nonpenetrating radiation dose equivalents. Certain non-dose-equivalent indices, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs
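The first few indices can be computed directly from per-worker dose records; the doses and job labels below are hypothetical:

```python
import numpy as np

# Hypothetical annual whole-body dose equivalents (mSv) per worker
doses = np.array([0.1, 0.3, 0.2, 1.8, 0.4, 2.5, 0.1, 0.9, 0.05, 1.1])
jobs = np.array(["tech", "tech", "admin", "maint", "tech",
                 "maint", "admin", "tech", "admin", "maint"])

mide = doses.mean()            # (1) mean individual dose equivalent
cumulative = doses.sum()       # (3) cumulative (collective) dose equivalent
by_job = {j: doses[jobs == j].mean() for j in np.unique(jobs)}  # (4) by job

print(f"MIDE = {mide:.3f} mSv, collective = {cumulative:.2f} person-mSv")
print("MIDE by job classification:", by_job)
```

Index (2), the statistical distribution of MIDE, would follow from a histogram or percentiles of the same `doses` array.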

  2. Practical application of decision support metrics for power plant risk-informed asset management

    International Nuclear Information System (INIS)

    Liming, James K.; Johnson, David H.; Kee, Ernest J.; Sun, Alice Y.; Young, Garry G.

    2003-01-01

    The objective of this paper is to provide electric utilities with a concept for developing and applying effective decision support metrics via integrated risk-informed asset management (RIAM) programs for power stations and generating companies. RIAM is a process by which analysts review historical performance and develop predictive logic models and data analyses to predict critical decision support figures-of-merit (or metrics) for generating station managers and electric utility company executives. These metrics include, but are not limited to, the following: profitability, net benefit, benefit-to-cost ratio, projected return on investment, projected revenue, projected costs, asset value, safety (catastrophic facility damage frequency and consequences, etc.), power production availability (capacity factor, etc.), efficiency (heat rate), and others. RIAM applies probabilistic safety assessment (PSA) techniques and generates predictions probabilistically so that metrics information can be supplied to managers in terms of probability distributions as well as point estimates. This enables the managers to apply the concept of 'confidence levels' in their critical decision-making processes. (author)
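A minimal Monte Carlo sketch of reporting a decision metric as a distribution with confidence levels rather than a point estimate (the input distributions are illustrative assumptions, not PSA outputs):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000  # Monte Carlo trials

# Illustrative uncertain inputs for one candidate plant investment
benefit = rng.normal(12.0, 3.0, n)     # $M/yr projected revenue gain
cost = rng.normal(8.0, 1.5, n)         # $M/yr projected cost

net_benefit = benefit - cost
bc_ratio = benefit / cost

# Report metrics as distributions so managers can apply confidence levels
# (e.g. "90% confident the net benefit exceeds the P10 value").
p10, p50, p90 = np.percentile(net_benefit, [10, 50, 90])
print(f"net benefit: P10={p10:.1f}, median={p50:.1f}, P90={p90:.1f} $M/yr")
print("P(net benefit > 0):", (net_benefit > 0).mean())
```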

  3. Standardised metrics for global surgical surveillance.

    Science.gov (United States)

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.

  4. Assessing elemental mercury vapor exposure from cultural and religious practices.

    Science.gov (United States)

    Riley, D M; Newby, C A; Leal-Almeraz, T O; Thomas, V M

    2001-08-01

    Use of elemental mercury in certain cultural and religious practices can cause high exposures to mercury vapor. Uses include sprinkling mercury on the floor of a home or car, burning it in a candle, and mixing it with perfume. Some uses can produce indoor air mercury concentrations one or two orders of magnitude above occupational exposure limits. Exposures resulting from other uses, such as infrequent use of a small bead of mercury, could be well below currently recognized risk levels. Metallic mercury is available at almost all of the 15 botanicas visited in New York, New Jersey, and Pennsylvania, but botanica personnel often deny having mercury for sale when approached by outsiders to these religious and cultural traditions. Actions by public health authorities have driven the mercury trade underground in some locations. Interviews indicate that mercury users are aware that mercury is hazardous, but are not aware of the inhalation exposure risk. We argue against a crackdown by health authorities because it could drive the practices further underground, because high-risk practices may be rare, and because uninformed government intervention could have unfortunate political and civic side effects for some Caribbean and Latin American immigrant groups. We recommend an outreach and education program involving religious and community leaders, botanica personnel, and other mercury users.

  5. Metrical presentation boosts implicit learning of artificial grammar.

    Science.gov (United States)

    Selchenkova, Tatiana; François, Clément; Schön, Daniele; Corneyllie, Alexandra; Perrin, Fabien; Tillmann, Barbara

    2014-01-01

    The present study investigated whether a temporal hierarchical structure favors implicit learning. An artificial pitch grammar implemented with a set of tones was presented in two different temporal contexts, notably with either a strongly metrical structure or an isochronous structure. According to the Dynamic Attending Theory, external temporal regularities can entrain internal oscillators that guide attention over time, allowing for temporal expectations that influence perception of future events. Based on this framework, it was hypothesized that the metrical structure provides a benefit for artificial grammar learning in comparison to an isochronous presentation. Our study combined behavioral and event-related potential measurements. Behavioral results demonstrated similar learning in both participant groups. By contrast, analyses of event-related potentials showed a larger P300 component and an earlier N2 component for the strongly metrical group during the exposure phase and the test phase, respectively. These findings suggest that the temporal expectations in the strongly metrical condition helped listeners to better process the pitch dimension, leading to improved learning of the artificial grammar.

  6. Benefits of the effective dose equivalent concept at a medical center

    International Nuclear Information System (INIS)

    Vetter, R.J.; Classic, K.L.

    1991-01-01

    A primary objective of the recommendations of International Commission on Radiological Protection Publication 26 is to ensure that no source of radiation exposure is unjustified in relation to its benefits. This objective is consistent with the goals of the Radiation Safety Committee and Institutional Review Board at medical centers where research may involve radiation exposure of human subjects. The effective dose equivalent concept facilitates evaluation of risk by those who have little or no knowledge of the quantities or biological effects of radiation. This paper presents effective dose equivalent data used by radiation workers and those who evaluate human research protocols, as these data relate to personal dosimeter readings, entrance skin exposure, and target organ dose. The benefits of using effective dose equivalent to evaluate the risk of medical radiation environments and research protocols are also described.

  7. Impact of greenhouse gas metrics on the quantification of agricultural emissions and farm-scale mitigation strategies: a New Zealand case study

    Science.gov (United States)

    Reisinger, Andy; Ledgard, Stewart

    2013-06-01

    Agriculture emits a range of greenhouse gases. Greenhouse gas metrics allow emissions of different gases to be reported in a common unit called CO2-equivalent. This enables comparisons of the efficiency of different farms and production systems and of alternative mitigation strategies across all gases. The standard metric is the 100 year global warming potential (GWP), but alternative metrics have been proposed and could result in very different CO2-equivalent emissions, particularly for CH4. While significant effort has been made to reduce uncertainties in emissions estimates of individual gases, little effort has been spent on evaluating the implications of alternative metrics on overall agricultural emissions profiles and mitigation strategies. Here we assess, for a selection of New Zealand dairy farms, the effect of two alternative metrics (100 yr GWP and global temperature change potentials, GTP) on farm-scale emissions and apparent efficiency and cost effectiveness of alternative mitigation strategies. We find that alternative metrics significantly change the balance between CH4 and N2O; in some cases, alternative metrics even determine whether a specific management option would reduce or increase net farm-level emissions or emissions intensity. However, the relative ranking of different farms by profitability or emissions intensity, and the ranking of the most cost-effective mitigation options for each farm, are relatively unaffected by the metric. We conclude that alternative metrics would change the perceived significance of individual gases from agriculture and the overall cost to farmers if a price were applied to agricultural emissions, but the economically most effective response strategies are unaffected by the choice of metric.
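
    The metric sensitivity described in this abstract is easy to illustrate with a toy calculation. The emissions figures below are hypothetical, and the metric values are assumptions (roughly the 100-year values reported in IPCC AR5, without climate-carbon feedbacks); the actual values depend on the assessment report used:

    ```python
    # Illustrative annual farm emissions in tonnes of each gas (made-up numbers)
    emissions = {"CH4": 10.0, "N2O": 0.5, "CO2": 20.0}

    # Assumed 100-year metric values (approximately IPCC AR5). GTP100 for CH4
    # is far smaller than GWP100 because CH4 is short-lived.
    metrics = {
        "GWP100": {"CH4": 28.0, "N2O": 265.0, "CO2": 1.0},
        "GTP100": {"CH4": 4.0, "N2O": 234.0, "CO2": 1.0},
    }

    def co2_equivalent(emissions, weights):
        """Total CO2-equivalent emissions under a given metric."""
        return sum(mass * weights[gas] for gas, mass in emissions.items())

    for name, weights in metrics.items():
        print(name, co2_equivalent(emissions, weights))
    ```

    With these assumed values the CH4-dominated total shrinks by more than half when switching from GWP100 to GTP100, which is why metric choice can flip the apparent ranking of CH4-reducing versus N2O-reducing mitigation options.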

  8. Impact of greenhouse gas metrics on the quantification of agricultural emissions and farm-scale mitigation strategies: a New Zealand case study

    International Nuclear Information System (INIS)

    Reisinger, Andy; Ledgard, Stewart

    2013-01-01

    Agriculture emits a range of greenhouse gases. Greenhouse gas metrics allow emissions of different gases to be reported in a common unit called CO2-equivalent. This enables comparisons of the efficiency of different farms and production systems and of alternative mitigation strategies across all gases. The standard metric is the 100 year global warming potential (GWP), but alternative metrics have been proposed and could result in very different CO2-equivalent emissions, particularly for CH4. While significant effort has been made to reduce uncertainties in emissions estimates of individual gases, little effort has been spent on evaluating the implications of alternative metrics on overall agricultural emissions profiles and mitigation strategies. Here we assess, for a selection of New Zealand dairy farms, the effect of two alternative metrics (100 yr GWP and global temperature change potentials, GTP) on farm-scale emissions and apparent efficiency and cost effectiveness of alternative mitigation strategies. We find that alternative metrics significantly change the balance between CH4 and N2O; in some cases, alternative metrics even determine whether a specific management option would reduce or increase net farm-level emissions or emissions intensity. However, the relative ranking of different farms by profitability or emissions intensity, and the ranking of the most cost-effective mitigation options for each farm, are relatively unaffected by the metric. We conclude that alternative metrics would change the perceived significance of individual gases from agriculture and the overall cost to farmers if a price were applied to agricultural emissions, but the economically most effective response strategies are unaffected by the choice of metric. (letter)

  9. Comparison of adult and child radiation equivalent doses from 2 dental cone-beam computed tomography units.

    Science.gov (United States)

    Al Najjar, Anas; Colosi, Dan; Dauer, Lawrence T; Prins, Robert; Patchell, Gayle; Branets, Iryna; Goren, Arthur D; Faber, Richard D

    2013-06-01

    With the advent of cone-beam computed tomography (CBCT) scans, there has been a transition toward these scans' replacing traditional radiographs for orthodontic diagnosis and treatment planning. Children represent a significant proportion of orthodontic patients. Similar CBCT exposure settings are predicted to result in higher equivalent doses to the head and neck organs in children than in adults. The purpose of this study was to measure the difference in equivalent organ doses from different scanners under similar settings in children compared with adults. Two phantom heads were used, representing a 33-year-old woman and a 5-year-old boy. Optically stimulated dosimeters were placed at 8 key head and neck organs, and equivalent doses to these organs were calculated after scanning. The manufacturers' predefined exposure settings were used. One scanner had a pediatric preset option; the other did not. Scanning the child's phantom head with the adult settings resulted in significantly higher equivalent radiation doses than in the adult, with average child-to-adult equivalent-dose ratios ranging from 117% to 341%. Readings at the cervical spine level were the exception, decreasing to as low as 30% of the adult equivalent dose. When the pediatric preset was used for the scans, there was a decrease in the ratio of equivalent dose to the child mandible and thyroid. CBCT scans with adult settings on both phantom heads resulted in higher radiation doses to the head and neck organs in the child than in the adult. In practice, this might result in excessive radiation to children scanned with default adult settings. Collimation should be used when possible to reduce the radiation dose to the patient. While CBCT offers a valuable tool, its use should be justified on a specific case-by-case basis. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  10. Social Media Metrics Importance and Usage Frequency in Latvia

    Directory of Open Access Journals (Sweden)

    Ronalds Skulme

    2017-12-01

    Full Text Available. Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and are most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First of all, theoretical research about social media metrics was conducted. Authors collected information about social media metric grouping methods and the most frequently mentioned social media metrics in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which social media metrics are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether social media metrics importance varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and the usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  11. RAAK PRO project: measuring safety in aviation : concept for the design of new metrics

    NARCIS (Netherlands)

    Karanikas, Nektarios; Kaspers, Steffen; Roelen, Alfred; Piric, Selma; van Aalst, Robbert; de Boer, Robert

    2017-01-01

    Following the completion of the 1st phase of the RAAK PRO project Aviation Safety Metrics, during which the researchers mapped the current practice in safety metrics and explored the validity of monotonic relationships of SMS, activity and demographic metrics with safety outcomes, this report

  12. Design parameters for toroidal and bobbin magnetics. [conversion from English to metric units

    Science.gov (United States)

    Mclyman, W. T.

    1974-01-01

    The adoption by NASA of the metric system for dimensioning to replace long-used English units imposes a requirement on the U.S. transformer designer to convert from the familiar units to the less familiar metric equivalents. Material is presented to assist in that transition in the field of transformer design and fabrication. The conversion data makes it possible for the designer to obtain a fast and close approximation of significant parameters such as size, weight, and temperature rise. Nomographs are included to provide a close approximation for breadboarding purposes. For greater convenience, derivations of some of the parameters are also presented.
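
    The English-to-metric conversions involved are simple multiplicative factors. A minimal sketch using standard conversion factors (these are general unit conversions, not the nomographs or design parameters from the report itself):

    ```python
    # Conversion factors commonly needed in transformer design work.
    IN_TO_CM = 2.54           # length: inches to centimetres (exact)
    OZ_TO_G = 28.349523125    # weight: avoirdupois ounces to grams (exact)
    CMIL_TO_MM2 = 5.067e-4    # wire area: circular mils to mm^2 (approx.)

    def in_to_cm(inches):
        return inches * IN_TO_CM

    def oz_to_g(ounces):
        return ounces * OZ_TO_G

    def cmil_to_mm2(cmil):
        return cmil * CMIL_TO_MM2

    # Example: a 1.5 in bobbin dimension and a 1000 cmil winding wire
    print(in_to_cm(1.5), cmil_to_mm2(1000))
    ```

    Keeping the factors in named constants mirrors the report's goal: a fast, close approximation of size and weight parameters during the transition to metric dimensioning.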

  13. Creation of a retrospective job-exposure matrix using surrogate measures of exposure for a cohort of US career firefighters from San Francisco, Chicago and Philadelphia.

    Science.gov (United States)

    Dahm, Matthew M; Bertke, Stephen; Allee, Steve; Daniels, Robert D

    2015-09-01

    To construct a cohort-specific job-exposure matrix (JEM) using surrogate metrics of exposure for a cancer study on career firefighters from the Chicago, Philadelphia and San Francisco Fire Departments. Departmental work history records, along with data on historical annual fire-runs and hours, were collected from 1950 to 2009 and coded into separate databases. These data were used to create a JEM based on standardised job titles and fire apparatus assignments using several surrogate exposure metrics to estimate firefighters' exposure to the combustion byproducts of fire. The metrics included duration of exposure (cumulative time with a standardised exposed job title and assignment), fire-runs (cumulative events of potential fire exposure) and time at fire (cumulative hours of potential fire exposure). The JEM consisted of 2298 unique job titles alongside 16,174 fire apparatus assignments from the three departments, which were collapsed into 15 standardised job titles and 15 standardised job assignments. Correlations were found between fire-runs and time at fires (Pearson coefficient=0.92), duration of exposure and time at fires (Pearson coefficient=0.85), and duration of exposure and fire-runs (Pearson coefficient=0.82). Total misclassification rates were found to be between 16% and 30% when using duration of employment as an exposure surrogate, which has traditionally been used in most epidemiological studies, compared with using the duration of exposure surrogate metric. The constructed JEM successfully differentiated firefighters based on gradient levels of potential exposure to the combustion byproducts of fire using multiple surrogate exposure metrics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
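
    The pairwise agreement between surrogate metrics reported above is a plain Pearson correlation, which can be computed without any external libraries. The per-firefighter numbers below are invented for illustration, not the study's data:

    ```python
    import math

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical per-firefighter surrogate exposure metrics
    fire_runs    = [120, 340, 90, 560, 210]   # cumulative fire-runs
    time_at_fire = [80, 250, 70, 400, 160]    # cumulative hours at fires

    r = pearson(fire_runs, time_at_fire)
    # Strong positive correlation, as the study found between these surrogates
    ```

    High correlations between surrogates (0.82 to 0.92 in the study) indicate the metrics rank firefighters similarly, even though the misclassification analysis shows they are not interchangeable with duration of employment.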

  14. From fundamental limits to radioprotection practice

    International Nuclear Information System (INIS)

    Henry, P.; Chassany, J.

    1980-01-01

    The individual dose limits fixed by present French legislation for different categories of people refer to dose equivalents received by or delivered to the whole body or to certain tissues or organs over given periods of time. The values concerning personnel directly engaged in work with radiation are summed up in a table. These are the limits which radioprotection authorities must impose, while ensuring that exposure levels are kept as low as possible. With the means available in practical radioprotection it is not possible to measure dose equivalents directly, but information may be obtained on dose rates, absorbed doses, particle fluxes, and activities per unit volume and per surface area. An interpretation of these measurements is necessary if efficient supervision of worker exposure is to be achieved. [fr]

  15. The AGIS metric and time of test: A replication study

    OpenAIRE

    Counsell, S; Swift, S; Tucker, A

    2016-01-01

    Visual Field (VF) tests and corresponding data are commonly used in clinical practices to manage glaucoma. The standard metric used to measure glaucoma severity is the Advanced Glaucoma Intervention Studies (AGIS) metric. We know that time of day when VF tests are applied can influence a patient’s AGIS metric value; a previous study showed that this was the case for a data set of 160 patients. In this paper, we replicate that study using data from 2468 patients obtained from Moorfields Eye Ho...

  16. Application of the modified chi-square ratio statistic in a stepwise procedure for cascade impactor equivalence testing.

    Science.gov (United States)

    Weber, Benjamin; Lee, Sau L; Delvadia, Renishkumar; Lionberger, Robert; Li, Bing V; Tsong, Yi; Hochhaus, Guenther

    2015-03-01

    Equivalence testing of aerodynamic particle size distribution (APSD) through multi-stage cascade impactors (CIs) is important for establishing bioequivalence of orally inhaled drug products. Recent work demonstrated that the median of the modified chi-square ratio statistic (MmCSRS) is a promising metric for APSD equivalence testing of test (T) and reference (R) products, as it can be applied to a reduced number of CI sites that are more relevant for lung deposition. This metric is also less sensitive to the increased variability often observed for low-deposition sites. A method to establish critical values for the MmCSRS is described here. This method considers the variability of the R product by employing a reference variance scaling approach that allows definition of critical values as a function of the observed variability of the R product. A stepwise CI equivalence test is proposed that integrates the MmCSRS as a method for comparing the relative shapes of CI profiles and incorporates statistical tests for assessing equivalence of single actuation content and impactor sized mass. This stepwise CI equivalence test was applied to 55 published CI profile scenarios, which were classified as equivalent or inequivalent by members of the Product Quality Research Institute working group (PQRI WG). The results of the stepwise CI equivalence test using a 25% difference in MmCSRS as an acceptance criterion provided the best match with the PQRI WG classifications, with the two methods agreeing in 75% of the 55 CI profile scenarios.
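
    The exact definition of the MmCSRS is given in the cited work; the sketch below only conveys the underlying idea and should not be read as the published formula. It assumes each CI site contributes a chi-square-like ratio of squared T-R difference to mean deposition, and that the median is taken across all T/R profile pairings:

    ```python
    import statistics

    def mcsrs(test, ref, floor=0.0):
        """Hypothetical chi-square-like ratio statistic for one T/R pair.

        Each deposition site contributes the squared T-R difference
        normalized by the mean deposition at that site; sites at or
        below `floor` are excluded, reflecting the reduced sensitivity
        to low-deposition sites mentioned in the abstract.
        """
        terms = []
        for t, r in zip(test, ref):
            mean = (t + r) / 2.0
            if mean > floor:
                terms.append((t - r) ** 2 / mean)
        return sum(terms) / len(terms)

    def mmcsrs(test_profiles, ref_profiles):
        """Median of the statistic across all T/R profile pairings."""
        values = [mcsrs(t, r) for t in test_profiles for r in ref_profiles]
        return statistics.median(values)
    ```

    Taking the median across pairings makes the summary robust to occasional high-variability profile pairs, which is the motivation the abstract gives for preferring the MmCSRS over a single-pair statistic.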

  17. Integrated Metrics for Improving the Life Cycle Approach to Assessing Product System Sustainability

    Directory of Open Access Journals (Sweden)

    Wesley Ingwersen

    2014-03-01

    Full Text Available. Life cycle approaches are critical for identifying and reducing environmental burdens of products. While these methods can indicate potential environmental impacts of a product, current Life Cycle Assessment (LCA) methods fail to integrate the multiple impacts of a system into unified measures of social, economic or environmental performance related to sustainability. Integrated metrics that combine multiple aspects of system performance based on a common scientific or economic principle have proven to be valuable for sustainability evaluation. In this work, we propose methods of adapting four integrated metrics for use with LCAs of product systems: ecological footprint, emergy, green net value added, and Fisher information. These metrics provide information on the full product system in land, energy, monetary equivalents, and as a unitless information index; each bundled with one or more indicators for reporting. When used together and for relative comparison, integrated metrics provide a broader coverage of sustainability aspects from multiple theoretical perspectives that is more likely to illuminate potential issues than individual impact indicators. These integrated metrics are recommended for use in combination with traditional indicators used in LCA. Future work will test and demonstrate the value of using these integrated metrics and combinations to assess product system sustainability.

  18. Deep Energy Retrofit Performance Metric Comparison: Eight California Case Studies

    Energy Technology Data Exchange (ETDEWEB)

    Walker, Iain [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fisher, Jeremy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Less, Brennan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-06-01

    In this paper we will present the results of monitored annual energy use data from eight residential Deep Energy Retrofit (DER) case studies using a variety of performance metrics. For each home, the details of the retrofits were analyzed, diagnostic tests to characterize the home were performed and the homes were monitored for total and individual end-use energy consumption for approximately one year. Annual performance in site and source energy, as well as carbon dioxide equivalent (CO2e) emissions were determined on a per house, per person and per square foot basis to examine the sensitivity to these different metrics. All eight DERs showed consistent success in achieving substantial site energy and CO2e reductions, but some projects achieved very little, if any source energy reduction. This problem emerged in those homes that switched from natural gas to electricity for heating and hot water, resulting in energy consumption dominated by electricity use. This demonstrates the crucial importance of selecting an appropriate metric to be used in guiding retrofit decisions. Also, due to the dynamic nature of DERs, with changes in occupancy, size, layout, and comfort, several performance metrics might be necessary to understand a project’s success.
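
    The site-versus-source divergence described in this abstract can be made concrete with a toy calculation. The site-to-source multipliers below are assumptions (roughly 3.15 for grid electricity and 1.09 for natural gas; actual factors vary by region and year), and the retrofit numbers are invented:

    ```python
    # Assumed site-to-source conversion factors (illustrative, region-dependent)
    SOURCE_FACTOR = {"electricity": 3.15, "gas": 1.09}

    def source_energy(site_use):
        """Convert per-fuel site energy (kWh) to total source energy (kWh)."""
        return sum(SOURCE_FACTOR[fuel] * kwh for fuel, kwh in site_use.items())

    # Hypothetical retrofit that swaps gas heating for electric heat pumps
    before = {"electricity": 4000.0, "gas": 12000.0}
    after  = {"electricity": 7000.0, "gas": 0.0}

    site_saving = 1 - sum(after.values()) / sum(before.values())
    source_saving = 1 - source_energy(after) / source_energy(before)
    # Site energy drops by more than half, source energy by far less,
    # mirroring the fuel-switching effect observed in the case studies.
    ```

    This is exactly why the paper stresses metric selection: a fuel-switching retrofit that looks excellent in site energy can show little source-energy benefit once the electricity multiplier is applied.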

  19. [Clinical trial data management and quality metrics system].

    Science.gov (United States)

    Chen, Zhao-hua; Huang, Qin; Deng, Ya-zhong; Zhang, Yue; Xu, Yu; Yu, Hao; Liu, Zong-fan

    2015-11-01

    Data quality management systems are essential to ensure accurate, complete, consistent, and reliable data collection in clinical research. This paper is devoted to various choices of data quality metrics. They are categorized by study status, e.g. study start-up, conduct, and close-out. In each category, metrics for different purposes are listed according to ALCOA+ principles such as completeness, accuracy, timeliness, traceability, etc. Some frequently used general quality metrics are also introduced. This paper provides as much detail as possible for each metric, giving its definition, purpose, evaluation, referenced benchmark, and recommended targets in favor of real practice. It is important that sponsors and data management service providers establish a robust integrated clinical trial data quality management system to ensure sustainably high quality of clinical trial deliverables. It will also support enterprise-level data evaluation and benchmarking of data quality across projects, sponsors, and data management service providers by using objective metrics from real clinical trials. We hope this will be a significant input to accelerate the improvement of clinical trial data quality in the industry.

  20. Sharp metric obstructions for quasi-Einstein metrics

    Science.gov (United States)

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  1. Metric Relativity and the Dynamical Bridge: highlights of Riemannian geometry in physics

    Energy Technology Data Exchange (ETDEWEB)

    Novello, Mario [Centro Brasileiro de Pesquisas Fisicas (ICRA/CBPF), Rio de Janeiro, RJ (Brazil). Instituto de Cosmologia Relatividade e Astrofisica; Bittencourt, Eduardo, E-mail: eduardo.bittencourt@icranet.org [Physics Department, La Sapienza University of Rome (Italy)

    2015-12-15

    We present an overview of recent developments concerning modifications of the geometry of space-time to describe various physical processes of interactions among classical and quantum configurations. We concentrate on two main lines of research: the Metric Relativity and the Dynamical Bridge. We describe the notion of an equivalent (dragged) metric ĝμν which is responsible for mapping the path of any accelerated body in Minkowski space-time onto a geodesic motion in the associated ĝ geometry. Only recently was the method introduced by Einstein in general relativity used beyond the domain of gravitational forces, to map arbitrary accelerated bodies submitted to non-Newtonian attractions onto geodesics of a modified geometry. This process has its roots in the very ancient idea of treating any dynamical problem in Classical Mechanics as nothing but a problem of statics, in which all forces acting on a body, including the inertial ones, annihilate one another. This general procedure, which concerns arbitrary forces - beyond the uses of General Relativity, which is limited to gravitational processes - is nothing but the relativistic version of the d'Alembert method in classical mechanics and constitutes the principle of Metric Relativity. The main difference between the gravitational interaction and all other forces concerns the universality of gravity, which, combined with the interpretation of the equivalence principle, allows all associated geometries - one for each different body in the case of non-gravitational forces - to be unified into a unique Riemannian space-time structure. The same geometrical description appears for electromagnetic waves in the optical limit within the context of nonlinear theories or material media. Since the so-called analogue models of gravity are widely discussed in the literature, we dedicate a few sections to them, emphasizing their relation with the new concepts introduced here. We then pass to the description of the Dynamical Bridge formalism.

  2. Standard Practice for Visual Inspections of Photovoltaic Modules

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers procedures and criteria for visual inspections of photovoltaic modules. 1.2 Visual inspections of photovoltaic modules are normally performed before and after modules have been subjected to environmental, electrical, or mechanical stress testing, such as thermal cycling, humidity-freeze cycling, damp heat exposure, ultraviolet exposure, mechanical loading, hail impact testing, outdoor exposure, or other stress testing that may be part of a photovoltaic module testing sequence. 1.3 This practice does not establish pass or fail levels. The determination of acceptable or unacceptable results is beyond the scope of this practice. 1.4 There is no similar or equivalent ISO standard. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  3. Wireless sensor network performance metrics for building applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, W.S. (Department of Civil Engineering Yeungnam University 214-1 Dae-Dong, Gyeongsan-Si Gyeongsangbuk-Do 712-749 South Korea); Healy, W.M. [Building and Fire Research Laboratory, 100 Bureau Drive, Gaithersburg, MD 20899-8632 (United States)

    2010-06-15

    Metrics are investigated to help assess the performance of wireless sensors in buildings. Wireless sensor networks present tremendous opportunities for energy savings and improvement in occupant comfort in buildings by making data about conditions and equipment more readily available. A key barrier to their adoption, however, is the uncertainty among users regarding the reliability of the wireless links through building construction. Tests were carried out that examined three performance metrics as a function of transmitter-receiver separation distance, transmitter power level, and obstruction type. These tests demonstrated, via the packet delivery rate, a clear transition from reliable to unreliable communications at different separation distances. While the packet delivery rate is difficult to measure in actual applications, the received signal strength indication correlated well with the drop in packet delivery rate in the relatively noise-free environment used in these tests. The concept of an equivalent distance was introduced to translate the range of reliability in open field operation to that seen in a typical building, thereby providing wireless system designers a rough estimate of the necessary spacing between sensor nodes in building applications. It is anticipated that the availability of straightforward metrics on the range of wireless sensors in buildings will enable more widespread sensing in buildings for improved control and fault detection. (author)
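
    The packet delivery rate and equivalent-distance metrics described in this abstract can be sketched as follows. The log-distance path-loss model and the exponent value are assumptions for illustration, not parameters from the paper:

    ```python
    def packet_delivery_rate(received, sent):
        """Fraction of transmitted packets successfully received."""
        return received / sent if sent else 0.0

    def equivalent_distance(distance_m, wall_loss_db, path_loss_exp=2.0):
        """Open-field distance with the same total path loss as distance_m
        plus an obstruction, under a log-distance path-loss model
        (the exponent n=2, free space, is an assumption)."""
        return distance_m * 10 ** (wall_loss_db / (10 * path_loss_exp))

    pdr = packet_delivery_rate(970, 1000)
    # At n=2, a 6 dB wall loss roughly doubles the effective distance:
    d_eq = equivalent_distance(10.0, 6.0)
    ```

    The equivalent distance translates a measured obstruction penalty into the open-field range a designer already has intuition for, which is how the paper proposes estimating node spacing inside buildings.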

  4. Centrality metrics and localization in core-periphery networks

    International Nuclear Information System (INIS)

    Barucca, Paolo; Lillo, Fabrizio; Tantari, Daniele

    2016-01-01

    Two concepts of centrality have been defined in complex networks. The first considers the centrality of a node, and many different metrics for it have been defined (e.g. eigenvector centrality, PageRank, non-backtracking centrality, etc). The second is related to the large scale organization of the network, the core-periphery structure, composed of a dense core plus an outlying and loosely-connected periphery. In this paper we investigate the relation between these two concepts. We consider networks generated via the stochastic block model, or its degree corrected version, with a core-periphery structure and we investigate the centrality properties of the core nodes and the ability of several centrality metrics to identify them. We find that the three measures with the best performance are marginals obtained with belief propagation, PageRank, and degree centrality, while non-backtracking and eigenvector centrality (or MINRES [10], shown to be equivalent to the latter in the large network limit) perform worse in the investigated networks. (paper: interdisciplinary statistical mechanics)
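
    The two centrality metrics that the abstract reports as top performers, degree centrality and PageRank, can be computed from scratch on a toy core-periphery graph. The graph below is a minimal illustrative example, not one of the stochastic block model networks studied in the paper:

    ```python
    def pagerank(adj, damping=0.85, iters=100):
        """Plain power-iteration PageRank on an adjacency dict
        (assumes every node has at least one neighbor)."""
        nodes = list(adj)
        n = len(nodes)
        rank = {v: 1.0 / n for v in nodes}
        for _ in range(iters):
            new = {v: (1 - damping) / n for v in nodes}
            for v in nodes:
                share = damping * rank[v] / len(adj[v])
                for w in adj[v]:
                    new[w] += share
            rank = new
        return rank

    # Toy core-periphery network: a dense core {0,1,2} plus pendant
    # periphery nodes {3,4,5}, each attached to one core node.
    adj = {
        0: [1, 2, 3], 1: [0, 2, 4], 2: [0, 1, 5],
        3: [0], 4: [1], 5: [2],
    }
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    pr = pagerank(adj)
    # Both metrics rank the three core nodes above the periphery here.
    ```

    On this toy graph both metrics cleanly separate core from periphery; the paper's point is that this separation survives (for degree, PageRank, and belief-propagation marginals) but degrades for eigenvector-type metrics, which suffer from localization in larger networks.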

  5. Stationary metrics and optical Zermelo-Randers-Finsler geometry

    International Nuclear Information System (INIS)

    Gibbons, G. W.; Warnick, C. M.; Herdeiro, C. A. R.; Werner, M. C.

    2009-01-01

    We consider a triality between the Zermelo navigation problem, the geodesic flow on a Finslerian geometry of Randers type, and spacetimes in one dimension higher admitting a timelike conformal Killing vector field. From the latter viewpoint, the data of the Zermelo problem are encoded in a (conformally) Painleve-Gullstrand form of the spacetime metric, whereas the data of the Randers problem are encoded in a stationary generalization of the usual optical metric. We discuss how the spacetime viewpoint gives a simple and physical perspective on various issues, including how Finsler geometries with constant flag curvature always map to conformally flat spacetimes, and how the Finsler condition maps to either a causality condition or a breakdown at an ergo surface in the spacetime picture. The gauge equivalence in this network of relations is considered, as well as the connection to analogue models and the viewpoint of magnetic flows. We provide a variety of examples.

  6. Do projections from bioclimatic envelope models and climate change metrics match?

    DEFF Research Database (Denmark)

    Garcia, Raquel A.; Cabeza, Mar; Altwegg, Res

    2016-01-01

    as indicators of the exposure of species to climate change. Here, we investigate whether these two approaches provide qualitatively similar indications about where biodiversity is potentially most exposed to climate change. Location: Sub-Saharan Africa. Methods: We compared a range of climate change metrics...... for sub-Saharan Africa with ensembles of bioclimatic envelope models for 2723 species of amphibians, snakes, mammals and birds. For each taxonomic group, we performed three comparisons between the two approaches: (1) is projected change in local climatic suitability (models) greater in grid cells...... between the two approaches was found for all taxonomic groups, although it was stronger for species with a narrower climatic envelope breadth. Main conclusions: For sub-Saharan African vertebrates, projected patterns of exposure to climate change given by climate change metrics alone were qualitatively...

  7. 10 CFR 835.203 - Combining internal and external equivalent doses.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Combining internal and external equivalent doses. 835.203 Section 835.203 Energy DEPARTMENT OF ENERGY OCCUPATIONAL RADIATION PROTECTION Standards for Internal and External Exposure § 835.203 Combining internal and external equivalent doses. (a) The total effective dose...

  8. Football Players' Head-Impact Exposure After Limiting of Full-Contact Practices.

    Science.gov (United States)

    Broglio, Steven P; Williams, Richelle M; O'Connor, Kathryn L; Goldstick, Jason

    2016-07-01

    Sporting organizations limit full-contact football practices to reduce concussion risk, based in part on speculation that repeated head impacts may result in long-term neurodegeneration. To directly compare head-impact exposure in high school football players before and after a statewide restriction on full-contact practices. Cross-sectional study. High school football field. Participants were varsity football athletes from a single high school. Before the rule change, 26 athletes (age = 16.2 ± 0.8 years, height = 179.6 ± 6.4 cm, weight = 81.9 ± 13.1 kg) participated. After the rule change, 24 athletes (age = 15.9 ± 0.8 years, height = 178.3 ± 6.5 cm, weight = 76.2 ± 11.6 kg) participated. Nine athletes participated in both years of the investigation. Head-impact exposure was monitored using the Head Impact Telemetry System while the athletes participated in football games and practices in the seasons before and after the rule change. Head-impact frequency, location, and magnitude (ie, linear acceleration, rotational acceleration, and Head Impact Telemetry severity profile [HITsp]) were measured. A total of 15 398 impacts (592 impacts per player per season) were captured before the rule change and 8269 impacts (345 impacts per player per season) after the change. An average 42% decline in impact exposure occurred across all players, with practice-exposure declines occurring among linemen (46% decline); receivers, cornerbacks, and safeties (41% decline); and tight ends, running backs (including fullbacks), and linebackers (39% decline). Impact magnitudes remained largely unchanged between the years. A rule change limiting full-contact high school football practices appears to have been effective in reducing head-impact exposure across all players, with the largest reduction occurring among linemen. This finding is likely associated with the rule modification, particularly because the coaching staff and offensive scheme remained consistent, yet how
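    The per-player rates and the overall decline reported in this abstract follow directly from the quoted season totals; a quick arithmetic check:

    ```python
    # Season totals quoted in the abstract.
    before_total, before_players = 15398, 26
    after_total, after_players = 8269, 24

    before_rate = before_total / before_players   # ~592 impacts per player per season
    after_rate = after_total / after_players      # ~345 impacts per player per season

    # Relative decline in per-player impact exposure.
    decline = (before_rate - after_rate) / before_rate
    print(round(before_rate), round(after_rate), f"{decline:.0%}")  # 592 345 42%
    ```

    This reproduces the 592 and 345 impacts-per-player figures and the reported average decline of about 42%.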

  9. Patient exposure in general dental practice in the Netherlands

    International Nuclear Information System (INIS)

    Velders, X.L.; Selling, H.A.

    1988-01-01

    To estimate the population risk due to dental radiography, an investigation was started among 1200 dental practitioners. A questionnaire was set up to inventory commonly applied indications for X-ray examinations, the number of examinations, and the organizational actions taken by dentists to limit radiation doses to patients. Information was gathered on the type of X-ray machines, the use of aiming devices, protective measures for patients and dental staff, developing procedures and the type of films. A number of practical tests were applied to obtain a quantitative impression of patient doses under particular circumstances. For the practical tests, films and lithium fluoride TLD-100 chips (Harshaw) were used to determine the beam diameter, the exposure of the X-ray machine and the scatter at a set distance from the middle of the beam, the developing circumstances, as well as entrance and exit skin doses measured on the skin of a patient. The results of 544 dental practices will be discussed. Finally, an estimate of the possible extent of the reduction in patient exposure in the Netherlands will be made.

  10. $\\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive type maps on these spaces. In particular we show that these $\\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  11. Comparative Study of Trace Metrics between Bibliometrics and Patentometrics

    Directory of Open Access Journals (Sweden)

    Fred Y. Ye

    2016-06-01

    Full Text Available Purpose: To comprehensively evaluate the overall performance of a group or an individual in both bibliometrics and patentometrics. Design/methodology/approach: Trace metrics were applied to the top 30 universities in the 2014 Academic Ranking of World Universities (ARWU) in computer sciences, the top 30 ESI highly cited papers in the computer sciences field in 2014, as well as the top 30 assignees and the top 30 most cited patents in the National Bureau of Economic Research (NBER) computer hardware and software category. Findings: We found that, by applying trace metrics, the research or marketing impact efficiency, at both group and individual levels, was clearly observed. Furthermore, trace metrics were more sensitive to the different publication-citation distributions than the average citation and h-index were. Research limitations: Trace metrics consider publications with zero citations as negative contributions. One should clarify how he/she evaluates a zero-citation paper or patent before applying trace metrics. Practical implications: Decision makers could regularly examine the performance of their university/company by applying trace metrics and adjust their policies accordingly. Originality/value: Trace metrics can be applied in both bibliometrics and patentometrics and provide a comprehensive view. Moreover, the high sensitivity and the unique impact-efficiency view provided by trace metrics can facilitate decision makers in examining and adjusting their policies.

  12. The practical application of ICRP recommendations regarding dose-equivalent limits for workers to staff in diagnostic X-ray departments

    International Nuclear Information System (INIS)

    Gill, J.R.; Beaver, P.F.; Dennis, J.A.

    1980-01-01

    Members of hospital staff who work in the X-ray room with patients, wear lead aprons to protect their bodies. These aprons greatly reduce the radiation dose rate at the surface of the body underneath the apron, but do not give any protection to parts of the body not covered by the apron, especially the head, neck, arms and legs. The ICRP's system of dose limitation for non-uniform irradiation of the body has been applied to exposure of this kind and a simple formula has been derived that permits the calculation of a good approximation to the effective dose-equivalent, using two dosemeters. One dosemeter is worn at chest or waist level under the apron to monitor the dose-equivalent received by protected organs while the other is worn on the collar or forehead to monitor the head and neck. Evidence based on published data is presented that suggests that in work of this nature, contrary to earlier opinion, the limiting factor is the dose equivalent received by the organs of the head and neck. The implications of this conclusion for routine personal monitoring are discussed. (H.K.)

  13. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    Science.gov (United States)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized by their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). The insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of guidance grounded in theory for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  14. A Novel Riemannian Metric Based on Riemannian Structure and Scaling Information for Fixed Low-Rank Matrix Completion.

    Science.gov (United States)

    Mao, Shasha; Xiong, Lin; Jiao, Licheng; Feng, Tian; Yeung, Sai-Kit

    2017-05-01

    Riemannian optimization has been widely used to deal with the fixed low-rank matrix completion problem, and the Riemannian metric is a crucial factor in obtaining the search direction in Riemannian optimization. This paper proposes a new Riemannian metric that simultaneously considers the Riemannian geometry structure and the scaling information, and that is smoothly varying and invariant along the equivalence class. The proposed metric can make an effective tradeoff between the Riemannian geometry structure and the scaling information. Essentially, it can be viewed as a generalization of some existing metrics. Based on the proposed Riemannian metric, we also design a Riemannian nonlinear conjugate gradient algorithm, which can efficiently solve the fixed low-rank matrix completion problem. Experiments on fixed low-rank matrix completion, collaborative filtering, and image and video recovery illustrate that the proposed method is superior to the state-of-the-art methods in convergence efficiency and numerical performance.

  15. The Relationship of Practice Exposure and Injury Rate on Game Performance and Season Success in Professional Male Basketball

    Directory of Open Access Journals (Sweden)

    Toni Caparrós, Eduard Alentorn-Geli, Gregory D. Myer, Lluís Capdevila, Kristian Samuelsson, Bruce Hamilton, Gil Rodas

    2016-09-01

    Full Text Available The objectives of this study were to determine the relationship among game performance, injury rate, and practice exposure in a professional male basketball team. A retrospective analysis of prospectively collected data was conducted over seven consecutive seasons (2007/2008 to 2013/2014). Data collection included sports performance during competition (statistical evaluation), injury rate, and total exposure (games and practices). Over the surveillance period, 162 injuries (91 in practice; 71 in matches) occurred over 32,668 hours of exposure (556 games and 2005 practices). There was a strong positive correlation between: (1) exposure (total number of practices and hours of exposure) and the total number of injuries (r = 0.77; p = 0.04); (2) exposure (total hours of exposure and total hours of practice exposure) and performance (total team ranking) (r = 0.77, p = 0.04, and r = 0.8, p = 0.03, respectively); and (3) the total number of injuries and performance (total team ranking) (r = 0.84; p = 0.02). While increasing practice and competition time is related to greater team performance, it also increases the number of injuries. However, higher injury rates were not associated with worse overall team performance. Efforts to reduce high-risk activity during practice, optimally replaced with injury prevention training, might help to reduce injury risk.
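    Spearman rank correlations like those reported here (r = 0.77, r = 0.84) are Pearson correlations computed on ranks. A minimal pure-Python sketch on made-up season data (illustrative numbers, not the team's actual data):

    ```python
    def ranks(xs):
        # Average ranks; tied values share the mean of their rank positions.
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        r = [0.0] * len(xs)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # ranks are 1-based
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    def spearman(xs, ys):
        # Spearman's rho = Pearson correlation of the rank vectors.
        rx, ry = ranks(xs), ranks(ys)
        mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        vx = sum((a - mx) ** 2 for a in rx)
        vy = sum((b - my) ** 2 for b in ry)
        return cov / (vx * vy) ** 0.5

    # Hypothetical seasons: hours of exposure vs. number of injuries.
    exposure = [4100, 4500, 4700, 4800, 5000, 5200, 4400]
    injuries = [18, 20, 25, 23, 27, 28, 21]
    print(round(spearman(exposure, injuries), 2))
    ```

    A strong positive rho on such data mirrors the study's finding that more exposure goes with more injuries, without implying that injuries drive worse ranking.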

  16. Social Media Metrics Importance and Usage Frequency in Latvia

    OpenAIRE

    Ronalds Skulme

    2017-01-01

    Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and are most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First of all, theoretical research about social media metrics was...

  17. Synchronization of multi-agent systems with metric-topological interactions.

    Science.gov (United States)

    Wang, Lin; Chen, Guanrong

    2016-09-01

    A hybrid multi-agent systems model integrating the advantages of both metric and topological interaction rules, called the metric-topological model, is developed. This model describes planar motions of mobile agents, where each agent interacts with all agents within a circle of constant radius and can furthermore interact with some distant agents, if needed, to reach a pre-assigned number of neighbors. Some sufficient conditions, imposed only on system parameters and agent initial states, are presented which ensure synchronization of the whole group of agents. This reveals the intrinsic relationships among the interaction range, the speed, the initial heading, and the density of the group. Moreover, robustness against variations in interaction range, density, and speed is investigated by comparing the motion patterns and performances of the hybrid metric-topological interaction model with the conventional metric-only and topological-only interaction models. In practically all cases, the hybrid metric-topological interaction model has the best performance, in the sense of achieving the highest frequency of synchronization, the fastest convergence rate, and the smallest heading difference.
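    The hybrid neighbor rule described in the abstract (take all agents within a fixed radius, then top up with the nearest distant agents until a pre-assigned neighbor count is reached) can be sketched as follows; positions and parameters here are illustrative, not from the paper:

    ```python
    import math

    def hybrid_neighbors(positions, i, radius, k):
        # Metric part: all agents within `radius` of agent i.
        # Topological part: add nearest remaining agents until i has
        # at least k neighbors (when enough agents exist).
        others = [(math.dist(positions[i], positions[j]), j)
                  for j in range(len(positions)) if j != i]
        others.sort()
        inside = [j for d, j in others if d <= radius]
        extra = [j for d, j in others if d > radius]
        need = max(0, k - len(inside))
        return inside + extra[:need]

    pts = [(0, 0), (1, 0), (0.5, 0.5), (5, 5), (6, 5)]
    print(hybrid_neighbors(pts, 0, radius=1.5, k=3))  # [2, 1, 3]
    ```

    With k larger than the in-radius neighborhood, agent 0 also links to the nearest distant agent, which is exactly how the hybrid model avoids the isolated clusters a metric-only rule can produce.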

  18. Equivalent Hermitian Hamiltonian for the non-Hermitian -x^4 potential

    International Nuclear Information System (INIS)

    Jones, H.F.; Mateo, J.

    2006-01-01

    The potential V(x) = -x^4, which is unbounded below on the real line, can give rise to a well-posed bound-state problem when x is taken on a contour in the lower-half complex plane. It is then PT-symmetric rather than Hermitian. Nonetheless it has been shown numerically to have a real spectrum, and a proof of reality, involving the correspondence between ordinary differential equations and integrable systems, was subsequently constructed for the general class of potentials -(ix)^N. For such Hamiltonians the natural PT metric is not positive definite, but a dynamically defined positive-definite metric can be constructed, depending on an operator Q. Further, with the help of this operator an equivalent Hermitian Hamiltonian h can be constructed. This programme has been carried out exactly for a few soluble models, and the first few terms of a perturbative expansion have been found for the potential m^2 x^2 + igx^3. However, until now, the -x^4 potential has proved intractable. In the present paper we give explicit, closed-form expressions for Q and h, which are made possible by a particular parametrization of the contour in the complex plane on which the problem is defined. This constitutes an explicit proof of the reality of the spectrum. The resulting equivalent Hamiltonian has a potential with a positive quartic term together with a linear term.

  19. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards

    Directory of Open Access Journals (Sweden)

    Charles S. Mayo, PhD

    2017-07-01

    Conclusions: Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.

  20. Self-reported sleep disturbances due to railway noise: exposure-response relationships for nighttime equivalent and maximum noise levels.

    Science.gov (United States)

    Aasvang, Gunn Marit; Moum, Torbjorn; Engdahl, Bo

    2008-07-01

    The objective of the present survey was to study self-reported sleep disturbances due to railway noise with respect to nighttime equivalent noise level (L(p,A,eq,night)) and maximum noise level (L(p,A,max)). A sample of 1349 people in and around Oslo in Norway exposed to railway noise was studied in a cross-sectional survey to obtain data on sleep disturbances, sleep problems due to noise, and personal characteristics including noise sensitivity. Individual noise exposure levels were determined outside of the bedroom facade, the most-exposed facade, and inside the respondents' bedrooms. The exposure-response relationships were analyzed by using logistic regression models, controlling for possible modifying factors including the number of noise events (train pass-by frequency). L(p,A,eq,night) and L(p,A,max) were significantly correlated, and the proportion of reported noise-induced sleep problems increased as both L(p,A,eq,night) and L(p,A,max) increased. Noise sensitivity, type of bedroom window, and pass-by frequency were significant factors affecting noise-induced sleep disturbances, in addition to the noise exposure level. Because about half of the study population did not use a bedroom at the most-exposed side of the house, the exposure-response curve obtained by using noise levels for the most-exposed facade underestimated noise-induced sleep disturbance for those who actually have their bedroom at the most-exposed facade.
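    Exposure-response relationships of the kind analyzed in this survey are typically modeled with logistic regression, where the probability of reporting noise-induced sleep problems at noise level L is 1/(1 + exp(-(b0 + b1*L))). A minimal sketch with illustrative coefficients (placeholders, not values estimated from the survey):

    ```python
    import math

    def p_disturbed(level_db, b0=-8.0, b1=0.12):
        # Logistic exposure-response curve; b0 and b1 are hypothetical
        # placeholders, not coefficients from the Oslo study.
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * level_db)))

    for L in (40, 50, 60, 70):  # nighttime equivalent levels in dB
        print(L, round(p_disturbed(L), 3))
    ```

    In a full analysis, additional covariates such as noise sensitivity, window type, and pass-by frequency would enter the linear predictor alongside the noise level, as the study's logistic models do.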

  1. Exposure to MRI-related magnetic fields and vertigo in MRI workers.

    Science.gov (United States)

    Schaap, Kristel; Portengen, Lützen; Kromhout, Hans

    2016-03-01

    Vertigo has been reported by people working around magnetic resonance imaging (MRI) scanners and was found to increase with increasing strength of scanner magnets. This suggests an association with exposure to static magnetic fields (SMF) and/or motion-induced time-varying magnetic fields (TVMF). This study assessed the association between various metrics of shift-long exposure to SMF and TVMF and self-reported vertigo among MRI workers. We analysed 358 shifts from 234 employees at 14 MRI facilities in the Netherlands. Participants used logbooks to report vertigo experienced during the work day at the MRI facility. In addition, personal exposure to SMF and TVMF was measured during the same shifts, using portable magnetic field dosimeters. Vertigo was reported during 22 shifts by 20 participants and was significantly associated with peak and time-weighted average (TWA) metrics of SMF as well as TVMF exposure. Associations were most evident with full-shift TWA TVMF exposure. The probability of vertigo occurrence during a work shift exceeded 5% at peak exposure levels of 409 mT and 477 mT/s and at full-shift TWA levels of 3 mT and 0.6 mT/s. These results confirm the hypothesis that vertigo is associated with exposure to MRI-related SMF and TVMF. Strong correlations between various metrics of shift-long exposure make it difficult to disentangle the effects of SMF and TVMF exposure, or identify the most relevant exposure metric. On the other hand, this also implies that several metrics of shift-long exposure to SMF and TVMF should perform similarly in epidemiological studies on MRI-related vertigo.

  2. Conformal changes of metrics and the initial-value problem of general relativity

    International Nuclear Information System (INIS)

    Mielke, E.W.

    1977-01-01

    Conformal techniques are reviewed with respect to applications to the initial-value problem of general relativity. Invariant transverse traceless decompositions of tensors, one of its main tools, are related to representations of the group of 'conformeomorphisms' acting on the space of all Riemannian metrics on M. Conformal vector fields, a kernel in the decomposition, are analyzed on compact manifolds with constant scalar curvature. The realization of arbitrary functions as scalar curvature of conformally equivalent metrics, a generalization of Yamabe's (Osaka Math. J.; 12:12 (1960)) conjecture, is applied to the Hamiltonian constraint and to the issue of positive energy of gravitational fields. Various approaches to the solution of the initial-value equations produced by altering the scaling behaviour of the second fundamental form are compared. (author)

  3. The Agony and the Ecstasy: Teaching Marketing Metrics to Undergraduate Business Students

    Science.gov (United States)

    Saber, Jane Lee; Foster, Mary K.

    2011-01-01

    The marketing department of a large business school introduced a new undergraduate course, marketing metrics and analysis. The main materials for this course consisted of a series of online spreadsheets with embedded text and practice problems, a 32-page online metrics primer that included assurance of learning questions and a sample examination…

  4. Dependence on age at intake of committed dose equivalents from radionuclides

    International Nuclear Information System (INIS)

    Adams, N.

    1981-01-01

    The dependence of committed dose equivalents on age at intake is needed to assess the significance of exposures of young persons among the general public resulting from inhaled or ingested radionuclides. The committed dose equivalents, evaluated using ICRP principles, depend on the body dimensions of the young person at the time of intake of a radionuclide and on subsequent body growth. Representation of growth by a series of exponential segments facilitates the derivation of general expressions for the age dependence of committed dose equivalents if metabolic models do not change with age. The additional assumption that intakes of radionuclides in air or food are proportional to a person's energy expenditure (implying age-independent dietary composition) enables the demonstration that the age of the most highly exposed 'critical groups' of the general public from these radionuclides is either about 1 year or 17 years. With the above assumptions the exposure of the critical group is less than three times the exposure of adult members of the general public. Approximate values of committed dose equivalents which avoid both underestimation and excessive overestimation are shown to be obtainable by simplified procedures. Modified procedures are suggested for use if metabolic models change with age. (author)

  5. Risk equivalent of exposure versus dose of radiation

    International Nuclear Information System (INIS)

    Bond, V.P.

    1986-01-01

    This report describes a risk analysis study of low-dose irradiation and the resulting biological effects on a cell. The author describes fundamental differences between the effects of high-level exposure (HLE) and low-level exposure (LLE). He stresses that the concept of absorbed dose to an organ is not a dose but a level of effect produced by a particular number of particles. He discusses the confusion between a linear-proportional representation of dose limits and a threshold-curvilinear representation, suggesting that a LLE is a composite of both systems

  6. Diagnostic on the appropriation of metrics in software medium enterprises of Medellin city

    Directory of Open Access Journals (Sweden)

    Piedad Metaute P.

    2016-06-01

    Full Text Available This article is a result of the investigation "Ownership and use of metrics in medium-sized software enterprises of the city of Medellin." The objective of the research was to assess the ownership and use of metrics in these enterprises and to make recommendations that strengthen both academia and the productive sector on this topic. The methodology was based on a documentary review of international norms, standards, methodologies, guides and tools that address software quality metrics, especially those applicable during software engineering; the main sources consulted were books, journals and articles that could ground the research. Field research was also used: instruments were applied in medium-sized enterprises engaged in building software products, through contact with the people involved in these processes, from which data pertaining to the real contexts where the events occur were obtained. Topics addressed included project control, process control, software engineering, quality control of software products, the timing of metric application, the application of metrics at different stages, certifications in metrics, the methodologies and tools used, the processes where their application contributes, and the types of tests applied, among others. The findings were discussed against the respective regulations, best practices and the needs of the different contexts where software metrics are applied, leading to conclusions and practical implications that allowed an assessment of the ownership and use of metrics in medium-sized software enterprises of Medellin, as well as some suggestions for academia aimed at strengthening the courses responsible for building skills in software engineering, especially in metrics.

  7. [Evaluation of the factorial and metric equivalence of the Sexual Assertiveness Scale (SAS) by sex].

    Science.gov (United States)

    Sierra, Juan Carlos; Santos-Iglesias, Pablo; Vallejo-Medina, Pablo

    2012-05-01

    Sexual assertiveness refers to the ability to initiate sexual activity, refuse unwanted sexual activity, and use contraceptive methods to avoid sexually transmitted diseases, developing healthy sexual behaviors. The Sexual Assertiveness Scale (SAS) assesses these three dimensions. The purpose of this study is to evaluate, using structural equation modeling and differential item functioning, the equivalence of the scale between men and women. Standard scores are also provided. A total of 4,034 participants from 21 Spanish provinces took part in the study. Quota sampling method was used. Results indicate a strict equivalent dimensionality of the Sexual Assertiveness Scale across sexes. One item was flagged by differential item functioning, although it does not affect the scale. Therefore, there is no significant bias in the scale when comparing across sexes. Standard scores show similar Initiation assertiveness scores for men and women, and higher scores on Refusal and Sexually Transmitted Disease Prevention for women. This scale can be used on men and women with sufficient psychometric guarantees.

  8. Occupational hazard: radiation exposure for the urologist: developing a reference standard

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, Seth A.; Rangarajan, Sriram S.; Chen, Tony; Palazzi, Kerrin L.; Langford, J. Scott; Sur, Roger L., E-mail: rlsur@ucsd.edu [Department of Surgery and Division of Urology, U C San Diego Health Science System, San Diego, CA (United States)

    2013-03-15

    Introduction: To date, there is a paucity of literature offering practicing urologists a reference for the amount of radiation exposure received while surgically managing urolithiasis. This study examines the cumulative radiation exposure of a urologist over 9 months. Materials and methods: We present a case series of fluoroscopic exposures of an experienced stone surgeon operating at an academic comprehensive stone center between April and December 2011. Radiation exposure measurements were determined by a thermoluminescent dosimeter worn on the outside of the surgeon's thyroid shield. Estimations of radiation exposure (mrem) per month were charted against fluoroscopy times, using scatter plots to estimate Spearman's rank correlation coefficients. Results: The total 9-month radiation exposure was 87 mrem for deep dose equivalent (DDE), 293 mrem for lens dose equivalent (LDE), and 282 mrem for shallow dose equivalent (SDE). Total fluoroscopy time was 252.44 minutes for 64 ureteroscopies (URSs), 29 percutaneous nephrolithotomies (PNLs), 20 cystoscopies with ureteral stent placements, 9 shock wave lithotripsies (SWLs), 9 retrograde pyelograms (RPGs), 2 endoureterotomies, and 1 ureteral balloon dilation. Spearman's rank correlation coefficients examining the association between fluoroscopy time and radiation exposure were not significant for DDE (p = 0.6, Spearman's rho = 0.2), LDE (p = 0.6, Spearman's rho = 0.2), or SDE (p = 0.6, Spearman's rho = 0.2). Conclusions: Over a 9-month period, total radiation exposures were well below annual accepted limits (DDE 5000 mrem, LDE 15,000 mrem and SDE 50,000 mrem). Although fluoroscopy time did not correlate with radiation exposure, future prospective studies can account for co-variates such as patient obesity and urologist distance from radiation source. (author)
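    As a quick check of the "well below annual accepted limits" conclusion, the reported 9-month totals can be expressed as percentages of the annual limits quoted in the abstract:

    ```python
    # Reported 9-month dosimeter totals (mrem) vs. the annual limits
    # cited in the abstract.
    reported = {"DDE": 87, "LDE": 293, "SDE": 282}
    annual_limit = {"DDE": 5000, "LDE": 15000, "SDE": 50000}

    for k in reported:
        pct = 100 * reported[k] / annual_limit[k]
        print(f"{k}: {reported[k]} mrem = {pct:.1f}% of the annual limit")
    ```

    Each category comes out at roughly 2% or less of its annual limit, consistent with the authors' conclusion.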

  9. Assessment of Effective Dose Equivalent of Indoor 222Rn Daughters in Inchass

    International Nuclear Information System (INIS)

    Ali, E.M.; Taha, T.M.; Gomaa, M.A.; El-Hussein, A.M.; Ahmad, A.A.

    2000-01-01

    The dominant component of the natural radiation dose for the general population comes from the radon gas 222 Rn and its short-lived decay products, Ra A ( 218 Po), Ra B ( 214 Pb), Ra C ( 214 Bi) and Ra C' ( 214 Po), in the breathing air. The objective of the present work is to assess the effective dose equivalent of the inhalation exposure of indoor 222 Rn for occupational workers. Average radon concentrations (Bqm -3 ) were monitored in several departments of the Nuclear Research Center by a radon monitor. We have calculated the lung dose equivalent and the effective dose equivalent for the Egyptian workers due to inhalation exposure to the equilibrium equivalent concentrations of radon daughters, which vary from 0.27 to 2.5 mSvy -1 and from 0.016 to 0.152 mSvy -1 , respectively. The annual effective doses obtained are within the accepted range of ICRP recommendations.

  10. Change of annual collective dose equivalent of radiation workers at KURRI

    International Nuclear Information System (INIS)

    Okamoto, Kenichi

    1994-01-01

    The change in the exposure dose equivalent of radiation workers at KURRI (Kyoto University Research Reactor Institute) over the past 30 years is reported together with the operational record. The reactor achieved criticality on June 24, 1964 and reached its normal power of 1000 kW on August 17 of the same year; the normal power was raised to 5000 kW on July 16, 1968, and has remained there to this day. The changes in the annual effective dose equivalent, the collective dose equivalent, the average annual dose equivalent and the maximum dose equivalent are indicated in the table and the figure. A chronological table of the activities of the reactor is appended. (T.H.)

  11. Method Points: towards a metric for method complexity

    Directory of Open Access Journals (Sweden)

    Graham McLeod

    1998-11-01

    Full Text Available A metric for method complexity is proposed as an aid to choosing between competing methods, as well as to validating the effects of method integration or the products of method engineering work. It is based upon a generic method representation model previously developed by the author and an adaptation of concepts used in the popular Function Point metric for system size. The proposed technique is illustrated by comparing two popular I.E. deliverables with counterparts in the object-oriented Unified Modeling Language (UML). The paper recommends ways to improve the practical adoption of new methods.
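The Function Point analogy in this abstract suggests a weighted-count style of computation: tally each kind of method element and sum the weighted counts. The element types, weights and counts below are invented purely for illustration; the paper's actual representation model is richer than this sketch.

```python
# Hypothetical "method points": each element type in a method's inventory
# carries a complexity weight, and the score is the weighted total.
# Both the weights and the inventories here are illustrative assumptions.

WEIGHTS = {"deliverable": 4, "technique": 3, "role": 2, "guideline": 1}

def method_points(inventory):
    """Sum weight * count over a method's element inventory."""
    return sum(WEIGHTS[kind] * n for kind, n in inventory.items())

ie_method = {"deliverable": 12, "technique": 9, "role": 4, "guideline": 20}
uml_method = {"deliverable": 9, "technique": 13, "role": 3, "guideline": 15}
print(method_points(ie_method), method_points(uml_method))
```

A comparison of two candidate methods then reduces to comparing their totals, in the same spirit as comparing Function Point counts for system size.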

  12. Reproducibility of graph metrics of human brain functional networks.

    Science.gov (United States)

    Deuker, Lorena; Bullmore, Edward T; Smith, Marie; Christensen, Soren; Nathan, Pradeep J; Rockstroh, Brigitte; Bassett, Danielle S

    2009-10-01

    Graph theory provides many metrics of complex network organization that can be applied to analysis of brain networks derived from neuroimaging data. Here we investigated the test-retest reliability of graph metrics of functional networks derived from magnetoencephalography (MEG) data recorded in two sessions from 16 healthy volunteers who were studied at rest and during performance of the n-back working memory task in each session. For each subject's data at each session, we used a wavelet filter to estimate the mutual information (MI) between each pair of MEG sensors in each of the classical frequency intervals from gamma to low delta in the overall range 1-60 Hz. Undirected binary graphs were generated by thresholding the MI matrix and 8 global network metrics were estimated: the clustering coefficient, path length, small-worldness, efficiency, cost-efficiency, assortativity, hierarchy, and synchronizability. Reliability of each graph metric was assessed using the intraclass correlation (ICC). Good reliability was demonstrated for most metrics applied to the n-back data (mean ICC=0.62). Reliability was greater for metrics in lower frequency networks. Higher frequency gamma- and beta-band networks were less reliable at a global level but demonstrated high reliability of nodal metrics in frontal and parietal regions. Performance of the n-back task was associated with greater reliability than measurements on resting state data. Task practice was also associated with greater reliability. Collectively these results suggest that graph metrics are sufficiently reliable to be considered for future longitudinal studies of functional brain network changes.
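The test-retest reliability measure used here, the intraclass correlation, can be sketched with a one-way ANOVA decomposition: subjects in rows, the two sessions in columns. The variant shown (ICC(1,1)) and the clustering-coefficient values are illustrative assumptions, not the study's computation.

```python
# One-way intraclass correlation, ICC(1,1):
#   ICC = (MSB - MSW) / (MSB + (k-1)*MSW)
# for n subjects measured in k sessions. High values mean between-subject
# variance dominates within-subject (session-to-session) variance.

def icc_oneway(data):
    """ICC(1,1) for an n-subjects x k-sessions table."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)  # between-subject mean square
    msw = sum((x - row_means[i]) ** 2
              for i, row in enumerate(data) for x in row) / (n * (k - 1))  # within-subject
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical clustering-coefficient values for 5 subjects over 2 sessions:
cc = [[0.41, 0.43], [0.35, 0.36], [0.50, 0.47], [0.29, 0.31], [0.44, 0.45]]
print(round(icc_oneway(cc), 2))
```

A value near the study's mean ICC of 0.62 would indicate good, though not perfect, agreement between sessions.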

  13. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  14. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology; this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric  and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  15. Standardized reporting of functioning information on ICF-based common metrics.

    Science.gov (United States)

    Prodinger, Birgit; Tennant, Alan; Stucki, Gerold

    2018-02-01

    In clinical practice, research and national health information systems, a variety of clinical data collection tools are used to collect information on people's functioning. Reporting on ICF-based common metrics enables standardized documentation of functioning information in national health information systems. The objective of this methodological note on applying the ICF in rehabilitation is to demonstrate how to report functioning information collected with a data collection tool on ICF-based common metrics. We first specify the requirements for the standardized reporting of functioning information. Secondly, we introduce the methods needed for transforming functioning data to ICF-based common metrics. Finally, we provide an example. The requirements for standardized reporting are as follows: 1) a common conceptual framework to enable content comparability between any health information; and 2) a measurement framework so that scores between two or more clinical data collection tools can be directly compared. The methods needed to achieve these requirements are the ICF Linking Rules and the Rasch measurement model. Using data collected incorporating the 36-item Short Form Health Survey (SF-36), the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), and the Stroke Impact Scale 3.0 (SIS 3.0), the application of standardized reporting based on common metrics is demonstrated. A subset of items from the three tools linked to common chapters of the ICF (d4 Mobility, d5 Self-care and d6 Domestic life) were entered as "super items" into the Rasch model. Good fit was achieved with no residual local dependency and a unidimensional metric. A transformation table allows for comparison between scales, and between a scale and the reporting common metric.
Being able to report functioning information collected with commonly used clinical data collection tools with ICF-based common metrics enables clinicians

  16. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are "dominating minds, distorting behaviour and determining careers" (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  17. Effective equivalence of the Einstein-Cartan and Einstein theories of gravity

    International Nuclear Information System (INIS)

    Nester, J.M.

    1977-01-01

    I prove that, for any choice of minimally coupled source field Lagrangian for the Einstein-Cartan-Sciama-Kibble theory of gravity, there exists a related minimally coupled source field Lagrangian for the Einstein theory which produces the same field equations for the metric and source field. By using a standard first-order form for source Lagrangians, the converse is also demonstrated. This establishes a one-to-one correspondence between source Lagrangians for the two theories which clearly reveals their similarities and their differences. Because of this ''equivalence,'' one can view either theory, in terms of the other, as minimal coupling for a related Minkowski source Lagrangian or as nonminimal coupling for the same Minkowski source Lagrangian. Consequently the two theories are, in this sense, indistinguishable. Some other implications of this ''equivalence'' are discussed

  18. A bridge role metric model for nodes in software networks.

    Directory of Open Access Journals (Sweden)

    Bo Li

    Full Text Available A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, rather than only the centrality used in previous metric models. Two previous metric models are described for comparison. In addition, the fitting curve between the Bre results and node degrees is well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper contributes to an accurate understanding of the module design of software systems and is expected to benefit software engineering practice.

  19. Repeated intermittent alcohol exposure during the third trimester-equivalent increases expression of the GABA(A) receptor δ subunit in cerebellar granule neurons and delays motor development in rats.

    Science.gov (United States)

    Diaz, Marvin R; Vollmer, Cyndel C; Zamudio-Bulcock, Paula A; Vollmer, William; Blomquist, Samantha L; Morton, Russell A; Everett, Julie C; Zurek, Agnieszka A; Yu, Jieying; Orser, Beverley A; Valenzuela, C Fernando

    2014-04-01

    Exposure to ethanol (EtOH) during fetal development can lead to long-lasting alterations, including deficits in fine motor skills and motor learning. Studies suggest that these are, in part, a consequence of cerebellar damage. Cerebellar granule neurons (CGNs) are the gateway of information into the cerebellar cortex. Functionally, CGNs are heavily regulated by phasic and tonic GABAergic inhibition from Golgi cell interneurons; however, the effect of EtOH exposure on the development of GABAergic transmission in immature CGNs has not been investigated. To model EtOH exposure during the 3rd trimester-equivalent of human pregnancy, neonatal pups were exposed intermittently to high levels of vaporized EtOH from postnatal day (P) 2 to P12. This exposure gradually increased pup serum EtOH concentrations (SECs) to ∼60 mM (∼0.28 g/dl) during the 4 h of exposure. EtOH levels gradually decreased to baseline 8 h after the end of exposure. Surprisingly, basal tonic and phasic GABAergic currents in CGNs were not significantly affected by postnatal alcohol exposure (PAE). However, PAE increased δ subunit expression at P28 as detected by immunohistochemical and western blot analyses. Also, electrophysiological studies with an agonist that is highly selective for δ-containing GABA(A) receptors, 4,5,6,7-tetrahydroisoxazolo[4,5-c]pyridine-3-ol (THIP), showed an increase in THIP-induced tonic current. Behavioral studies of PAE rats did not reveal any deficits in motor coordination, except for a delay in the acquisition of the mid-air righting reflex that was apparent at P15 to P18. These findings demonstrate that repeated intermittent exposure to high levels of EtOH during the equivalent of the last trimester of human pregnancy has significant but relatively subtle effects on motor coordination and GABAergic transmission in CGNs in rats. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Exposure Metrics for RF Epidemiology: Cellular Phone Handsets (invited paper)

    International Nuclear Information System (INIS)

    Balzano, Q.

    1999-01-01

    The parameters are described that characterise the exposure of the users of cellular phones. The parameters are distinguished in two classes: the human and the cell phone parameters. Among the human parameters the following are discussed: size and shape of head and neck, manner of holding the phone (left vs. right, finger tips vs. palm contact) and phone position on the face of the user. The cell phone parameters causing the largest exposure variations are: antenna geometry (size, shape, extended or retracted) and matching conditions; operating RF power level; proximity of tissue to RF currents on metal parts, channel access method (analogue, pulsed, CDMA). The large variability of the RF exposure is further expanded by the variety (ever increasing) of phone models available to users who may change service frequently or sporadically. After a brief discussion of possible dose definitions and the uncertainty of the 'user' of a cell phone for a specific call, the paper analyses the critical exposure parameters that should be investigated to characterise statistically the RF exposure of the subjects of an epidemiological study. The improved exposure assessment of the users of cellular phones requires the cooperation of network operators and equipment manufacturers. The statistics of the most critical parameters, those with variability greater than 10:1, can be collected by modifying the software and hardware of the cell phone equipment. The paper suggests base station software modifications and the introduction of cell phone 'dosemeter' devices that record some of the critical exposure parameters. A certain number of these 'dosemeters' should be distributed among subscribers to determine the statistical variations of the RF exposure from cell phones. The paper concludes by recommending a pilot dosimetric study independent from any epidemiological study. (author)

  1. Regulation and practice of workers' protection from chemical exposures during container handling

    DEFF Research Database (Denmark)

    Nørgaard Fløe Pedersen, Randi; Jepsen, Jørgen Riis; Ádám, Balázs

    2014-01-01

    Background: Fumigation of freight containers to prevent spread of pests and off-gassing of freight are sources of volatile chemicals that may constitute significant health risks when released. The aim of the study was to investigate the regulation and practice of container handling in Denmark, with focus on preventive measures to reduce the risk of chemical exposure. Methods: A comprehensive systematic search of scientific literature, legislation and recommendations related to safe work with transport containers from international and Danish regulatory bodies was performed. The practice of handling... While instructions relate to container handling, the provided information is not sufficiently detailed to conduct safe practice in many aspects. In accordance with the scientific literature, the interviewees estimate that there is a high frequency (5 to 50%) of containers with hazardous chemical exposure...

  2. Contrasting Various Metrics for Measuring Tropical Cyclone Activity

    Directory of Open Access Journals (Sweden)

    Jia-Yuh Yu; Ping-Gin Chiu

    2012-01-01

    Full Text Available Popular metrics used for measuring tropical cyclone (TC) activity, including NTC (number of tropical cyclones), TCD (tropical cyclone days), ACE (accumulated cyclone energy) and PDI (power dissipation index), along with two newly proposed indices, RACE (revised accumulated cyclone energy) and RPDI (revised power dissipation index), are compared using the JTWC (Joint Typhoon Warning Center) best-track data of TCs over the western North Pacific basin. Our study shows that, while the above metrics exhibit various degrees of discrepancy, in practical terms they are all able to produce meaningful temporal and spatial changes in response to climate variability. Compared with the conventional ACE and PDI, RACE and RPDI seem to provide a more precise estimate of total TC activity, especially in projecting the upswing trend of TC activity over the past few decades, simply because of a better approach to estimating TC wind energy. However, we would argue that there is still no need to find a "universal" or "best" metric for TC activity, because different metrics are designed to stratify different aspects of TC activity, and whether the selected metric is appropriate should be determined solely by the purpose of the study. Except for magnitude differences, the analysis results seem insensitive to the choice of best-track dataset.
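The conventional indices compared in this record have simple forms: ACE accumulates the square of the 6-hourly maximum sustained wind over records at tropical-storm strength or higher, and PDI accumulates the cube. A sketch using these standard definitions; the threshold and track values are illustrative, and the revised RACE/RPDI indices are not reproduced here:

```python
# ACE and PDI over a single hypothetical best-track wind record
# (6-hourly maximum sustained winds in knots).

def ace(winds_kt, threshold=35):
    """Accumulated cyclone energy (units of 1e4 kt^2) over 6-hourly records."""
    return sum(v ** 2 for v in winds_kt if v >= threshold) * 1e-4

def pdi(winds_kt, threshold=35):
    """Power dissipation index (kt^3), same sampling convention."""
    return sum(v ** 3 for v in winds_kt if v >= threshold)

track = [25, 35, 45, 65, 90, 75, 50, 30]  # hypothetical 6-hourly winds (kt)
print(round(ace(track), 2), pdi(track))
```

Because PDI cubes the wind, it weights the strongest portion of the track far more heavily than ACE, which is one reason the two indices can rank seasons differently.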

  3. Technical guidelines for maintaining occupational exposures as low as practicable. Phase I. Summary of current practices

    International Nuclear Information System (INIS)

    Gilchrist, R.L.; Selby, J.M.; Wedlick, H.L.

    1978-08-01

    Reducing radiation exposures to as low as practicable is a principle that was first introduced in 1949. However, the recent controversy over the low-level effects of radiation has led the Department of Energy (DOE) to review its programs. One such review was conducted by the Pacific Northwest Laboratory to survey the implementation of the ALAP principle in the DOE laboratories. This report contains the results of that survey, performed at 18 major DOE installations. The DOE contractors were asked questions concerning the following eight major areas: management, operational health physics, design, dosimetry, instrumentation, training, risk/cost-benefit, and impact of the ALAP philosophy. The survey revealed several potential areas of concern, which are described in this report. These areas will be addressed in the forthcoming manual, A Guide to Reducing Radiation Exposures As Low As Practicable.

  4. Occupational exposure assessment: Practices in Malaysian nuclear agency

    Energy Technology Data Exchange (ETDEWEB)

    Sarowi, S. Muhd, E-mail: suzie@nuclearmalaysia.gov.my; Ramli, S. A.; Kontol, K. Mohamad [Radiation Safety & Health Division, Malaysian Nuclear Agency, 43000 Kajang, Selangor (Malaysia)]; Rahman, N. A. H. Abd. [Faculty of Science & Mathematics, Sultan Idris Education University, 35900 Tanjong Malim, Perak Darul Ridzuan (Malaysia)]

    2016-01-22

    Malaysian Nuclear Agency (Nuclear Malaysia) is the leading agency in introducing and promoting the application of nuclear science and technology in Malaysia. The agency provides major nuclear facilities for research and commercialisation, such as a reactor, irradiation plants and a radioisotope production laboratory. When dealing with ionizing radiation, there is an obligatory requirement to monitor and assess the radiation exposure of workers. The personal doses of radiation workers were monitored monthly by assessing their Thermoluminescence Dosimeter (TLD) dose readings. This paper will discuss the current practice in managing, assessing, record keeping and reporting of occupational exposure in Nuclear Malaysia, including the Health Physics Group's roles and challenges. The statistics on occupational radiation exposure of monitored workers working in different fields in Nuclear Malaysia from 2011 to 2013 will also be presented. The results show that the null hypothesis (H{sub 0}) was accepted, i.e., the population means are all equal and do not differ significantly: the doses received by radiation workers in Nuclear Malaysia were similar, with no significant changes from 2011 to 2013. The radiation monitoring programme complies with the requirements of our national law, the Atomic Energy Licensing Act 1984 (Act 304)

  5. Occupational exposure assessment: Practices in Malaysian nuclear agency

    Science.gov (United States)

    Sarowi, S. Muhd; Ramli, S. A.; Kontol, K. Mohamad; Rahman, N. A. H. Abd.

    2016-01-01

    Malaysian Nuclear Agency (Nuclear Malaysia) is the leading agency in introducing and promoting the application of nuclear science and technology in Malaysia. The agency provides major nuclear facilities for research and commercialisation, such as a reactor, irradiation plants and a radioisotope production laboratory. When dealing with ionizing radiation, there is an obligatory requirement to monitor and assess the radiation exposure of workers. The personal doses of radiation workers were monitored monthly by assessing their Thermoluminescence Dosimeter (TLD) dose readings. This paper will discuss the current practice in managing, assessing, record keeping and reporting of occupational exposure in Nuclear Malaysia, including the Health Physics Group's roles and challenges. The statistics on occupational radiation exposure of monitored workers working in different fields in Nuclear Malaysia from 2011 to 2013 will also be presented. The results show that the null hypothesis (H₀) was accepted, i.e., the population means are all equal and do not differ significantly: the doses received by radiation workers in Nuclear Malaysia were similar, with no significant changes from 2011 to 2013. The radiation monitoring programme complies with the requirements of our national law, the Atomic Energy Licensing Act 1984 (Act 304).

  6. Occupational exposure assessment: Practices in Malaysian nuclear agency

    International Nuclear Information System (INIS)

    Sarowi, S. Muhd; Ramli, S. A.; Kontol, K. Mohamad; Rahman, N. A. H. Abd.

    2016-01-01

    Malaysian Nuclear Agency (Nuclear Malaysia) is the leading agency in introducing and promoting the application of nuclear science and technology in Malaysia. The agency provides major nuclear facilities for research and commercialisation, such as a reactor, irradiation plants and a radioisotope production laboratory. When dealing with ionizing radiation, there is an obligatory requirement to monitor and assess the radiation exposure of workers. The personal doses of radiation workers were monitored monthly by assessing their Thermoluminescence Dosimeter (TLD) dose readings. This paper will discuss the current practice in managing, assessing, record keeping and reporting of occupational exposure in Nuclear Malaysia, including the Health Physics Group's roles and challenges. The statistics on occupational radiation exposure of monitored workers working in different fields in Nuclear Malaysia from 2011 to 2013 will also be presented. The results show that the null hypothesis (H 0 ) was accepted, i.e., the population means are all equal and do not differ significantly: the doses received by radiation workers in Nuclear Malaysia were similar, with no significant changes from 2011 to 2013. The radiation monitoring programme complies with the requirements of our national law, the Atomic Energy Licensing Act 1984 (Act 304)

  7. Lead Equivalent Thickness Measurement for Mixed Compositions of Barium Plaster Block

    International Nuclear Information System (INIS)

    Norriza Mohd Isa; Muhammad Jamal Muhammad Isa; Nur Shahriza Zainuddin; Mohd Khairusalih Md Zin; Shahrul Azlan Azizan

    2016-01-01

    Measurement of the lead equivalent thickness of an ionizing radiation exposure room wall shall be performed as stated in Malaysian Standard MS 838. A number of sample blocks with different barium plaster mixture compositions and thicknesses, intended as shielding material for the exposure room walls of a local company, were tested using Cs-137, Co-60 and Am-241 sources of different activities. Radiation transmitted through the samples was detected with a calibrated survey meter, with a distance of about 40 cm between the radiation source and the detector. Lead uniformity of the samples was also determined at three labelled points on each sample. Lead equivalent thicknesses for the samples were evaluated from a calibration graph plotted using lead sheets and the same radiation sources. Results showed that samples of the same physical thickness yield different lead equivalent thicknesses for different sources. (author)
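The calibration-graph step in this record can be illustrated as interpolation: transmission readings through lead sheets of known thickness define the curve, and a sample's reading is converted to a lead-equivalent thickness by interpolating along it. The readings below and the choice of log-linear interpolation are assumptions for illustration, not the tested procedure:

```python
# Convert a normalized transmission reading into a lead-equivalent thickness
# by interpolating between calibration points, linearly in log(reading)
# because attenuation is roughly exponential in thickness.
import math

# Hypothetical calibration pairs: (lead thickness in mm, normalized reading)
calibration = [(0.0, 1.00), (0.5, 0.62), (1.0, 0.38), (1.5, 0.24), (2.0, 0.15)]

def lead_equivalent(reading):
    """Interpolate thickness in log(reading) between bracketing points."""
    pts = sorted(calibration, key=lambda p: -p[1])  # decreasing transmission
    for (t0, r0), (t1, r1) in zip(pts, pts[1:]):
        if r1 <= reading <= r0:
            f = (math.log(reading) - math.log(r0)) / (math.log(r1) - math.log(r0))
            return t0 + f * (t1 - t0)
    raise ValueError("reading outside calibration range")

print(round(lead_equivalent(0.38), 2))  # reading measured through a sample block
```

Because the calibration curve differs per source energy, the same physical sample yields different lead-equivalent values under Cs-137, Co-60 and Am-241, as the record reports.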

  8. Integrable topological billiards and equivalent dynamical systems

    Science.gov (United States)

    Vedyushkina, V. V.; Fomenko, A. T.

    2017-08-01

    We consider several topological integrable billiards and prove that they are Liouville equivalent to many systems of rigid body dynamics. The proof uses the Fomenko-Zieschang theory of invariants of integrable systems. We study billiards bounded by arcs of confocal quadrics and their generalizations, generalized billiards, where the motion occurs on a locally planar surface obtained by gluing several planar domains isometrically along their boundaries, which are arcs of confocal quadrics. We describe two new classes of integrable billiards bounded by arcs of confocal quadrics, namely, non-compact billiards and generalized billiards obtained by gluing planar billiards along non-convex parts of their boundaries. We completely classify non-compact billiards bounded by arcs of confocal quadrics and study their topology using the Fomenko invariants that describe the bifurcations of singular leaves of the additional integral. We study the topology of isoenergy surfaces for some non-convex generalized billiards. It turns out that they possess exotic Liouville foliations: the integral trajectories of the billiard that lie on some singular leaves admit no continuous extension. Such billiards appear to be leafwise equivalent to billiards bounded by arcs of confocal quadrics in the Minkowski metric.

  9. Computer program for diagnostic X-ray exposure conversion

    International Nuclear Information System (INIS)

    Lewis, S.L.

    1984-01-01

    Presented is a computer program designed to convert any given set of diagnostic X-ray exposure factors sequentially into another, yielding either an equivalent photographic density or one increased or decreased by a specifiable proportion. In addition to the means to manipulate a set of exposure factors, the program includes the facility to print hard (paper) copy, enabling the results to be pasted into a notebook and used at any time. The program was originally written as an exercise in investigating the potential use of computers for practical radiographic purposes as conventionally encountered. At the same time, its possible use as an educational tool was borne in mind. To these ends, the current version of this program may be used as a means whereby exposure factors used in a diagnostic department may be altered to suit a particular requirement, or may be used in the school as a mathematical model to describe the behaviour of exposure factors under manipulation without patient exposure. (author)
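The kind of conversion such a program performs can be sketched with common radiographic rules of thumb rather than the program's actual algorithm: density scales linearly with mAs, follows the inverse-square law with distance, and the "15% kV rule" treats a 15% increase in kV as equivalent to doubling mAs. All three rules are stated assumptions of this sketch:

```python
# Equivalent-mAs calculation after a change of kV and focus-film distance,
# using textbook rules of thumb (reciprocity, inverse square, 15% kV rule).
import math

def equivalent_mas(mas, kv_old, kv_new, d_old_cm, d_new_cm, density_factor=1.0):
    """mAs giving the same (or proportionally scaled) density after the change."""
    kv_term = 2.0 ** (-math.log(kv_new / kv_old) / math.log(1.15))  # 15% kV rule
    dist_term = (d_new_cm / d_old_cm) ** 2                          # inverse square law
    return mas * kv_term * dist_term * density_factor

# Same density after raising kV from 70 to 80.5 (a 15% step) at a fixed 100 cm:
print(round(equivalent_mas(20, 70, 80.5, 100, 100), 2))
```

Setting `density_factor` to, say, 1.3 scales the target density by 30%, mirroring the program's option to increase or decrease density by a specifiable proportion.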

  10. A practical method to evaluate radiofrequency exposure of mast workers

    International Nuclear Information System (INIS)

    Alanko, T.; Hietanen, M.

    2008-01-01

    Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast. (authors)
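The method pairs two logs: the dosemeter gives (time, exposure) samples and the barometer gives (time, pressure) samples, which the standard-atmosphere formula converts to height above the mast base. A sketch of that pairing, with hypothetical readings; the authors' actual processing is not described in this abstract:

```python
# Pair RF-exposure samples with heights derived from logged pressure,
# matching each dose sample to the nearest-in-time pressure sample.

def height_m(p_hpa, p0_hpa):
    """Height above the reference pressure, international barometric formula."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

def pair_by_time(dose_log, baro_log, p0_hpa):
    """For each (t, dose) sample, attach the height from the closest pressure sample."""
    paired = []
    for t, dose in dose_log:
        _, p = min(baro_log, key=lambda s: abs(s[0] - t))
        paired.append((t, dose, round(height_m(p, p0_hpa), 1)))
    return paired

dose_log = [(0, 0.2), (60, 1.5), (120, 3.1)]           # (seconds, relative field reading)
baro_log = [(0, 1013.2), (58, 1008.4), (121, 1003.6)]  # (seconds, hPa)
for row in pair_by_time(dose_log, baro_log, p0_hpa=1013.2):
    print(row)
```

The reference pressure `p0_hpa` would be logged at the mast base before the climb, so heights come out relative to the base rather than sea level.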

  11. An examination of knowledge, attitudes and practices related to lead exposure in South Western Nigeria

    Directory of Open Access Journals (Sweden)

    Sridhar Mynepalli KC

    2006-03-01

    Full Text Available Abstract Background: Lead is a highly toxic and pervasive metal. Chronic exposure to low levels is responsible for significant health effects, particularly in children. Prevention remains the best option for reducing childhood lead exposure; however, the knowledge, attitudes and practices related to lead exposure in many developing countries are not known. Methods: We conducted four focus group discussions (FGDs) to evaluate knowledge, attitudes and practices related to lead exposure in Nigeria. An FGD guide was developed from the literature and preliminary discussion with members of the public. Participants in the FGDs were randomly selected from adults living in Ibadan, South Western Nigeria in 2004. Results: We found that there was limited awareness of the sources of lead exposure in the domestic environment, and participants had little knowledge of the health effects of chronic low-dose lead exposure. Conclusion: We conclude that the findings of this study should be used, in conjunction with others, to develop appropriate health education interventions for lead exposure in the domestic environment.

  12. Development and validation of trauma surgical skills metrics: Preliminary assessment of performance after training.

    Science.gov (United States)

    Shackelford, Stacy; Garofalo, Evan; Shalin, Valerie; Pugh, Kristy; Chen, Hegang; Pasley, Jason; Sarani, Babak; Henry, Sharon; Bowyer, Mark; Mackenzie, Colin F

    2015-07-01

    Maintaining trauma-specific surgical skills is an ongoing challenge for surgical training programs. An objective assessment of surgical skills is needed. We hypothesized that a validated surgical performance assessment tool could detect differences following a training intervention. We developed surgical performance assessment metrics based on discussion with expert trauma surgeons, video review of 10 experts and 10 novice surgeons performing three vascular exposure procedures and lower extremity fasciotomy on cadavers, and validated the metrics with interrater reliability testing by five reviewers blinded to level of expertise and a consensus conference. We tested these performance metrics in 12 surgical residents (Year 3-7) before and 2 weeks after vascular exposure skills training in the Advanced Surgical Skills for Exposure in Trauma (ASSET) course. Performance was assessed in three areas as follows: knowledge (anatomic, management), procedure steps, and technical skills. Time to completion of procedures was recorded, and these metrics were combined into a single performance score, the Trauma Readiness Index (TRI). Wilcoxon matched-pairs signed-ranks test compared pretraining/posttraining effects. Mean time to complete procedures decreased by 4.3 minutes (from 13.4 minutes to 9.1 minutes). The performance component most improved by the 1-day skills training was procedure steps, completion of which increased by 21%. Technical skill scores improved by 12%. Overall knowledge improved by 3%, with 18% improvement in anatomic knowledge. TRI increased significantly from 50% to 64% with ASSET training. Interrater reliability of the surgical performance assessment metrics was validated with single intraclass correlation coefficient of 0.7 to 0.98. A trauma-relevant surgical performance assessment detected improvements in specific procedure steps and anatomic knowledge taught during a 1-day course, quantified by the TRI. 
ASSET training reduced time to complete vascular

  13. Properties of C-metric spaces

    Science.gov (United States)

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real-valued function defined on X × X that satisfies only some of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric, which is a function satisfying only two metric axioms: symmetry and the triangle inequality. The remarkable fact about a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is the Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.
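The two retained axioms (symmetry and the triangle inequality) can be checked mechanically on a finite set. A small illustrative sketch, not from the paper; it also shows why a C-metric need not be a metric:

```python
from itertools import product

def is_c_metric(points, d):
    """Check, on a finite set, the two C-metric axioms from the abstract:
    symmetry d(x, y) == d(y, x) and the triangle inequality
    d(x, z) <= d(x, y) + d(y, z). Non-negativity and identity of
    indiscernibles are deliberately NOT required."""
    for x, y in product(points, repeat=2):
        if d(x, y) != d(y, x):
            return False
    for x, y, z in product(points, repeat=3):
        if d(x, z) > d(x, y) + d(y, z):
            return False
    return True

# The constant map d(x, y) = 1 is a C-metric but not a metric:
# it is symmetric and 1 <= 1 + 1, yet d(x, x) = 1 != 0.
```

Every genuine metric is in particular a C-metric, so the check accepts ordinary metrics as well.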

  14. Learning Low-Dimensional Metrics

    OpenAIRE

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

    This paper investigates the theoretical foundations of metric learning, focused on four key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...
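A low-rank metric of the kind studied here is typically parameterized as a Mahalanobis distance d(x, y) = ||L(x − y)||, with M = LᵀL of rank r ≪ p. The sketch below shows only this parameterization, not the authors' learning algorithm; the function name is mine.

```python
import numpy as np

def low_rank_mahalanobis(L):
    """Return the distance d(x, y) = ||L (x - y)||_2 induced by the
    rank-r matrix M = L^T L, where L has shape (r, p) with r << p.
    Learning the metric means fitting the entries of L from data."""
    def d(x, y):
        z = np.asarray(x, float) - np.asarray(y, float)
        return float(np.linalg.norm(L @ z))
    return d

# A rank-1 metric in 3 dimensions: only the first coordinate matters.
L = np.array([[1.0, 0.0, 0.0]])
d = low_rank_mahalanobis(L)
```

Directions outside the row space of L are invisible to the learned distance, which is exactly what makes the low-rank structure a dimensionality-reduction device.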

  15. Microdosimetric basis for exposure limits

    International Nuclear Information System (INIS)

    Brackenbush, L.W.; Braby, L.A.

    1986-10-01

    The new organ-weighted effective dose equivalents should provide a much more accurate estimation of the degree of hazard for a worker's exposure to ionizing radiations. The method involves the microdosimetric concept of lineal energy to help establish exposure limits and will provide a unified system applicable to all types of ionizing radiation. Rather than being only calculated values, the effective dose equivalents and quality factors will be experimentally measured using tissue equivalent proportional counters. The measurement may be difficult to perform at various depths in an anthropomorphic phantom. Operational health physicists will be concerned about the lack of survey instruments and personnel dosimeters that measure lineal energy distributions. Their possible objections may be mitigated by the commercial introduction of instruments based upon tissue equivalent proportional counters or related devices containing inexpensive microprocessors. The many potential benefits include providing a uniform method for implementing the proposed increases in quality factors for neutrons and photons, providing a more unified approach for combining external and internal exposures, and potentially resolving questions about dosimeter placement and dose assessment for nonuniform exposures to mixed radiations. 16 refs., 3 figs

  16. Application of maximum radiation exposure values and monitoring of radiation exposure

    International Nuclear Information System (INIS)

    1996-01-01

    The guide presents the principles to be applied in calculating the equivalent dose and the effective dose, instructions on application of the maximum values for radiation exposure, and instruction on monitoring of radiation exposure. In addition, the measurable quantities to be used in monitoring the radiation exposure are presented. (2 refs.)

  17. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  18. Scalar-metric and scalar-metric-torsion gravitational theories

    International Nuclear Information System (INIS)

    Aldersley, S.J.

    1977-01-01

    The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory

  19. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.

  20. Doses from radiation exposure

    CERN Document Server

    Menzel, H G

    2012-01-01

    Practical implementation of the International Commission on Radiological Protection's (ICRP) system of protection requires the availability of appropriate methods and data. The work of Committee 2 is concerned with the development of reference data and methods for the assessment of internal and external radiation exposure of workers and members of the public. This involves the development of reference biokinetic and dosimetric models, reference anatomical models of the human body, and reference anatomical and physiological data. Following ICRP's 2007 Recommendations, Committee 2 has focused on the provision of new reference dose coefficients for external and internal exposure. As well as specifying changes to the radiation and tissue weighting factors used in the calculation of protection quantities, the 2007 Recommendations introduced the use of reference anatomical phantoms based on medical imaging data, requiring explicit sex averaging of male and female organ-equivalent doses in the calculation of effecti...

  1. Evaluation of 1 cm dose equivalent rate using a NaI(Tl) scintillation spectrometer

    International Nuclear Information System (INIS)

    Matsuda, Hideharu

    1990-01-01

    A method is described for evaluating 1 cm dose equivalent rates from a pulse height distribution obtained with a 76.2 mm φ spherical NaI(Tl) scintillation spectrometer. Weak leakage radiation from nuclear facilities was also measured, and the dose equivalent conversion factor and effective energy of the leakage radiation were evaluated from the 1 cm dose equivalent rate and the exposure rate. (author)

  2. Construction of computational program of aging in insulating materials for searching reversed sequential test conditions to give damage equivalent to simultaneous exposure of heat and radiation

    International Nuclear Information System (INIS)

    Fuse, Norikazu; Homma, Hiroya; Okamoto, Tatsuki

    2013-01-01

    Two consecutive numerical calculations on the degradation of polymeric insulations under a thermal and radiation environment are carried out to simulate the so-called reversed sequential acceleration test. The aim of the calculation is to search for testing conditions which produce material damage equivalent to the case of simultaneous exposure to heat and radiation. At least the following four parameters need to be considered in the sequential method: dose rate and exposure time in radiation, as well as temperature and aging time in heating. The present paper discusses the handling of these parameters and shows some trial calculation results. (author)

  3. Environmental cost of using poor decision metrics to prioritize environmental projects.

    Science.gov (United States)

    Pannell, David J; Gibson, Fiona L

    2016-04-01

    Conservation decision makers commonly use project-scoring metrics that are inconsistent with theory on optimal ranking of projects. As a result, there may often be a loss of environmental benefits. We estimated the magnitudes of these losses for various metrics that deviate from theory in ways that are common in practice. These metrics included cases where relevant variables were omitted from the benefits metric, project costs were omitted, and benefits were calculated using a faulty functional form. We estimated distributions of parameters from 129 environmental projects from Australia, New Zealand, and Italy for which detailed analyses had been completed previously. The cost of using poor prioritization metrics (in terms of lost environmental values) was often high, up to 80% in the scenarios we examined. The cost in percentage terms was greater when the budget was smaller. The most costly errors were omitting information about environmental values (up to 31% loss of environmental values), omitting project costs (up to 35% loss), omitting the effectiveness of management actions (up to 9% loss), and using a weighted-additive decision metric for variables that should be multiplied (up to 23% loss). The latter three are errors that occur commonly in real-world decision metrics, in combination often reducing potential benefits from conservation investments by 30-50%. Uncertainty about parameter values also reduced the benefits from investments in conservation projects, but often not by as much as faulty prioritization metrics. © 2016 Society for Conservation Biology.
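One of the errors the abstract quantifies, omitting project costs from the ranking metric, can be seen in a toy budget exercise. The numbers below are invented for illustration and are not from the study:

```python
def select_projects(projects, budget, score):
    """Greedily fund projects in decreasing score order until the
    budget is exhausted; return the total environmental benefit."""
    total_benefit, spent = 0.0, 0.0
    for p in sorted(projects, key=score, reverse=True):
        if spent + p["cost"] <= budget:
            spent += p["cost"]
            total_benefit += p["benefit"]
    return total_benefit

# Hypothetical projects (benefit and cost in arbitrary units).
projects = [
    {"benefit": 10.0, "cost": 10.0},  # benefit/cost ratio 1.0
    {"benefit": 6.0,  "cost": 2.0},   # ratio 3.0
    {"benefit": 5.0,  "cost": 2.0},   # ratio 2.5
]
budget = 10.0
by_ratio   = select_projects(projects, budget, lambda p: p["benefit"] / p["cost"])
cost_blind = select_projects(projects, budget, lambda p: p["benefit"])
```

Ranking by benefit/cost funds the two cheap, efficient projects (benefit 11), while the cost-blind ranking spends the whole budget on the single largest project (benefit 10), illustrating how an omitted variable degrades the portfolio.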

  4. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  5. Technical background for shallow (skin) dose equivalent evaluations

    International Nuclear Information System (INIS)

    Ashley, J.C.; Turner, J.E.; Crawford, O.H.; Hamm, R.N.; Reaves, K.L.; McMahan, K.L.

    1991-01-01

    Department of Energy Order 5480.11 describes procedures for radiation protection for occupational workers. The revisions dealing with non-uniform exposure to the skin are the subject of this report. We describe measurements and analysis required to assess shallow (skin) dose equivalent from skin contamination. 6 refs., 4 tabs

  6. Characteristics of peaks of inhalation exposure to organic solvents

    NARCIS (Netherlands)

    Preller, L.; Burstyn, I.; Pater, N. de; Kromhout, H.

    2004-01-01

    Objectives: To determine which exposure metrics are sufficient to characterize 'peak' inhalation exposure to organic solvents (OS) during spraying operations. Methods: Personal exposure measurements (n = 27; duration 5-159 min) were collected during application of paints, primers, resins and glues

  7. The Application of Equivalence Theory to Advertising Translation

    Institute of Scientific and Technical Information of China (English)

    张颖

    2017-01-01

    Through analyzing equivalence theory, the author tries to find a solution to the problems arising in the process of advertising translation. These problems include cultural diversity, language diversity and the special requirements of advertising. The author argues that Nida's functional equivalence is one of the most appropriate theories for dealing with these problems. In this paper, the author introduces the principles of advertising translation and cultural divergences in advertising translation, and then gives some advertising translation practices to explain and analyze how to create good advertising translations by using functional equivalence. Finally, the author introduces some strategies in advertising translation.

  8. Exposure control practices for administering nitrous oxide: A survey of dentists, dental hygienists, and dental assistants.

    Science.gov (United States)

    Boiano, James M; Steege, Andrea L; Sweeney, Marie H

    2017-06-01

    Engineering, administrative, and work practice controls have been recommended for many years to minimize exposure to nitrous oxide during dental procedures. To better understand the extent to which these exposure controls are used, the NIOSH Health and Safety Practices Survey of Healthcare Workers was conducted among members of professional practice organizations representing dentists, dental hygienists and dental assistants. The anonymous, modular, web-based survey was completed by 284 dental professionals in private practice who administered nitrous oxide to adult and/or pediatric patients in the seven days prior to the survey. Use of primary engineering controls (i.e., nasal scavenging mask and/or local exhaust ventilation (LEV) near the patient's mouth) was nearly universal, reported by 93% and 96% of respondents who administered to adult (A) and pediatric (P) patients, respectively. However, adherence to other recommended precautionary practices was lacking to varying degrees, and was essentially no different among those administering nitrous oxide to adult or pediatric patients. Examples of work practices which increase exposure risk, expressed as percent of respondents, included: not checking nitrous oxide equipment for leaks (41% A; 48% P); starting nitrous oxide gas flow before the delivery mask or airway mask was applied to the patient (13% A; 12% P); and not turning off nitrous oxide gas flow before turning off oxygen flow to the patient (8% A; 7% P). Absence of standard procedures to minimize worker exposure to nitrous oxide (13% of all respondents) and not being trained on safe handling and administration of nitrous oxide (3%) were examples of breaches of administrative controls which may also increase exposure risk.
Successful management of nitrous oxide emissions should include properly fitted nasal scavenging masks, supplemental LEV (when nitrous oxide levels cannot be adequately controlled using nasal masks alone), adequate general ventilation, regular

  9. Assaults by Mentally Disordered Offenders in Prison: Equity and Equivalence.

    Science.gov (United States)

    Hales, Heidi; Dixon, Amy; Newton, Zoe; Bartlett, Annie

    2016-06-01

    Managing the violent behaviour of mentally disordered offenders (MDO) is challenging in all jurisdictions. We describe the ethical framework and practical management of MDOs in England and Wales in the context of the move to equivalence of healthcare between hospital and prison. We consider the similarities and differences between prison and hospital management of the violent and challenging behaviours of MDOs. We argue that both types of institution can learn from each other and that equivalence of care should extend to equivalence of criminal proceedings in court and prisons for MDOs. We argue that any adjudication process in prison for MDOs is enhanced by the relevant involvement of mental health professionals and the articulation of the ethical principles underpinning health and criminal justice practices.

  10. Relativity and equivalence principles in the gauge theory of gravitation

    International Nuclear Information System (INIS)

    Ivanenko, D.; Sardanashvili, G.

    1981-01-01

    The roles of the relativity principle (RP) and the equivalence principle (EP) in the gauge theory of gravity are shown. In the formalism of laminations, RP in gravitational theory can be formulated as the requirement of covariance of equations with respect to the GL+(4, R)(X) gauge group. In this case RP turns out to be identical to the gauge principle in the gauge theory of a group of external symmetries, and the gravitational theory can be constructed directly as a gauge theory. In general relativity the equivalence principle supplements RP and serves to describe the transition to special relativity in some reference frame. The approach described takes into account that in gauge theory, besides gauge fields, Goldstone and Higgs fields can also arise under conditions of spontaneous symmetry breaking; the gravitational metric field is related to these, which is the consequence of taking RP into account in the gauge theory of gravitation.

  11. Observable traces of non-metricity: New constraints on metric-affine gravity

    Science.gov (United States)

    Delhom-Latorre, Adrià; Olmo, Gonzalo J.; Ronco, Michele

    2018-05-01

    Relaxing the Riemannian condition to incorporate geometric quantities such as torsion and non-metricity may make it possible to explore new physics associated with defects in a hypothetical space-time microstructure. Here we show that non-metricity produces observable effects in quantum fields in the form of 4-fermion contact interactions, thereby allowing us to constrain the scale of non-metricity to be greater than 1 TeV by using results on Bhabha scattering. Our analysis is carried out in the framework of a wide class of theories of gravity in the metric-affine approach. The bound obtained represents an improvement of several orders of magnitude over previous experimental constraints.

  12. Fourteenth annual report radiation exposures for DOE and DOE contractor employees - 1981

    International Nuclear Information System (INIS)

    1983-03-01

    All Department of Energy (DOE) and DOE contractors are required by DOE Order 5484.1, Chapter IV, to submit occupational exposure records to a central repository. The data required include a summary of whole-body exposures to ionizing radiation, a summary of internal depositions of radioactive materials above specified limits, and occupational exposure reports for terminating employees. This report is a summary of the data submitted by DOE and DOE contractors for 1981. A total of 82,873 DOE and DOE contractor employees were monitored for whole-body ionizing radiation exposures in 1981. In addition to the employees, 84,343 visitors were monitored. Of all employees monitored, 54.43% received a dose equivalent that was less than measurable, 44.04% a measurable exposure less than 1 rem, and 1.53% an exposure greater than 1 rem. The exposure received by 88.14% of the visitors to DOE facilities was less than measurable; 11.85% of the visitors received a measurable exposure less than 1 rem, and none received an exposure greater than 1 rem. No employees or visitors received a dose equivalent greater than 5 rem. The collective dose equivalent for DOE and DOE contractor employees was 6,902 person-rem; for visitors it was 579 person-rem; the total for employees and visitors combined was 7,481 person-rem. The average dose equivalent for all individuals (employees and visitors) monitored was 45 mrem, and the average dose equivalent for all individuals who received a measurable exposure was 157 mrem. The highest average dose equivalent was observed among employees monitored at fuel processing facilities (342 mrem) and the lowest among visitors (7 mrem) to DOE facilities. These averages are significantly less than the DOE 5-rem/year radiation protection standard for whole-body exposures.
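The summary statistics in reports of this kind reduce to simple arithmetic over individual dose records: a collective dose (person-rem), an average over everyone monitored, and an average over only those with a measurable dose. A minimal sketch with invented numbers; the detection limit is an assumption, not a value from the report:

```python
def dose_summary(doses_rem, detection_limit_rem=0.001):
    """Return (collective dose in person-rem, average over all monitored
    individuals, average over those with a measurable dose)."""
    measurable = [d for d in doses_rem if d >= detection_limit_rem]
    collective = sum(doses_rem)
    avg_all = collective / len(doses_rem)
    avg_measurable = sum(measurable) / len(measurable) if measurable else 0.0
    return collective, avg_all, avg_measurable
```

The gap between the two averages (45 mrem vs. 157 mrem in the report) reflects the large fraction of monitored individuals whose dose was below the detection limit.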

  13. Measurement of cardiopulmonary performance during acute exposure to a 2440-m equivalent atmosphere

    Science.gov (United States)

    Levitan, B. M.; Bungo, M. W.

    1982-01-01

    Each of 20 subjects (ranging in age from 18 to 38 years, 15 male, five female) was given two Bruce Protocol symptom-limited maximum treadmill stress tests, breathing sea-level compressed air (20.9% O2) for one test and a 2440-m equivalent (15.5% O2) for the other. A significant difference was found to exist between measured VO2 max (p less than 0.0002) and exercise time (p less than 0.0004) for the two conditions. No significant differences were observed in heart rate or the recovery time to a respiratory quotient of less than 1. Hemoglobin saturation, as measured by an ear oximeter, averaged 95% for the sea-level and 91% for the 2440-m equivalent gases. These results support a 2440-m equivalent contingency atmosphere in the Space Shuttle prior to donning a low-pressure suit for the purpose of reducing nitrogen washout times.

  14. Empirical Information Metrics for Prediction Power and Experiment Planning

    Directory of Open Access Journals (Sweden)

    Christopher Lee

    2011-01-01

    Full Text Available In principle, information theory could provide useful metrics for statistical inference. In practice this is impeded by divergent assumptions: information theory assumes the joint distribution of variables of interest is known, whereas in statistical inference it is hidden and is the goal of inference. To integrate these approaches we note a common theme they share, namely the measurement of prediction power. We generalize this concept as an information metric, subject to several requirements: calculation of the metric must be objective or model-free; unbiased; convergent; probabilistically bounded; and low in computational complexity. Unfortunately, widely used model selection metrics such as Maximum Likelihood, the Akaike Information Criterion and the Bayesian Information Criterion do not necessarily meet all these requirements. We define four distinct empirical information metrics measured via sampling, with explicit Law of Large Numbers convergence guarantees, which meet these requirements: Ie, the empirical information, a measure of average prediction power; Ib, the overfitting bias information, which measures selection bias in the modeling procedure; Ip, the potential information, which measures the total remaining information in the observations not yet discovered by the model; and Im, the model information, which measures the model's extrapolation prediction power. Finally, we show that Ip + Ie, Ip + Im, and Ie − Im are fixed constants for a given observed dataset (i.e., the prediction target), independent of the model, and thus represent a fundamental subdivision of the total information contained in the observations. We discuss the application of these metrics to modeling and experiment planning.

  15. Control of the individual exposure in the practice of a nuclear medicine department

    International Nuclear Information System (INIS)

    Hernandez, J.M.; Castro Crespo, D.; Naranjo Cardentey, O.

    1996-01-01

    In Cuba, since the beginning of the 1950s, radioisotopes and radiopharmaceuticals have been used for medical purposes to diagnose different diseases. At present, about 21 nuclear medicine modules provide this type of medical assistance service. The use of these substances has increased notably over the years, making it an important source of exposure to ionizing radiation. We were therefore interested in the behaviour of the distribution of equivalent doses during the procedures performed in one such typical nuclear medicine module.

  16. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, this book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed, and vital technical lemmas are proved to aid understanding. Graduate students and researchers in geometry, topology and dynamics of foliations and laminations will find this supplement useful, as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along a foliation with at least one compact leaf in two dimensions.

  17. Influence of exposure assessment and parameterization on exposure response. Aspects of epidemiologic cohort analysis using the Libby Amphibole asbestos worker cohort.

    Science.gov (United States)

    Bateson, Thomas F; Kopylev, Leonid

    2015-01-01

    Recent meta-analyses of occupational epidemiology studies identified two important exposure data quality factors in predicting summary effect measures for asbestos-associated lung cancer mortality risk: sufficiency of job history data and percent coverage of work history by measured exposures. The objective was to evaluate different exposure parameterizations suggested in the asbestos literature using the Libby, MT asbestos worker cohort and to evaluate influences of exposure measurement error caused by historically estimated exposure data on lung cancer risks. Focusing on workers hired after 1959, when job histories were well-known and occupational exposures were predominantly based on measured exposures (85% coverage), we found that cumulative exposure alone, and with allowance of exponential decay, fit lung cancer mortality data similarly. Residence-time-weighted metrics did not fit well. Compared with previous analyses based on the whole cohort of Libby workers hired after 1935, when job histories were less well-known and exposures less frequently measured (47% coverage), our analyses based on higher quality exposure data yielded an effect size as much as 3.6 times higher. Future occupational cohort studies should continue to refine retrospective exposure assessment methods, consider multiple exposure metrics, and explore new methods of maintaining statistical power while minimizing exposure measurement error.

  18. EFFDOS - a FORTRAN-77-code for the calculation of the effective dose equivalent

    International Nuclear Information System (INIS)

    Baer, M.; Honcu, S.; Huebschmann, W.

    1984-01-01

    The FORTRAN-77 code EFFDOS calculates the effective dose equivalent according to ICRP 26 due to the long-term emission of radionuclides into the atmosphere for the following exposure pathways: inhalation, ingestion, γ-ground irradiation (γ-irradiation by radionuclides deposited on the ground) and β- or γ-submersion (irradiation by the passing radioactive cloud). For calculating the effective dose equivalent at a single spot it is necessary to input the diffusion factor and, if needed, the washout factor; otherwise EFFDOS calculates the input data for the computer codes ISOLA III and WOLGA-1, which then compute the atmospheric diffusion, ground deposition and local dose equivalent distribution for the requested exposure pathway. Atmospheric diffusion, deposition and radionuclide transfer are calculated according to the "Allgemeine Berechnungsgrundlage ...." recommended by the German Federal Ministry of the Interior. A sample calculation is added. (orig.)
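The quantity EFFDOS computes, the ICRP 26 effective dose equivalent, is a tissue-weighted sum H_E = Σ_T w_T H_T, where each organ dose equivalent H_T is itself summed over the exposure pathways. The sketch below shows only that combination step with the published ICRP 26 weighting factors; the pathway doses and function name are illustrative, not EFFDOS internals.

```python
# ICRP 26 tissue weighting factors (they sum to 1.0).
W_T = {
    "gonads": 0.25, "breast": 0.15, "red_marrow": 0.12, "lung": 0.12,
    "thyroid": 0.03, "bone_surface": 0.03, "remainder": 0.30,
}

def effective_dose_equivalent(organ_doses_by_pathway):
    """H_E = sum over tissues T of w_T * H_T, where H_T is the organ
    dose equivalent summed over exposure pathways (inhalation,
    ingestion, ground irradiation, submersion)."""
    h_t = {}
    for pathway_doses in organ_doses_by_pathway.values():
        for organ, dose in pathway_doses.items():
            h_t[organ] = h_t.get(organ, 0.0) + dose
    return sum(W_T[organ] * dose for organ, dose in h_t.items())
```

Because the weights sum to one, a uniform whole-body dose gives H_E equal to that dose, which is a convenient sanity check on any implementation.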

  19. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management and, as such, is considered a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics that estimate and measure the effectiveness of fault management are, like those of classical control loops, divided into two major classes: state estimation and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these control loops in preserving the relevant system goals that they are intended to protect.
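    As one hedged reading of how such per-loop metrics might combine (the multiplicative chain and all numbers below are illustrative assumptions, not the paper's formulation): a failure is mitigated only if it is detected, isolated, the right response is determined, and the response succeeds; per-mode results are then summed, weighted by failure probability.

```python
# Illustrative sketch: combining fault-management metrics per failure mode.
# Each mode: probability of occurrence plus conditional success
# probabilities for detection, isolation, response determination, and
# response execution. All numbers are invented.
modes = [
    # (p_failure, p_detect, p_isolate, p_determine, p_respond)
    (0.010, 0.99, 0.95, 0.98, 0.97),
    (0.002, 0.90, 0.80, 0.95, 0.90),
]

def loop_effectiveness(p_detect, p_isolate, p_determine, p_respond):
    """A failure is mitigated only if every stage of the loop succeeds."""
    return p_detect * p_isolate * p_determine * p_respond

def unmitigated_failure_probability(modes):
    """Probability-weighted sum of unmitigated failures across modes."""
    return sum(p_f * (1.0 - loop_effectiveness(*rest)) for p_f, *rest in modes)
```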

  20. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.
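    For context, a dislocated metric can be stated as follows (a standard formulation; the closing example is ours, not from the paper):

```latex
% A dislocated metric (d-metric) on a set $X$ is a map
% $d \colon X \times X \to [0,\infty)$ satisfying, for all $x, y, z \in X$:
\begin{align*}
  &\text{(1)}\quad d(x,y) = 0 \implies x = y,\\
  &\text{(2)}\quad d(x,y) = d(y,x),\\
  &\text{(3)}\quad d(x,y) \le d(x,z) + d(z,y).
\end{align*}
% Unlike a metric, $d(x,x)=0$ is not required: $d(x,y) = \max(x,y)$ on
% $[0,\infty)$ is a d-metric that is not a metric, since $d(1,1) = 1$.
```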

  1. Is the dose equivalent index a quantity to be measured

    International Nuclear Information System (INIS)

    Wagner, S.R.

    1980-01-01

    ICRP introduced the concept of the Effective Dose Equivalent H_E and fixed the basic limits of radiation exposure in terms of it. As H_E cannot be measured, ICRP stated that, with external exposure to penetrating radiation, limitation of the Dose Equivalent Index H_I would afford at least as good a level of protection. However, difficulties arise in measuring H_I and in calibrating instruments in terms of H_I, since the height and location of the dose-equivalent maximum in the sphere, the phantom used in the definition of H_I, depend on the energy and the angular distribution of the incident radiation. That is, H_I is not additive with respect to the partial values H_I,i of the different energy and angular components. Hence, 1) the distribution of dose equivalent in the sphere must be measured in full to determine H_I, and 2) it is not possible to calibrate an instrument which does not exhibit the scattering and absorption properties of the sphere consistently for arbitrary radiation fields in terms of H_I. Thus calibration in a unidirectional beam would introduce an uncertainty which may amount to a factor of up to 4. This would hardly be tolerable as a basis for radiation protection provisions. An alternative is to introduce operational quantities which are additive, e.g. 1) the sum of the maxima of the dose-equivalent distributions in the sphere produced by the different radiation components, and 2) the mean dose equivalent in the sphere. Their relation to H_E for different types of radiation, and the consequences for secondary limits, are discussed. (H.K.)

  2. Motor equivalence and structure of variance: multi-muscle postural synergies in Parkinson's disease.

    Science.gov (United States)

    Falaki, Ali; Huang, Xuemei; Lewis, Mechelle M; Latash, Mark L

    2017-07-01

    We explored posture-stabilizing multi-muscle synergies with two methods of analysis of multi-element, abundant systems: (1) Analysis of inter-cycle variance; and (2) Analysis of motor equivalence, both quantified within the framework of the uncontrolled manifold (UCM) hypothesis. Data collected in two earlier studies of patients with Parkinson's disease (PD) were re-analyzed. One study compared synergies in the space of muscle modes (muscle groups with parallel scaling of activation) during tasks performed by early-stage PD patients and controls. The other study explored the effects of dopaminergic medication on multi-muscle-mode synergies. Inter-cycle variance and absolute magnitude of the center of pressure displacement across consecutive cycles were quantified during voluntary whole-body sway within the UCM and orthogonal to the UCM space. The patients showed smaller indices of variance within the UCM and motor equivalence compared to controls. The indices were also smaller in the off-drug compared to on-drug condition. There were strong across-subject correlations between the inter-cycle variance within/orthogonal to the UCM and motor equivalent/non-motor equivalent displacements. This study has shown that, at least for cyclical tasks, analysis of variance and analysis of motor equivalence lead to metrics of stability that correlate with each other and show similar effects of disease and medication. These results show, for the first time, intimate links between indices of variance and motor equivalence. They suggest that analysis of motor equivalence, which requires only a handful of trials, could be used broadly in the field of motor disorders to analyze problems with action stability.

  3. Metrics with vanishing quantum corrections

    International Nuclear Information System (INIS)

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum-corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_μν(g_αβ, ∂_τ g_αβ, ∂_τ ∂_σ g_αβ, ...) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_μν is a multiple of the metric. A Ricci-flat classical solution is called strongly universal if, when evaluated on that Ricci-flat metric, T_μν vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions.

  4. The cyclophosphamide equivalent dose as an approach for quantifying alkylating agent exposure: a report from the Childhood Cancer Survivor Study.

    Science.gov (United States)

    Green, Daniel M; Nolan, Vikki G; Goodman, Pamela J; Whitton, John A; Srivastava, DeoKumar; Leisenring, Wendy M; Neglia, Joseph P; Sklar, Charles A; Kaste, Sue C; Hudson, Melissa M; Diller, Lisa R; Stovall, Marilyn; Donaldson, Sarah S; Robison, Leslie L

    2014-01-01

    Estimation of the risk of adverse long-term outcomes such as second malignant neoplasms and infertility often requires reproducible quantification of exposures. The method for quantification should be easily utilized and valid across different study populations. The widely used Alkylating Agent Dose (AAD) score is derived from the drug dose distribution of the study population and thus cannot be used for comparisons across populations as each will have a unique distribution of drug doses. We compared the performance of the Cyclophosphamide Equivalent Dose (CED), a unit for quantifying alkylating agent exposure independent of study population, to the AAD. Comparisons included associations from three Childhood Cancer Survivor Study (CCSS) outcome analyses, receiver operator characteristic (ROC) curves and goodness of fit based on the Akaike's Information Criterion (AIC). The CED and AAD performed essentially identically in analyses of risk for pregnancy among the partners of male CCSS participants, risk for adverse dental outcomes among all CCSS participants and risk for premature menopause among female CCSS participants, based on similar associations, lack of statistically significant differences between the areas under the ROC curves and similar model fit values for the AIC between models including the two measures of exposure. The CED is easily calculated, facilitating its use for patient counseling. It is independent of the drug dose distribution of a particular patient population, a characteristic that will allow direct comparisons of outcomes among epidemiological cohorts. We recommend the use of the CED in future research assessing cumulative alkylating agent exposure. © 2013 Wiley Periodicals, Inc.
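    The CED idea can be sketched as a dose-weighted sum, with each alkylator's cumulative dose multiplied by a cyclophosphamide-equivalence factor; the non-reference factors below are illustrative placeholders, not the published conversion values, which should be taken from the report itself.

```python
# Sketch of a cyclophosphamide-equivalent dose (CED) calculation: each
# alkylating agent's cumulative dose (mg/m^2) is scaled by a factor
# expressing its potency relative to cyclophosphamide, then summed.
# NOTE: the non-reference factors below are illustrative assumptions;
# use the published conversion factors from the CCSS report in practice.
EQUIVALENCE_FACTORS = {
    "cyclophosphamide": 1.0,  # reference drug, by definition
    "ifosfamide": 0.25,       # illustrative placeholder
    "procarbazine": 0.85,     # illustrative placeholder
}

def ced(doses_mg_per_m2):
    """CED (mg/m^2) = sum over agents of cumulative dose x equivalence factor."""
    return sum(EQUIVALENCE_FACTORS[drug] * dose
               for drug, dose in doses_mg_per_m2.items())
```

    Unlike the AAD score, such a quantity does not depend on the dose distribution of any particular cohort, which is what makes cross-cohort comparison possible.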

  5. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

    In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results, as shown in the application.

  6. Equivalent sphere approximations for skin, eye, and blood-forming organs

    International Nuclear Information System (INIS)

    Maxson, W.L.; Townsend, L.W.; Bier, S.G.

    1996-01-01

    Throughout the manned spaceflight program, protecting astronauts from space radiation has been the subject of intense study. For interplanetary crews, two main sources of radiation hazards are solar particle events (SPEs) and galactic cosmic rays. For nearly three decades, crew doses and related shielding requirements have been assessed using the assumption that body organ exposures are well approximated by exposures at the center of tissue-equivalent spheres. For the skin and for blood-forming organs (BFOs), these spheres have radii of 0 and 5 cm, respectively. Recent studies indicate that significant overestimation of organ doses occurs if these models are used instead of realistic human geometry models. The use of the latter, however, requires much longer computational times. In this work, the authors propose preliminary revisions to these equivalent sphere approximations that yield more realistic dose estimates

  7. Radiation exposure in X-ray studies of the hips

    Energy Technology Data Exchange (ETDEWEB)

    Kainberger, F [Krankenhaus der Barmherzigen Brueder, Salzburg (Austria). Roentgeninstitut

    1979-12-01

    The genetic exposure of the small child is above all a consequence of the rapid increase in X-ray studies, a problem which has not yet been settled. Phantom measurements show that dose reduction is of considerable practical significance when appropriate lead shielding is employed. The radiation dose can be significantly reduced, provided that the shielding material has an appropriate lead equivalent. The form of the pelvic shield used is also of crucial importance.

  8. New equivalent-electrical circuit model and a practical measurement method for human body impedance.

    Science.gov (United States)

    Chinen, Koyu; Kinjo, Ichiko; Zamami, Aki; Irei, Kotoyo; Nagayama, Kanako

    2015-01-01

    Human body impedance analysis is an effective tool for extracting electrical information from tissues in the human body. This paper presents a new method of measuring impedance using armpit electrodes and a new equivalent-circuit model for the human body. The lowest impedance was measured using an LCR meter and six electrodes, including the armpit electrodes. The electrical equivalent-circuit model for the cell consists of a resistance R and a capacitance C: R represents the electrical resistance of the liquid inside and outside the cell, and C represents the high-frequency conduction of the cell membrane. We propose an equivalent-circuit model consisting of five parallel high-frequency-passing CR circuits. The proposed model reproduces the alpha dispersion in the impedance measured at low frequencies, due to ion current outside the cell, and the beta dispersion at high frequencies, due to the cell membrane and the liquid inside the cell. Values calculated with the proposed equivalent-circuit model were consistent with measured values of human body impedance.
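    The five-branch topology can be sketched numerically; component values below are invented for illustration, not the paper's fitted parameters:

```python
# Sketch: total impedance of five parallel series-CR branches, the circuit
# topology described above. Component values are illustrative assumptions,
# not the paper's fitted parameters.
import math

branches = [  # (R in ohms, C in farads), one tuple per branch
    (500.0, 1e-9), (300.0, 1e-8), (200.0, 1e-7), (150.0, 1e-6), (100.0, 1e-5),
]

def total_impedance(freq_hz, branches):
    """Z = 1 / sum_k 1 / (R_k + 1/(j*2*pi*f*C_k))."""
    w = 2.0 * math.pi * freq_hz
    admittance = sum(1.0 / (r + 1.0 / (1j * w * c)) for r, c in branches)
    return 1.0 / admittance
```

    Because each branch has a series capacitor, every branch blocks DC, so |Z| falls with frequency toward the parallel combination of the branch resistances.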

  9. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 ... of (unbounded) metric-adjusted skew information ....

  10. Radiology metrics for safe use and regulatory compliance with CT imaging

    Science.gov (United States)

    Paden, Robert; Pavlicek, William

    2018-03-01

    The MACRA Act creates a Merit-Based Payment System, with monitoring of patient exposure from CT providing one possible quality metric for meeting merit requirements. Quality metrics are also required by The Joint Commission, ACR, and CMS, as facilities are tasked to perform reviews of CT irradiation events outside of expected ranges, review protocols for appropriateness, and validate parameters for low-dose lung cancer screening. In order to efficiently collect and analyze irradiation events and associated DICOM tags, all clinical CT devices were DICOM-connected to a parser, which extracted dose-related information for storage in a database. Dose data from every exam are compared to the appropriate external standard for the exam type. AAPM-recommended CTDIvol values for head and torso, adult and pediatric, coronary and perfusion exams are used for this study. CT doses outside the expected range were automatically formatted into a report for analysis and review documentation. The CT technologist's textual content, i.e. the reason for proceeding with an irradiation above the recommended threshold, is captured for inclusion in follow-up reviews by physics staff. The use of a knowledge-based approach to labeling individual protocol and device settings is a practical solution that makes analysis and review efficient. Manual methods would require approximately 150 person-hours for our facility, exclusive of travel time and independent of device availability; use of this informatics tool yields a time savings of 89%, including the low-dose CT comparison review and the low-dose lung cancer screening requirements set forth by CMS.
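    A hedged sketch of the threshold check such a parser might apply (reference values and record fields are illustrative placeholders, not the AAPM recommendations or actual DICOM tag handling):

```python
# Sketch: flag CT irradiation events whose CTDIvol exceeds an expected
# reference value for the exam type. Reference numbers and record fields
# below are illustrative placeholders, not AAPM recommendations.
REFERENCE_CTDIVOL_MGY = {
    ("head", "adult"): 75.0,
    ("torso", "adult"): 25.0,
    ("head", "pediatric"): 40.0,
}

def flag_outliers(events):
    """Return events whose CTDIvol exceeds the reference for their exam type."""
    flagged = []
    for e in events:
        ref = REFERENCE_CTDIVOL_MGY.get((e["region"], e["patient_class"]))
        if ref is not None and e["ctdi_vol_mgy"] > ref:
            flagged.append({**e, "reference_mgy": ref})
    return flagged

events = [
    {"region": "head", "patient_class": "adult", "ctdi_vol_mgy": 60.0},
    {"region": "torso", "patient_class": "adult", "ctdi_vol_mgy": 31.5},
]
```

    Flagged events would then be routed to a report along with the technologist's stated reason for exceeding the threshold, as described above.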

  11. Attitudes, knowledge and practices of healthcare workers regarding occupational exposure of pulmonary tuberculosis

    Directory of Open Access Journals (Sweden)

    Lesley T. Bhebhe

    2014-01-01

    Background: Healthcare-associated tuberculosis (TB) has become a major occupational hazard for healthcare workers (HCWs). HCWs are inevitably exposed to TB, owing to frequent interaction with patients with undiagnosed and potentially contagious TB. Whenever there is a possibility of exposure, implementation of infection prevention and control (IPC) practices is critical. Objective: Following a high incidence of TB among HCWs at Maluti Adventist Hospital in Lesotho, a study was carried out to assess the knowledge, attitudes and practices of HCWs regarding healthcare-associated TB infection and infection controls. Methods: This was a cross-sectional study performed in June 2011; it involved HCWs at Maluti Adventist Hospital who worked with patients and/or sputum. Stratified sampling of 140 HCWs was performed, of whom 129 (92.0%) took part. A self-administered, semi-structured questionnaire was used. Results: Most respondents (89.2%) had appropriate knowledge of the transmission, diagnosis and prevention of TB; however, only 22.0% of the respondents knew the appropriate method of sputum collection. All of the respondents (100.0%) were motivated and willing to implement IPC measures. A significant proportion of participants (36.4%) reported poor infection control practices, with the majority of inappropriate practices involving administrative infection controls (> 80.0%). Only 38.8% of the participants reported using the appropriate N-95 respirator. Conclusion: Poor infection control practices regarding occupational TB exposure were demonstrated, the worst being the first-line administrative infection controls. Critical knowledge gaps were identified; however, there was encouraging willingness by HCWs to adapt to recommended infection control measures.

  12. Societal impact metrics for non-patentable research in dentistry

    Energy Technology Data Exchange (ETDEWEB)

    Hicks, D.; Isett, K.; Melkers, J.; Song, L.; Trivedi, R.

    2016-07-01

    Indicators of research impact tend to revolve around patents, licenses and startups. However, much university research is non-patentable and therefore does not register in those metrics. That does not mean such research lacks impact, only that it follows different pathways to use in society. Without the visibility of patents, license income and jobs created in startups, society risks ignoring or discounting the societal impact of such research, and therefore undervaluing the research itself. To make visible the importance of research advances underpinning broader societal advances, in this project we explore the possibility of developing metrics of research impact for research whose results are relevant to professional practice. (Author)

  13. Challenges and perspectives of nanoparticle exposure assessment.

    Science.gov (United States)

    Lee, Ji Hyun; Moon, Min Chaul; Lee, Joon Yeob; Yu, Il Je

    2010-06-01

    Nanoparticle exposure assessment presents a unique challenge in the field of occupational and environmental health. With the commercialization of nanotechnology, exposure usually starts in the workplace and then spreads to environmental and consumer exposure. This report discusses current trends in nanoparticle exposure assessment, including the definition of nanotechnology-relevant terms, essential physicochemical properties for nanomaterial characterization, current international activities related to nanomaterial safety, and exposure assessment standard development for nanotechnology. Further, this report describes challenges of nanoparticle exposure assessment such as background measurement, the metrics of nanoparticle exposure assessment, and personal sampling.

  14. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  15. Influence of Musical Enculturation on Brain Responses to Metric Deviants

    Directory of Open Access Journals (Sweden)

    Niels T. Haumann

    2018-04-01

    The ability to recognize metric accents is fundamental in both music and language perception. It has been suggested that music listeners prefer rhythms that follow simple binary meters, which are common in Western music. This means that listeners expect odd-numbered beats to be strong and even-numbered beats to be weak. In support of this, studies have shown that listeners exposed to Western music show stronger novelty- and incongruity-related P3 and irregularity-detection-related mismatch negativity (MMN) brain responses to attenuated odd- than attenuated even-numbered metric positions. Furthermore, behavioral evidence suggests that music listeners' preferences can be changed by long-term exposure to non-Western rhythms and meters, e.g., by listening to African or Balkan music. In our study, we investigated whether it might be possible to measure effects of music enculturation on neural responses to attenuated tones at specific metric positions. We compared the magnetic mismatch negativity (MMNm) to attenuated beats in a “Western group” of listeners (n = 12) mainly exposed to Western music and a “Bicultural group” of listeners (n = 13) exposed for at least 1 year to Sub-Saharan African music in addition to Western music. We found that in the “Western group” the MMNm was higher in amplitude to deviant tones on odd compared to even metric positions, but not in the “Bicultural group.” In support of this finding, there was also a trend for the “Western group” to rate omitted beats as more surprising on odd than even metric positions, whereas the “Bicultural group” seemed to discriminate less between metric positions in terms of surprise ratings. Also, we observed that the overall latency of the MMNm was significantly shorter in the Bicultural group than in the Western group. These effects were not biased by possible differences in rhythm perception ability or music training, measured with the Musical Ear Test (MET).

  16. Influence of Musical Enculturation on Brain Responses to Metric Deviants.

    Science.gov (United States)

    Haumann, Niels T; Vuust, Peter; Bertelsen, Freja; Garza-Villarreal, Eduardo A

    2018-01-01

    The ability to recognize metric accents is fundamental in both music and language perception. It has been suggested that music listeners prefer rhythms that follow simple binary meters, which are common in Western music. This means that listeners expect odd-numbered beats to be strong and even-numbered beats to be weak. In support of this, studies have shown that listeners exposed to Western music show stronger novelty- and incongruity-related P3 and irregularity-detection-related mismatch negativity (MMN) brain responses to attenuated odd- than attenuated even-numbered metric positions. Furthermore, behavioral evidence suggests that music listeners' preferences can be changed by long-term exposure to non-Western rhythms and meters, e.g., by listening to African or Balkan music. In our study, we investigated whether it might be possible to measure effects of music enculturation on neural responses to attenuated tones at specific metric positions. We compared the magnetic mismatch negativity (MMNm) to attenuated beats in a "Western group" of listeners (n = 12) mainly exposed to Western music and a "Bicultural group" of listeners (n = 13) exposed for at least 1 year to Sub-Saharan African music in addition to Western music. We found that in the "Western group" the MMNm was higher in amplitude to deviant tones on odd compared to even metric positions, but not in the "Bicultural group." In support of this finding, there was also a trend for the "Western group" to rate omitted beats as more surprising on odd than even metric positions, whereas the "Bicultural group" seemed to discriminate less between metric positions in terms of surprise ratings. Also, we observed that the overall latency of the MMNm was significantly shorter in the Bicultural group than in the Western group. These effects were not biased by possible differences in rhythm perception ability or music training, measured with the Musical Ear Test (MET). Furthermore, source localization analyses ...

  17. Considerations on development, validation, application, and quality control of immuno(metric) biomarker assays in clinical cancer research: an EORTC-NCI working group report.

    NARCIS (Netherlands)

    Sweep, C.G.J.; Fritsche, H.A.; Gion, M.; Klee, G.G.; Schmitt, M.

    2003-01-01

    A major dilemma associated with immuno(metric) assays for biomarkers is that various kits employing antibodies with differing specificities and binding affinities may generate non-equivalent test results. Also, variation in sample processing and the use of different standards (reference material)

  18. Survey of radiologic practices among dental practitioners

    International Nuclear Information System (INIS)

    Goren, A.D.; Sciubba, J.J.; Friedman, R.; Malamud, H.

    1989-01-01

    The purpose of this study was to determine the factors that influence and contribute to patient exposure in radiologic procedures performed in the offices of 132 staff members within the dental department of a teaching hospital. A questionnaire was prepared requesting data on the brands of film used, the type of x-ray unit used, processing, and use of a leaded apron, cervical shield, and film holder. Offices were also visited to evaluate the performance of existing dental x-ray equipment. Both the Dental Radiographic Normalizing and Monitoring Device and the Dental Quality Control Test Tool were evaluated. The average exposure was equivalent to that of class D film (220 mR), but only 13% of those surveyed used the faster class E film, which would cut patient exposure in half. The survey indicates that dentists are not using the newer low-exposure class E film in their practices.

  19. External gamma exposure to radon progeny in indoor air

    International Nuclear Information System (INIS)

    Fujimoto, Kenzo

    1985-01-01

    The external γ-exposure from radon progeny uniformly distributed in indoor air was estimated with a computer program developed for this purpose. The program calculates the fluence rate, exposure rate and average photon energy for any given point in a room of any given size. As a numerical example, the exposure rate normalized to unit airborne activity is presented, together with the fluence-weighted and exposure-weighted average photon energies, for a room of representative geometry containing radon progeny in equilibrium. To cover other conditions encountered in practice, quantitative evaluations are additionally presented of the effect on exposure brought about by changes in certain parameters, such as the equilibrium factor, wall thickness, room size and receptor position. The study quantitatively substantiates the prevailing postulate that the effective dose equivalent due to external exposure from normal indoor concentrations of airborne radon progeny in a room of representative geometry should only amount to 0.04% of that from internal exposure from the same sources, and that it should be of a similarly negligible order compared with internal exposure for other room geometries as well. (author)
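    The geometric core of such a calculation can be sketched as a midpoint-rule integration of the unattenuated point-kernel over the room volume (a deliberate simplification: air attenuation, wall scatter and buildup, which the program described above accounts for, are ignored here):

```python
# Crude sketch of the geometric part of such a calculation: photon fluence
# rate at a receptor point from a source distributed uniformly in a
# rectangular room, ignoring air attenuation and wall scatter
# (simplifying assumptions; the program described above models more).
import math

def fluence_rate(room_dims, receptor, source_per_m3, cells_per_m=10):
    """phi ~= sum over grid cells of S * dV / (4 * pi * r^2), midpoint rule."""
    lx, ly, lz = room_dims
    nx, ny, nz = (max(1, round(l * cells_per_m)) for l in room_dims)
    dv = (lx / nx) * (ly / ny) * (lz / nz)
    acc = 0.0
    for i in range(nx):
        x = (i + 0.5) * lx / nx
        for j in range(ny):
            y = (j + 0.5) * ly / ny
            for k in range(nz):
                z = (k + 0.5) * lz / nz
                r2 = ((x - receptor[0]) ** 2 + (y - receptor[1]) ** 2
                      + (z - receptor[2]) ** 2)
                acc += dv / (4.0 * math.pi * r2)
    return source_per_m3 * acc  # photons per m^2 per second
```

    Consistent with the receptor-position dependence studied above, a receptor at the room centre sees a higher fluence than one in a corner.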

  20. Equivalence and precision of knee cartilage morphometry between different segmentation teams, cartilage regions, and MR acquisitions

    Science.gov (United States)

    Schneider, E; Nevitt, M; McCulloch, C; Cicuttini, FM; Duryea, J; Eckstein, F; Tamez-Pena, J

    2012-01-01

    Objective To compare precision and evaluate equivalence of femorotibial cartilage volume (VC) and mean cartilage thickness (ThCtAB.Me) from independent segmentation teams using identical MR images from three series: sagittal 3D Dual Echo in the Steady State (DESS), coronal multi-planar reformat (DESS-MPR) of DESS and coronal 3D Fast Low Angle SHot (FLASH). Design 19 subjects underwent test-retest MR imaging at 3 Tesla. Four teams segmented the cartilage using prospectively defined plate regions and rules. Mixed-models analysis of the pooled data was used to evaluate the effect of acquisition, team and plate on precision, and Pearson correlations and mixed models were used to evaluate equivalence. Results Segmentation team differences dominated measurement variability in most cartilage regions for all image series. Precision of VC and ThCtAB.Me differed significantly by team and cartilage plate, but not between FLASH and DESS. Mean values of VC and ThCtAB.Me differed by team (P<0.05) for DESS, FLASH and DESS-MPR; FLASH VC was 4–6% larger than DESS in the medial tibia and lateral central femur, and FLASH ThCtAB.Me was 5–6% larger in the medial tibia, but 4–8% smaller in the medial central femur. Correlations between DESS and FLASH for VC and ThCtAB.Me were high (r=0.90–0.97), except for DESS versus FLASH medial central femur ThCtAB.Me (r=0.81–0.83). Conclusions Cartilage morphology metrics from different image contrasts had similar precision, were generally equivalent, and may be combined for cross-sectional analyses if potential systematic offsets are accounted for. Data from different teams should not be pooled unless equivalence is demonstrated for the cartilage metrics of interest. PMID:22521758

  1. Sensory Metrics of Neuromechanical Trust.

    Science.gov (United States)

    Softky, William; Benford, Criscillia

    2017-09-01

    Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and

  2. Classification in medical image analysis using adaptive metric k-NN

    DEFF Research Database (Denmark)

    Chen, Chen; Chernoff, Konstantin; Karemore, Gopal

    2010-01-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier ...
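    As a hedged illustration of the idea (not the paper's method): a k-NN classifier whose metric adapts to the data via inverse-variance feature scaling, i.e. a diagonal Mahalanobis distance, contrasted with plain Euclidean distance.

```python
# Sketch: k-NN classification with an adaptive, feature-scaled metric
# (a diagonal Mahalanobis distance with inverse-variance weights), an
# illustrative stand-in for the learned metrics studied in the paper.
import math
from collections import Counter

def feature_scales(points):
    """Per-feature inverse standard deviations estimated from the data."""
    n, dims = len(points), len(points[0])
    scales = []
    for d in range(dims):
        mean = sum(p[d] for p in points) / n
        var = sum((p[d] - mean) ** 2 for p in points) / n
        scales.append(1.0 / math.sqrt(var) if var > 0 else 1.0)
    return scales

def knn_predict(train, labels, query, k=3, scales=None):
    """Majority vote among the k training points nearest to the query."""
    dims = len(query)
    w = scales if scales is not None else [1.0] * dims
    def dist(p):
        return math.sqrt(sum((w[d] * (p[d] - query[d])) ** 2 for d in range(dims)))
    nearest = sorted(range(len(train)), key=lambda i: dist(train[i]))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]
```

    With one informative small-scale feature and one noisy large-scale feature, plain Euclidean distance is dominated by the noisy feature, while the scaled metric recovers the informative one.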

  3. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  4. The metric system: An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Lumley, S.M.

    1995-05-01

    On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  5. Standard Practice for Exposure of Cover Materials for Solar Collectors to Natural Weathering Under Conditions Simulating Operational Mode

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1995-01-01

    1.1 This practice provides a procedure for the exposure of cover materials for flat-plate solar collectors to the natural weather environment at temperatures that are elevated to approximate operating conditions. 1.2 This practice is suitable for exposure of both glass and plastic solar collector cover materials. Provisions are made for exposure of single and double cover assemblies to accommodate the need for exposure of both inner and outer solar collector cover materials. 1.3 This practice does not apply to cover materials for evacuated collectors or photovoltaics. 1.4 The values stated in SI units are to be regarded as the standard. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  6. A multi-metric assessment of environmental contaminant exposure and effects in an urbanized reach of the Charles River near Watertown, Massachusetts

    Science.gov (United States)

    Smith, Stephen B.; Anderson, Patrick J.; Baumann, Paul C.; DeWeese, Lawrence R.; Goodbred, Steven L.; Coyle, James J.; Smith, David S.

    2012-01-01

    The Charles River Project provided an opportunity to simultaneously deploy a combination of biomonitoring techniques routinely used by the U.S. Geological Survey National Water Quality Assessment Program, the Biomonitoring of Environmental Status and Trends Project, and the Contaminant Biology Program at an urban site suspected to be contaminated with polycyclic aromatic hydrocarbons. In addition to these standardized methods, additional techniques were used to further elucidate contaminant exposure and potential impacts of exposure on biota. The purpose of the study was to generate a comprehensive, multi-metric data set to support assessment of contaminant exposure and effects at the site. Furthermore, the data set could be assessed to determine the relative performance of the standardized method suites typically used by the National Water Quality Assessment Program and the Biomonitoring of Environmental Status and Trends Project, as well as the additional biomonitoring methods used in the study to demonstrate ecological effects of contaminant exposure. The Contaminant Effects Workgroup, an advisory committee of the U.S. Geological Survey/Contaminant Biology Program, identified polycyclic aromatic hydrocarbons as the contaminant class of greatest concern in urban streams of all sizes. The reach of the Charles River near Watertown, Massachusetts, was selected as the site for this study based on the suspected presence of polycyclic aromatic hydrocarbon contamination and the presence of common carp (Cyprinus carpio), largemouth bass (Micropterus salmoides), and white sucker (Catostomus commersoni). All of these fish have extensive contaminant-exposure profiles related to polycyclic aromatic hydrocarbons and other environmental contaminants. This project represented a collaboration of universities, Department of the Interior bureaus including multiple components of the USGS (Biological Resources Discipline and Water Resources Discipline Science Centers, the

  7. Accounting for no net loss: A critical assessment of biodiversity offsetting metrics and methods.

    Science.gov (United States)

    Carreras Gamarra, Maria Jose; Lassoie, James Philip; Milder, Jeffrey

    2018-08-15

    Biodiversity offset strategies are based on the explicit calculation of both losses and gains necessary to establish ecological equivalence between impact and offset areas. Given the importance of quantifying biodiversity values, various accounting methods and metrics are continuously being developed and tested for this purpose. Considering the wide array of alternatives, selecting an appropriate one for a specific project can be not only challenging, but also crucial; accounting methods can strongly influence the biodiversity outcomes of an offsetting strategy, and if not well-suited to the context and values being offset, a no net loss outcome might not be delivered. To date there has been no systematic review or comparative classification of the available biodiversity accounting alternatives that aim at facilitating metric selection, and no tools that guide decision-makers throughout such a complex process. We fill this gap by developing a set of analyses to support (i) identifying the spectrum of available alternatives, (ii) understanding the characteristics of each and, ultimately (iii) making the most sensible and sound decision about which one to implement. The metric menu, scoring matrix, and decision tree developed can be used by biodiversity offsetting practitioners to help select an existing metric, and thus achieve successful outcomes that advance the goal of no net loss of biodiversity. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Attack-Resistant Trust Metrics

    Science.gov (United States)

    Levien, Raph

    The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet, with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato Website, which is a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.

  9. Performance evaluation of objective quality metrics for HDR image compression

    Science.gov (United States)

    Valenzise, Giuseppe; De Simone, Francesca; Lauga, Paul; Dufaux, Frederic

    2014-09-01

    Due to the much larger luminance and contrast characteristics of high dynamic range (HDR) images, well-known objective quality metrics, widely used for the assessment of low dynamic range (LDR) content, cannot be directly applied to HDR images in order to predict their perceptual fidelity. To overcome this limitation, advanced fidelity metrics, such as the HDR-VDP, have been proposed to accurately predict visually significant differences. However, their complex calibration may make them difficult to use in practice. A simpler approach consists of computing arithmetic or structural fidelity metrics, such as PSNR and SSIM, on perceptually encoded luminance values, but the performance of quality prediction in this case has not been clearly studied. In this paper, we aim at providing a better comprehension of the limits and the potentialities of this approach, by means of a subjective study. We compare the performance of HDR-VDP to that of PSNR and SSIM computed on perceptually encoded luminance values, when considering compressed HDR images. Our results show that these simpler metrics can be effectively employed to assess image fidelity for applications such as HDR image compression.
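
    The "simpler approach" can be sketched as follows: compute PSNR not on raw HDR luminance but on perceptually encoded values. The snippet below uses a plain log10 encoding as a stand-in for a perceptual (PU-style) transfer function and synthetic luminance data; both are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def psnr(ref, test, peak):
    """Peak signal-to-noise ratio in dB for signals with the given peak range."""
    mse = np.mean((ref - test) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def perceptual_encode(lum):
    """log10 luminance: a crude stand-in for a perceptual (PU-style) encoding."""
    return np.log10(np.clip(lum, 1e-4, None))

rng = np.random.default_rng(0)
ref = rng.uniform(0.1, 1000.0, size=(64, 64))         # HDR luminance in cd/m^2
deg = ref * rng.normal(1.0, 0.01, size=ref.shape)     # ~1% multiplicative error

e_ref, e_deg = perceptual_encode(ref), perceptual_encode(deg)
peak = e_ref.max() - e_ref.min()                      # dynamic range of encoded signal
print(f"PSNR on encoded luminance: {psnr(e_ref, e_deg, peak):.1f} dB")
```

Computing PSNR directly on the linear luminance values would instead be dominated by the brightest pixels, which is exactly the mismatch with perception that the encoding step addresses.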

  10. Young worker safety in construction: do family ties and workgroup size affect hazard exposures and safety practices?

    Science.gov (United States)

    Rauscher, Kimberly J; Myers, Douglas J; Runyan, Carol W; Schulman, Michael

    2012-01-01

    Little is known about how social aspects of the work environment influence exposures or safety practices affecting young construction workers. Our objective was to investigate whether working on a construction site with a small number of workers (≤10 vs. 11-50) or having a family-firm connection (working in a family-owned firm or one in which a family member also works) impacts hazard exposures and safety practices. Participants included 187 North Carolina construction workers 14 to 17 years old who were surveyed about their jobs. We conducted stratified analyses using cross-tabulations and chi-square statistics to measure associations between workgroup size (i.e., the total number of workers on a jobsite) and family-firm connections (yes/no) and hazard exposures (e.g., saws) and safety practices (e.g., supervision). Having a family-firm connection was associated with fewer hazard exposures and greater safety practices. Youth who worked on jobsites with a larger workgroup (11-50 workers) reported more hazards but also more safety practices. Family-firm connections, in particular, may have a protective effect for youth in construction. Even though the statistical significance of our findings on workgroup size was limited in places, the pattern of differences found suggests that further research in this area is warranted.

  11. The use of the kurtosis metric in the evaluation of occupational hearing loss in workers in China: Implications for hearing risk assessment

    Directory of Open Access Journals (Sweden)

    Robert I Davis

    2012-01-01

    This study examined: (1) the value of using the statistical metric, kurtosis [β(t)], along with an energy metric to determine the hazard to hearing from high level industrial noise environments, and (2) the accuracy of the International Standard Organization (ISO-1999:1990) model for median noise-induced permanent threshold shift (NIPTS) estimates with actual recent epidemiological data obtained on 240 highly screened workers exposed to high-level industrial noise in China. A cross-sectional approach was used in this study. Shift-long temporal waveforms of the noise that workers were exposed to for evaluation of noise exposures and audiometric threshold measures were obtained on all selected subjects. The subjects were exposed to only one occupational noise exposure without the use of hearing protection devices. The results suggest that: (1) the kurtosis metric is an important variable in determining the hazards to hearing posed by a high-level industrial noise environment for hearing conservation purposes, i.e., the kurtosis differentiated between the hazardous effects produced by Gaussian and non-Gaussian noise environments, (2) the ISO-1999 predictive model does not accurately estimate the degree of median NIPTS incurred from high-level, high-kurtosis industrial noise, and (3) the inherent large variability in NIPTS among subjects emphasizes the need to develop and analyze a larger database of workers with well-documented exposures to better understand the effect of kurtosis on NIPTS incurred from high level industrial noise exposures. A better understanding of the role of the kurtosis metric may lead to its incorporation into a new generation of more predictive hearing risk assessment for occupational noise exposure.

  12. The use of the kurtosis metric in the evaluation of occupational hearing loss in workers in China: implications for hearing risk assessment.

    Science.gov (United States)

    Davis, Robert I; Qiu, Wei; Heyer, Nicholas J; Zhao, Yiming; Qiuling Yang, M S; Li, Nan; Tao, Liyuan; Zhu, Liangliang; Zeng, Lin; Yao, Daohua

    2012-01-01

    This study examined: (1) the value of using the statistical metric, kurtosis [β(t)], along with an energy metric to determine the hazard to hearing from high level industrial noise environments, and (2) the accuracy of the International Standard Organization (ISO-1999:1990) model for median noise-induced permanent threshold shift (NIPTS) estimates with actual recent epidemiological data obtained on 240 highly screened workers exposed to high-level industrial noise in China. A cross-sectional approach was used in this study. Shift-long temporal waveforms of the noise that workers were exposed to for evaluation of noise exposures and audiometric threshold measures were obtained on all selected subjects. The subjects were exposed to only one occupational noise exposure without the use of hearing protection devices. The results suggest that: (1) the kurtosis metric is an important variable in determining the hazards to hearing posed by a high-level industrial noise environment for hearing conservation purposes, i.e., the kurtosis differentiated between the hazardous effects produced by Gaussian and non-Gaussian noise environments, (2) the ISO-1999 predictive model does not accurately estimate the degree of median NIPTS incurred from high-level, high-kurtosis industrial noise, and (3) the inherent large variability in NIPTS among subjects emphasizes the need to develop and analyze a larger database of workers with well-documented exposures to better understand the effect of kurtosis on NIPTS incurred from high level industrial noise exposures. A better understanding of the role of the kurtosis metric may lead to its incorporation into a new generation of more predictive hearing risk assessment for occupational noise exposure.
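
    The kurtosis metric β itself is straightforward to compute from a noise waveform. The sketch below, using synthetic signals rather than the study's recordings, shows how rare high-level impulses push kurtosis well above the Gaussian value of 3 even when the overall energy changes little.

```python
import numpy as np

def kurtosis(x):
    """Kurtosis beta = E[(x - mu)^4] / sigma^4; Gaussian noise gives ~3."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    return float(np.mean((x - mu) ** 4) / sigma ** 4)

rng = np.random.default_rng(1)
n = 200_000
gaussian = rng.normal(0.0, 1.0, n)                    # steady-state noise
impulsive = gaussian + (rng.random(n) < 0.001) * 20.0 # rare high-level impacts

print(f"Gaussian noise:  beta = {kurtosis(gaussian):.1f}")   # close to 3
print(f"Impulsive noise: beta = {kurtosis(impulsive):.1f}")  # far above 3
```

Two exposures with the same A-weighted energy can thus have very different β values, which is the distinction the authors argue an energy metric alone misses.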

  13. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric g μν and a Killing tensor K μν is studied. Conditions were found under which the symmetries of the metric g μν and the dual metric K μν are the same. Dual spinning space was constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric

  14. Quantitative comparison of mutagenic hazards: rad-equivalences

    International Nuclear Information System (INIS)

    1980-01-01

    The present situation concerning the problem of estimating genetic risks associated with the exposure of living beings, including man, to chemical compounds present in the environment is defined. Since these compounds affect the genetic material of cells by reactions similar to those produced by radiations, attempts have been made to establish rad-equivalences for some of these substances. This idea is discussed through the different publications mentioned.

  15. Weyl metrics and wormholes

    Energy Technology Data Exchange (ETDEWEB)

    Gibbons, Gary W. [DAMTP, University of Cambridge, Wilberforce Road, Cambridge, CB3 0WA U.K. (United Kingdom); Volkov, Mikhail S., E-mail: gwg1@cam.ac.uk, E-mail: volkov@lmpt.univ-tours.fr [Laboratoire de Mathématiques et Physique Théorique, LMPT CNRS—UMR 7350, Université de Tours, Parc de Grandmont, Tours, 37200 France (France)

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.

  16. Characterization of exposure to extremely low frequency magnetic fields using multidimensional analysis techniques.

    Science.gov (United States)

    Verrier, A; Souques, M; Wallet, F

    2005-05-01

    Our lack of knowledge about the biological mechanisms of 50 Hz magnetic fields makes it hard to improve exposure assessment. To provide better information about these exposure measures, we use multidimensional analysis techniques to examine the relations between different exposure metrics for a group of subjects. We used a combination of a two stage Principal Component Analysis (PCA) followed by an ascending hierarchical classification (AHC) to identify a set of measures that would capture the characteristics of the total exposure. This analysis gives an indication of the aspects of the exposure that are important to capture to get a complete picture of the magnetic field environment. We calculated 44 metrics of exposure measures from 16 exposed EDF employees and 15 control subjects, containing approximately 20,000 recordings of magnetic field measurements, taken every 30 s for 7 days with an EMDEX II dosimeter. These metrics included parameters used routinely or occasionally and some that were new. To eliminate those that expressed the least variability and that were most highly correlated to one another, we began with an initial Principal Component Analysis (PCA). A second PCA of the remaining 12 metrics enabled us to identify from the foreground 82.7% of the variance: the first component (62.0%) was characterized by central tendency metrics, and the second (20.7%) by dispersion characteristics. We were able to use AHC to divide the entire sample (of individuals) into four groups according to the axes that emerged from the PCA. Finally, discriminant analysis tested the discriminant power of the variables in the exposed/control classification as well as those from the AHC classification. The first showed that two subjects had been incorrectly classified, while no classification error was observed in the second. This exploratory study underscores the need to improve exposure measures by using at least two dimensions: intensity and dispersion. It also indicates the
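
    The two-stage analysis described here (PCA to reduce the exposure metrics, then ascending hierarchical classification on the retained components) can be sketched roughly as below. The synthetic data, with one "central tendency" factor and one "dispersion" factor, stands in for the actual dosimeter metrics, and Ward linkage is assumed as one common choice for the AHC step.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# hypothetical data: 31 subjects x 12 retained metrics, driven by two
# latent dimensions (central tendency and dispersion) plus small noise
central = rng.normal(0, 1, (31, 1)) @ np.ones((1, 6)) + rng.normal(0, 0.1, (31, 6))
disp = rng.normal(0, 1, (31, 1)) @ np.ones((1, 6)) + rng.normal(0, 0.1, (31, 6))
X = np.hstack([central, disp])

# PCA via SVD on standardized metrics
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
var_explained = s ** 2 / np.sum(s ** 2)
scores = Z @ Vt[:2].T                      # subject coordinates on the first two axes

# ascending hierarchical classification (Ward linkage) into four groups
groups = fcluster(linkage(scores, method="ward"), t=4, criterion="maxclust")
print(var_explained[:2].round(2), len(set(groups)))
```

With data of this structure, the first two components absorb nearly all the variance, mirroring the 82.7% reported for the 12 retained metrics.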

  17. The monetary value of the collective dose equivalent unit (person-rem)

    International Nuclear Information System (INIS)

    Rodgers, Reginald C.

    1978-01-01

    In the design and operation of nuclear power reactor facilities, it is recommended that radiation exposures to the workers and the general public be kept 'as low as reasonably achievable' (ALARA). In the process of implementing this principle, cost-benefit evaluations are part of the decision making process. For this reason a monetary value has to be assigned to the collective dose equivalent unit (person-rem). The various factors such as medical health care, societal penalty and manpower replacement/saving are essential ingredients to determine a monetary value for the person-rem. These factors and their dependence on the level of risk (or exposure level) are evaluated. Monetary values of well under $100 are determined for the public dose equivalent unit. The occupational worker person-rem value is determined to be in the range of $500 to about $5000 depending on the exposure level and the type of worker and his affiliation, i.e., temporary or permanent. A discussion of the variability and the range of the monetary values will be presented. (author)
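
    In ALARA cost-benefit terms, the person-rem value enters as a simple product: the monetary benefit of a protective measure is the collective dose it averts times the assigned dollar value. The figures below are hypothetical, using the occupational range quoted above only as an order of magnitude.

```python
# hypothetical ALARA trade-off: is a shielding upgrade justified?
def averted_dose_value(person_rem_averted, dollars_per_person_rem):
    """Monetary benefit of the collective dose a protective measure averts."""
    return person_rem_averted * dollars_per_person_rem

upgrade_cost = 40_000.0                           # one-time cost of the measure, $
benefit = averted_dose_value(25.0, 2_000.0)       # 25 person-rem at $2000 each
print(benefit, benefit > upgrade_cost)            # 50000.0 True
```

The same arithmetic with a $500 valuation would reverse the decision, which is why the choice of person-rem value dominates such evaluations.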

  18. Guidelines for personal exposure monitoring of chemicals: Part III.

    Science.gov (United States)

    Hashimoto, Haruo; Yamada, Kenichi; Hori, Hajime; Kumagai, Shinji; Murata, Masaru; Nagoya, Toshio; Nakahara, Hirohiko; Mochida, Nobuyuki

    2018-01-25

    This Document, "Guidelines for personal exposure monitoring of chemicals" ("this Guideline"), has been prepared by "The Committee for Personal Exposure Monitoring" ("the Committee") of the Expert Division of Occupational Hygiene & Ergonomics, Japan Society for Occupational Health. Considering the background of the growing importance of personal exposure monitoring in risk assessment and the need to prepare for the introduction of monitoring using personal samplers from an administrative perspective in recent years, the Committee was organized in November 2012. The Committee has prepared this Guideline as a "practical guideline" for personal exposure monitoring, so as to offer proposals and recommendations to the members of the Japan Society for Occupational Health and to society in general. The scope of this Guideline covers all chemical substances and all related workplaces regarded as targets for general assessment and the management of risk. It thus is not to be considered to comment on legal regulations and methodology. The main text provides the basic methods and concepts of personal exposure monitoring, while 31 "Appendices" are provided in this Guideline throughout the series; technical descriptions, statistical bases, and actual workplace examples are provided in these appendices, to assist better understanding. The personal exposure monitoring described as per this Guideline is equivalent to an "expert-centered basic method to reasonably proceed with the assessment and management of risk at workplaces." It is considered that practicing and expanding on this method will significantly contribute in reforming the overall framework of occupational hygiene management in Japan.

  19. Guidelines for personal exposure monitoring of chemicals: Part IV.

    Science.gov (United States)

    Hashimoto, Haruo; Yamada, Kenichi; Hori, Hajime; Kumagai, Shinji; Murata, Masaru; Nagoya, Toshio; Nakahara, Hirohiko; Mochida, Nobuyuki

    2018-03-27

    This Document, "Guidelines for personal exposure monitoring of chemicals" ("this Guideline"), has been prepared by "The Committee for Personal Exposure Monitoring" ("the Committee") of the Expert Division of Occupational Hygiene & Ergonomics, Japan Society for Occupational Health. Considering the background of the growing importance of personal exposure monitoring in risk assessment and the need to prepare for the introduction of monitoring using personal samplers from an administrative perspective in recent years, the Committee was organized in November 2012. The Committee has prepared this Guideline as a "practical guideline" for personal exposure monitoring, so as to offer proposals and recommendations to the members of the Japan Society for Occupational Health and to society in general. The scope of this Guideline covers all chemical substances and all related workplaces regarded as targets for general assessment and the management of risk. It thus is not to be considered to comment on legal regulations and methodology. The main text provides the basic methods and concepts of personal exposure monitoring, while 31 "Appendices" are provided in this Guideline throughout the series; technical descriptions, statistical bases, and actual workplace examples are provided in these appendices, to assist better understanding. The personal exposure monitoring described as per this Guideline is equivalent to an "expert-centered basic method to reasonably proceed with the assessment and management of risk at workplaces." It is considered that practicing and expanding on this method will significantly contribute in reforming the overall framework of occupational hygiene management in Japan.

  20. Guidelines for personal exposure monitoring of chemicals: Part II.

    Science.gov (United States)

    Hashimoto, Haruo; Yamada, Kenichi; Hori, Hajime; Kumagai, Shinji; Murata, Masaru; Nagoya, Toshio; Nakahara, Hirohiko; Mochida, Nobuyuki

    2017-11-25

    This Document, "Guidelines for personal exposure monitoring of chemicals" ("this Guideline"), has been prepared by "The Committee for Personal Exposure Monitoring" ("the Committee") of the Expert Division of Occupational Hygiene & Ergonomics, Japan Society for Occupational Health. Considering the background of the growing importance of personal exposure monitoring in risk assessment and the need to prepare for the introduction of monitoring using personal samplers from an administrative perspective in recent years, the Committee was organized in November 2012. The Committee has prepared this Guideline as a "practical guideline" for personal exposure monitoring, so as to offer proposals and recommendations to the members of the Japan Society for Occupational Health and to society in general. The scope of this Guideline covers all chemical substances and all related workplaces regarded as targets for general assessment and the management of risk. It thus is not to be considered to comment on legal regulations and methodology. The main text provides the basic methods and concepts of personal exposure monitoring, while 31 "Appendices" are provided in this Guideline throughout the series; technical descriptions, statistical bases, and actual workplace examples are provided in these appendices, to assist better understanding. The personal exposure monitoring described as per this Guideline is equivalent to an "expert-centered basic method to reasonably proceed with the assessment and management of risk at workplaces." It is considered that practicing and expanding on this method will significantly contribute in reforming the overall framework of occupational hygiene management in Japan.

  1. Guidelines for personal exposure monitoring of chemicals: Part I.

    Science.gov (United States)

    Hashimoto, Haruo; Yamada, Kenichi; Hori, Hajime; Kumagai, Shinji; Murata, Masaru; Nagoya, Toshio; Nakahara, Hirohiko; Mochida, Nobuyuki

    2017-09-28

    This Document, "Guidelines for personal exposure monitoring of chemicals" ("this Guideline"), has been prepared by "The Committee for Personal Exposure Monitoring" ("the Committee") of the Expert Division of Occupational Hygiene & Ergonomics, Japan Society for Occupational Health. Considering the background of the growing importance of personal exposure monitoring in risk assessment and the need to prepare for the introduction of monitoring using personal samplers from an administrative perspective in recent years, the Committee was organized in November 2012. The Committee has prepared this Guideline as a "practical guideline" for personal exposure monitoring, so as to offer proposals and recommendations to the members of the Japan Society for Occupational Health and to society in general. The scope of this Guideline covers all chemical substances and all related workplaces regarded as targets for general assessment and the management of risk. It thus is not to be considered to comment on legal regulations and methodology. The main text provides the basic methods and concepts of personal exposure monitoring, while 31 "Appendices" are provided later in this Guideline throughout the series; technical descriptions, statistical bases, and actual workplace examples are provided in these appendices, to assist better understanding. The personal exposure monitoring described as per this Guideline is equivalent to an "expert-centered basic method to reasonably proceed with the assessment and management of risk at workplaces." It is considered that practicing and expanding on this method will significantly contribute in reforming the overall framework of occupational hygiene management in Japan.

  2. Guidelines for personal exposure monitoring of chemicals: Part V.

    Science.gov (United States)

    Hashimoto, Haruo; Yamada, Kenichi; Hori, Hajime; Kumagai, Shinji; Murata, Masaru; Nagoya, Toshio; Nakahara, Hirohiko; Mochida, Nobuyuki

    2018-05-25

    This Document, "Guidelines for personal exposure monitoring of chemicals" ("this Guideline"), has been prepared by "The Committee for Personal Exposure Monitoring" ("the Committee") of the Expert Division of Occupational Hygiene & Ergonomics, Japan Society for Occupational Health. Considering the background of the growing importance of personal exposure monitoring in risk assessment and the need to prepare for the introduction of monitoring using personal samplers from an administrative perspective in recent years, the Committee was organized in November 2012. The Committee has prepared this Guideline as a "practical guideline" for personal exposure monitoring, so as to offer proposals and recommendations to the members of the Japan Society for Occupational Health and to society in general. The scope of this Guideline covers all chemical substances and all related workplaces regarded as targets for general assessment and the management of risk. It thus is not to be considered to comment on legal regulations and methodology. The main text provides the basic methods and concepts of personal exposure monitoring, while 31 "Appendices" are provided in this Guideline throughout the series; technical descriptions, statistical bases, and actual workplace examples are provided in these appendices, to assist better understanding. The personal exposure monitoring described as per this Guideline is equivalent to an "expert-centered basic method to reasonably proceed with the assessment and management of risk at workplaces." It is considered that practicing and expanding on this method will significantly contribute in reforming the overall framework of occupational hygiene management in Japan.

  3. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems that these metrics have are pointed out. We should be cautious to rely on those quantitative measures too much when we evaluate journals or researchers.
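Several of the indicators named in this abstract have definitions simple enough to compute directly. The sketch below (illustrative code, not taken from the article) computes a two-year impact factor as a citations-per-item ratio, plus the h-index and g-index from a list of per-paper citation counts:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year impact factor: citations received in year Y to items published
    in years Y-1 and Y-2, divided by the number of citable items in those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    """Largest g such that the g most-cited papers together have >= g^2 citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

print(impact_factor(200, 80))     # 2.5
print(h_index([10, 8, 5, 4, 3]))  # 4
print(g_index([10, 8, 5, 4, 3]))  # 5
```

As the review cautions, these are mechanical counts; the code makes plain how strongly each depends on the citation window and the set of "citable" items chosen.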

  4. An Innovative Metric to Evaluate Satellite Precipitation's Spatial Distribution

    Science.gov (United States)

    Liu, H.; Chu, W.; Gao, X.; Sorooshian, S.

    2011-12-01

    Thanks to its capability to cover the mountains, where ground measurement instruments cannot reach, satellites provide a good means of estimating precipitation over mountainous regions. In regions with complex terrain, accurate information on the high-resolution spatial distribution of precipitation is critical for many important issues, such as flood/landslide warning, reservoir operation, and water system planning. Therefore, to be useful in many practical applications, satellite precipitation products should characterize spatial distribution with high quality. However, most existing validation metrics, which are based on point/grid comparison using simple statistics, cannot effectively measure a satellite's skill in capturing the spatial patterns of precipitation fields. This deficiency results from the fact that point/grid-wise comparison does not take into account the spatial coherence of precipitation fields. Furthermore, another weakness of many metrics is that they can barely provide information on why satellite products perform well or poorly. Motivated by our recent findings of consistent spatial patterns in the precipitation field over the western U.S., we developed a new metric utilizing EOF analysis and Shannon entropy. The metric is derived in two steps: 1) capture the dominant spatial patterns of precipitation fields from both satellite products and reference data through EOF analysis, and 2) compute the similarities between the corresponding dominant patterns using a mutual information measurement defined with Shannon entropy. Instead of individual points/grids, the new metric treats the entire precipitation field simultaneously, naturally taking advantage of spatial dependence. Since the dominant spatial patterns are shaped by physical processes, the new metric can shed light on why a satellite product can or cannot capture the spatial patterns. For demonstration, an experiment was carried out to evaluate a satellite
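The two-step construction described in this abstract can be sketched with standard tools: step 1 extracts dominant patterns via an SVD-based EOF decomposition, and step 2 scores the similarity of corresponding patterns with a histogram-based (Shannon) mutual information. Everything below is synthetic and assumed (field sizes, noise levels, bin count), not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "precipitation" anomaly fields: time x space (120 months, 400 cells).
t, s = 120, 400
base = rng.normal(size=(t, s))
reference = base + 0.1 * rng.normal(size=(t, s))
satellite = base + 0.5 * rng.normal(size=(t, s))   # noisier product

def leading_eofs(field, k=3):
    """Step 1: dominant spatial patterns (EOFs) via SVD of the anomaly matrix."""
    anom = field - field.mean(axis=0)
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    return vt[:k]                      # k spatial patterns, one per row

def mutual_information(x, y, bins=12):
    """Step 2: plug-in Shannon mutual information (nats) from a 2-D histogram.
    Invariant to EOF sign flips, which are arbitrary."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

ref_eofs = leading_eofs(reference)
sat_eofs = leading_eofs(satellite)
scores = [mutual_information(r, s_) for r, s_ in zip(ref_eofs, sat_eofs)]
print(scores)
```

A product that reproduces the reference's dominant patterns yields scores approaching the self-information of each reference pattern; a product with distorted patterns scores lower, which is the diagnostic leverage the abstract claims over point-wise statistics.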

  5. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    International Nuclear Information System (INIS)

    Sathiaseelan, V; Thomadsen, B

    2014-01-01

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased, and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near-miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology enumerating quality control quantification for measuring the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify a QA metric to help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery process. In this symposium the usefulness of workflows and QA metrics to assure safe and high-quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures; Strategies and Metrics for Quality Management in the TG-100 Era. Learning Objectives: Provide an overview and the need for QA usability

  6. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    Energy Technology Data Exchange (ETDEWEB)

    Sathiaseelan, V [Northwestern Memorial Hospital, Chicago, IL (United States); Thomadsen, B [University of Wisconsin, Madison, WI (United States)

    2014-06-15

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased, and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near-miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology enumerating quality control quantification for measuring the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify a QA metric to help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery process. In this symposium the usefulness of workflows and QA metrics to assure safe and high-quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures; Strategies and Metrics for Quality Management in the TG-100 Era. Learning Objectives: Provide an overview and the need for QA usability

  7. Biomonitoring Equivalents for interpretation of urinary fluoride.

    Science.gov (United States)

    Aylward, L L; Hays, S M; Vezina, A; Deveau, M; St-Amand, A; Nong, A

    2015-06-01

    Exposure to fluoride is widespread due to its natural occurrence in the environment and addition to drinking water and dental products for the prevention of dental caries. The potential health risks of excess fluoride exposure include aesthetically unacceptable dental fluorosis (tooth mottling) and increased skeletal fragility. Numerous organizations have conducted risk assessments and set guidance values to represent maximum recommended exposure levels as well as recommended adequate intake levels based on potential public health benefits of fluoride exposure. Biomonitoring Equivalents (BEs) are estimates of the average biomarker concentrations corresponding to such exposure guidance values. The literature on daily urinary fluoride excretion rates as a function of daily fluoride exposure was reviewed and BE values corresponding to the available US and Canadian exposure guidance values were derived for fluoride in urine. The derived BE values range from 1.1 to 2.1 mg/L (1.2-2.5 μg/g creatinine). Concentrations of fluoride in single urinary spot samples from individuals, even under exposure conditions consistent with the exposure guidance values, may vary from the predicted average concentrations by several-fold due to within- and across-individual variation in urinary flow and creatinine excretion rates and due to the rapid elimination kinetics of fluoride. Thus, the BE values are most appropriately applied to screen population central tendency estimates for biomarker concentrations rather than interpretation of individual spot sample concentrations. Copyright © 2015. Published by Elsevier Inc.
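The mass-balance logic behind a urinary BE can be illustrated with round, hypothetical numbers. The intake, excretion fraction, and urine volume below are assumptions chosen for illustration, not the study-specific excretion data the authors actually used:

```python
# Illustrative mass balance: predicted average urinary concentration at steady
# state equals (daily intake x urinary excretion fraction) / daily urine volume.
daily_intake_mg = 4.0      # fluoride exposure guidance value, mg/day (assumed)
urinary_fraction = 0.5     # fraction of intake excreted in urine (assumed)
urine_volume_l = 1.6       # typical adult 24-h urine volume, L/day (assumed)

be_mg_per_l = daily_intake_mg * urinary_fraction / urine_volume_l
print(f"predicted average urinary fluoride: {be_mg_per_l:.2f} mg/L")  # 1.25 mg/L
```

That this toy answer happens to fall inside the 1.1-2.1 mg/L range reported above is a consequence of the assumed inputs; the paper's derivation rests on the reviewed excretion-rate literature, not on these round numbers.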

  8. Holographic Spherically Symmetric Metrics

    Science.gov (United States)

    Petri, Michael

    The holographic principle (HP) conjectures that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well-defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as a simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  9. Mercury use and exposure among Santeria practitioners: religious versus folk practice in Northern New Jersey, USA.

    Science.gov (United States)

    Alison Newby, C; Riley, Donna M; Leal-Almeraz, Tomás O

    2006-08-01

    To understand and characterize exposure to and use of elemental mercury among practitioners of Afro-Cuban religions in Hudson County, New Jersey, USA. Participant observation and open-ended interviews with 22 religious supply store employees and practitioners of Santeria, Espiritismo or Palo Mayombe probed respondents' knowledge and use of mercury, as well as their beliefs about its benefits and risks. Including a cultural and religious insider as part of the research team was crucial in working with this relatively closed community. Seventeen of the 21 practitioners reported using mercury or mercury compounds in various forms of practice and in services that they provide to clients. The contained nature of these uses suggests that accidental spills, as opposed to the practices themselves, emerge as the greatest exposure concern for this population. Mercury was never recommended to clients for individual use. This restriction appears to be rooted in the way the religion is practiced and in the way santeros receive compensation, not in a perception of mercury as hazardous. Most practitioners were aware that mercury can be hazardous, but were not familiar with the most significant exposure pathway, inhalation of mercury vapor. A climate of fear surrounds the use of mercury in this community, so that health concerns pale in comparison to fear of reprisal from authorities. Among those who sell or formerly sold mercury, several shared the erroneous belief that it was illegal to sell mercury in New Jersey. Despite widespread reported use, there were no reports of practices believed to result in the highest exposures. To reduce exposure in the community, interventions presenting general information on mercury hazards and instructions for cleaning up spills are recommended. To address insider-outsider dynamics and the climate of fear, educational materials should be accessible to the community and avoid any mention of religious practice.

  10. Expression of proliferative and inflammatory markers in a full-thickness human skin equivalent following exposure to the model sulfur mustard vesicant, 2-chloroethyl ethyl sulfide

    International Nuclear Information System (INIS)

    Black, Adrienne T.; Hayden, Patrick J.; Casillas, Robert P.; Heck, Diane E.; Gerecke, Donald R.; Sinko, Patrick J.; Laskin, Debra L.; Laskin, Jeffrey D.

    2010-01-01

    Sulfur mustard is a potent vesicant that induces inflammation, edema and blistering following dermal exposure. To assess molecular mechanisms mediating these responses, we analyzed the effects of the model sulfur mustard vesicant, 2-chloroethyl ethyl sulfide (CEES), on EpiDerm-FT(TM), a commercially available full-thickness human skin equivalent. CEES (100-1000 μM) caused a concentration-dependent increase in pyknotic nuclei and vacuolization in basal keratinocytes; at high concentrations (300-1000 μM), CEES also disrupted keratin filament architecture in the stratum corneum. This was associated with time-dependent increases in expression of proliferating cell nuclear antigen, a marker of cell proliferation, and of poly(ADP-ribose) polymerase (PARP) and phosphorylated histone H2AX, markers of DNA damage. Concentration- and time-dependent increases in mRNA and protein expression of eicosanoid biosynthetic enzymes, including COX-2, 5-lipoxygenase, microsomal PGE2 synthases, leukotriene (LT) A4 hydrolase and LTC4 synthase, were observed in CEES-treated skin equivalents, as well as of the antioxidant enzymes glutathione S-transferases A1-2 (GSTA1-2), GSTA3 and GSTA4. These data demonstrate that CEES induces rapid cellular damage, cytotoxicity and inflammation in full-thickness skin equivalents. These effects are similar to human responses to vesicants in vivo and suggest that the full-thickness skin equivalent is a useful in vitro model to characterize the biological effects of mustards and to develop potential therapeutics.

  11. Determination of the dose equivalent Hp(0.07) in hands of occupationally exposed personnel in the practice of positron emission tomography (PET/CT); Determinacion de la dosis equivalente Hp(0.07) en manos de trabajadores ocupacionalmente expuestos en la practica de Tomografia por Emision de Positrones (PET/CT)

    Energy Technology Data Exchange (ETDEWEB)

    Lea, D. [Servicio de Radiofisica Sanitaria, Unidad de Tecnologia Nuclear, Instituto Venezolano de Investigaciones Cientificas, Ministerio de Ciencia y Tecnologia, Km 11 Carretera Panamerican, Altos del Pipe, Caracas (Venezuela); Ruiz, N.; Esteves, L. [Centro Diagnostico Docente Las Mercedes, Calle Paris cruce con calle Caroni, Edif. CDD, Las Mercedes, Caracas (Venezuela)]. e-mail: dlea@ivic.ve

    2006-07-01

    The positron emission tomography (PET) technique was recently introduced in Venezuela, with the prospect of extending it nationwide. Although nuclear medicine has been practiced in the country since the early 1970s, there is no experience in determining occupational hand doses from external radiation exposure, which has raised concern among workers at the nuclear medicine centers where PET is practiced. In the absence of TLD dosimetry for hand-dose measurement in the country, dose equivalent measurements were made on workers at the national PET reference center using a diode-type hand detector. The hand dose was determined in terms of the dose equivalent Hp(0.07) for two work positions: the first covers transfer of the (18F)FDG receiving vial to the shield, quality control, and division into unit doses; the second corresponds to the person in charge of administering the (18F)FDG intravenously. The dose equivalent Hp(0.07) to the hands was measured in each of these work positions per daily production. The reported doses correspond to a total average produced activity of 20.4 GBq (550 mCi). The measured hand dose equivalents Hp(0.07) were 2.1 mSv ± 20% for the dispensing position and 0.4 mSv ± 10% for the injection position. In the short term, up to 4 productions per week are foreseen, which implies an annual hand dose equivalent Hp(0.07) of approximately 400 mSv, without taking into account abnormal situations such as spills of (18F)FDG in the workplace. This work is the starting point for the regulatory authority to establish dose restrictions for PET practice in Venezuela and for nuclear medicine centers to implement an optimization policy for this practice in conformity
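The ~400 mSv annual figure follows from the per-production dose by simple scaling. The check below assumes roughly 48 working weeks per year, a number the abstract does not state:

```python
# Annual-dose extrapolation for the dispensing (worst-case) position.
dose_per_production_msv = 2.1   # measured Hp(0.07) per daily production
productions_per_week = 4        # short-term projection from the abstract
working_weeks = 48              # assumption; weeks/year is not given

annual_msv = dose_per_production_msv * productions_per_week * working_weeks
print(f"annual hand dose: {annual_msv:.0f} mSv")  # 403 mSv
```

The result (about 403 mSv) matches the ~400 mSv quoted, which is why the dispensing position, not the injection position, drives the case for dose restrictions.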

  12. Politico-economic equivalence

    DEFF Research Database (Denmark)

    Gonzalez Eiras, Martin; Niepelt, Dirk

    2015-01-01

    Traditional "economic equivalence" results, like the Ricardian equivalence proposition, define equivalence classes over exogenous policies. We derive "politico-economic equivalence" conditions that apply in environments where policy is endogenous and chosen sequentially. A policy regime ... their use in the context of several applications, relating to social security reform, tax-smoothing policies and measures to correct externalities.

  13. Derived limits and their practical applications

    International Nuclear Information System (INIS)

    Minarik, F.

    1982-01-01

    Attention is devoted to the approach and methodological procedure in determining derived limits for surface contamination, with special regard to external exposure of the hands, inhalation of resuspended activity, contamination of the skin, and ingestion of radioactive material from the skin. In the practical part of the article, results are given of measurements of contamination by the radionuclides 134Cs, 133Cs and 60Co, the resulting annual equivalent dose rates, and their comparison with derived air contamination in the sense of ICRP Publ. No. 30. (author)

  14. Current Best Practices for Preventing Asbestos Exposure Among Brake and Clutch Repair Workers

    Science.gov (United States)

    Covers concerns about asbestos exposure for mechanics, how to tell whether brake or clutch components contain asbestos, work practices to follow, protection for home mechanics, and disposal of waste that contains asbestos.

  15. Head impact exposure measured in a single youth football team during practice drills.

    Science.gov (United States)

    Kelley, Mireille E; Kane, Joeline M; Espeland, Mark A; Miller, Logan E; Powers, Alexander K; Stitzel, Joel D; Urban, Jillian E

    2017-11-01

    OBJECTIVE This study evaluated the frequency, magnitude, and location of head impacts in practice drills within a youth football team to determine how head impact exposure varies among different types of drills. METHODS On-field head impact data were collected from athletes participating in a youth football team for a single season. Each athlete wore a helmet instrumented with a Head Impact Telemetry (HIT) System head acceleration measurement device during all preseason, regular season, and playoff practices. Video was recorded for all practices, and video analysis was performed to verify head impacts and assign each head impact to a specific drill. Eleven drills were identified: dummy/sled tackling, install, special teams, Oklahoma, one-on-one, open-field tackling, passing, position skill work, multiplayer tackle, scrimmage, and tackling drill stations. Generalized linear models were fitted to log-transformed data, and Wald tests were used to assess differences in head accelerations and impact rates. RESULTS A total of 2125 impacts were measured during 30 contact practices in 9 athletes (mean age 11.1 ± 0.6 years, mean mass 44.9 ± 4.1 kg). Open-field tackling had the highest median and 95th percentile linear accelerations (24.7 g and 97.8 g, respectively) and resulted in significantly higher mean head accelerations than several other drills. The multiplayer tackle drill resulted in the highest head impact frequency, with an average of 0.59 impacts per minute per athlete, but the lowest 95th percentile linear accelerations of all drills. The front of the head was the most common impact location for all drills except dummy/sled tackling. CONCLUSIONS Head impact exposure varies significantly in youth football practice drills, with several drills exposing athletes to high-magnitude and/or high-frequency head impacts. 
These data suggest that further study of practice drills is an important step in developing evidence-based recommendations for modifying or eliminating
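As a quick sanity check on the reported totals, the average per-athlete exposure implied by the numbers above works out as follows (a derived mean, not a figure reported by the authors; individual athletes' counts varied):

```python
# Back-of-envelope rate from the abstract's totals: 2125 verified impacts
# over 30 contact practices among 9 instrumented athletes.
total_impacts = 2125
practices = 30
athletes = 9

mean_impacts = total_impacts / (practices * athletes)
print(f"{mean_impacts:.1f} impacts per athlete per contact practice")  # 7.9
```

Roughly eight head impacts per athlete per practice, concentrated unevenly across drills, is the exposure profile that motivates the authors' call for drill-level recommendations.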

  16. The Relationship between the Level and Modality of HRM Metrics, Quality of HRM Practice and Organizational Performance

    OpenAIRE

    Nina Pološki Vokić

    2011-01-01

    The paper explores the relationship between the way organizations measure HRM and overall quality of HRM activities, as well as the relationship between HRM metrics used and financial performance of an organization. In the theoretical part of the paper modalities of HRM metrics are grouped into five groups (evaluating HRM using accounting principles, evaluating HRM using management techniques, evaluating individual HRM activities, aggregate evaluation of HRM, and evaluating HRM de...

  17. Construct Equivalence and Latent Means Analysis of Health Behaviors Between Male and Female Middle School Students

    OpenAIRE

    Park, Jeong Mo; Han, Ae Kyung; Cho, Yoon Hee

    2011-01-01

    Purpose: The purpose of this study was to investigate the construct equivalence of the five general factors (subjective health, eating habits, physical activities, sedentary lifestyle, and sleeping behaviors) and to compare the latent means between male and female middle school students in Incheon, Korea. Methods: The 2008 Korean Youth Risk Behavior Survey data was used for analysis. Multigroup confirmatory factor analysis was performed to test whether the scale has configural, metric, and...

  18. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces
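For readers unfamiliar with the notion, the central estimate can be stated compactly (a standard textbook formulation, not quoted from this survey): a set-valued map $F: X \rightrightarrows Y$ between metric spaces is metrically regular at $\bar{x}$ for $\bar{y} \in F(\bar{x})$ with modulus $\kappa > 0$ if

```latex
d\bigl(x, F^{-1}(y)\bigr) \;\le\; \kappa \, d\bigl(y, F(x)\bigr)
\quad \text{for all } (x, y) \text{ near } (\bar{x}, \bar{y}).
```

In the single-valued smooth case this recovers the Lyusternik-Graves setting, where surjectivity of the derivative yields such an estimate; note that the inequality involves only distances, with no linear structure, which is the "metric origin" the survey emphasizes.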

  19. Occupational radiation exposure trends in the nuclear industry of NEA/IAEA Member States

    International Nuclear Information System (INIS)

    Ilari, O.; Horan, J.R.; Franzen, F.L.

    1980-01-01

    After various introductory statements on current occupational radiation exposure trends in nuclear facilities, the authors briefly discuss the problems involved in the application of the ICRP principle of optimization of radiological protection to the design and, in particular, the operation of nuclear plants, with the aim of comparing present exposure trends. To assemble an adequate data base for supporting the technical studies required to optimize radiological protection, the OECD Nuclear Energy Agency and the International Atomic Energy Agency have launched a survey aimed at collecting information on the levels and trends of occupational radiation exposure in the nuclear industry. The features of this study, based on the answers of NEA/IAEA Member States to a questionnaire, are described. The first results of the survey, regarding the situation and time trends of the average individual dose equivalents and collective dose equivalents for different plant types and for several countries, are also given. A preliminary analysis of the data collected allows certain considerations to be made relating to the influence of size, age and plant type, as well as of different national practices in plant operation and maintenance. (author)

  20. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  1. Accidental over-exposure from dental X-ray equipment

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, B G [National Radiological Protection Board, Harwell (UK)

    1976-07-01

    A description is given of an unusual dental X-ray procedure which resulted in accidental over-exposure both to the dentist and to several of his patients when a short-circuit was present in newly-installed equipment. The short-circuit by-passed the exposure control and energized the tube for certain orientations of the X-ray tube. The dentist left the patients, who wore protective aprons, to initiate the exposure themselves, using the control button. Although the warning lights were on, the dentist was not present in the room during the exposure, and the over-exposures were only detected when the developed X-ray films were found to be completely blackened. A reconstruction of the procedure enabled estimates to be made of the dose equivalents to the dentist's body and to the skin of the head, the eyes and the gonads of the patients. The dentist had overlooked several of the basic principles recommended in the Code of Practice for the Protection of Persons against Ionizing Radiations from Medical and Dental Use (1972). It is pointed out that incidents involving failure of dental equipment (usually the timer mechanism) are not infrequent.

  2. DLA Energy Biofuel Feedstock Metrics Study

    Science.gov (United States)

    2012-12-11

    moderately/highly invasive; Metric 2: Genetically modified organism (GMO) hazard, Yes/No and Hazard Category; Metric 3: Species hybridization...4 - biofuel distribution; Stage #5 - biofuel use. Metric 1: State invasiveness ranking: Yes, Minimal, Minimal, No, No; Metric 2: GMO hazard: Yes...may utilize GMO microbial or microalgae species across the applicable biofuel life cycles (stages 1-3). The following consequence Metrics 4-6 then

  3. The alarming problems of confounding equivalence using logistic regression models in the perspective of causal diagrams.

    Science.gov (United States)

    Yu, Yuanyuan; Li, Hongkai; Sun, Xiaoru; Su, Ping; Wang, Tingting; Liu, Yi; Yuan, Zhongshang; Liu, Yanxun; Xue, Fuzhong

    2017-12-28

    Confounders can produce spurious associations between exposure and outcome in observational studies. For the majority of epidemiologists, adjusting for confounders with a logistic regression model is the habitual method, though it has problems in accuracy and precision. It is, therefore, important to highlight the problems of logistic regression and to search for an alternative method. Four causal diagram models were defined to summarize confounding equivalence. Both theoretical proofs and simulation studies were performed to verify whether conditioning on different confounding equivalence sets had the same bias-reducing potential and then to select the optimum adjusting strategy, in which the logistic regression model and the inverse probability weighting based marginal structural model (IPW-based-MSM) were compared. The "do-calculus" was used to calculate the true causal effect of exposure on outcome; the bias and standard error were then used to evaluate the performances of different strategies. Adjusting for different sets of confounding equivalence, as judged by identical Markov boundaries, produced different bias-reducing potential in the logistic regression model. For the sets satisfying G-admissibility, adjusting for the set including all the confounders reduced the bias to the same level as adjusting for the set containing the parent nodes of the outcome, while the bias after adjusting for the parent nodes of exposure was not equivalent to them. In addition, all causal effect estimations through logistic regression were biased, although the estimation after adjusting for the parent nodes of exposure was nearest to the true causal effect. However, conditioning on different confounding equivalence sets had the same bias-reducing potential under IPW-based-MSM. Compared with logistic regression, the IPW-based-MSM could obtain unbiased causal effect estimation when the adjusted confounders satisfied G-admissibility and the optimal strategy was to adjust for the parent nodes of outcome, which
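The IPW-based MSM idea can be sketched in a few lines: fit a propensity model for the exposure given confounders, weight each subject by the inverse probability of the exposure actually received, and compare weighted outcome means. The simulation below uses my own assumed data-generating process and effect sizes, not the authors' four diagram models:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Simulated observational data: confounder C raises both exposure probability
# and the outcome, so the naive comparison is biased.
C = rng.normal(size=n)
X = rng.random(n) < 1 / (1 + np.exp(-C))          # exposure, P(X=1|C) logistic
Y = 0.1 * X + 0.3 * C + rng.normal(scale=0.5, size=n)  # true effect of X is 0.1

def fit_logistic(features, target, iters=25):
    """Propensity model P(X=1|C) by Newton's method (intercept + one feature)."""
    Z = np.column_stack([np.ones(len(features)), features])
    beta = np.zeros(Z.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Z @ beta))
        W = p * (1 - p)
        beta += np.linalg.solve(Z.T @ (Z * W[:, None]), Z.T @ (target - p))
    return beta

beta = fit_logistic(C, X.astype(float))
ps = 1 / (1 + np.exp(-(beta[0] + beta[1] * C)))    # estimated propensity scores

# Inverse-probability weights build a pseudo-population where X is independent of C.
w = np.where(X, 1 / ps, 1 / (1 - ps))

ipw_effect = np.average(Y[X], weights=w[X]) - np.average(Y[~X], weights=w[~X])
naive_effect = Y[X].mean() - Y[~X].mean()          # confounded comparison
print(ipw_effect, naive_effect)
```

With this setup the naive difference is inflated well above 0.1 by the confounder, while the weighted contrast recovers the true effect, mirroring the abstract's finding that the IPW-based MSM yields unbiased estimates where plain regression adjustment can remain biased.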

  4. The alarming problems of confounding equivalence using logistic regression models in the perspective of causal diagrams

    Directory of Open Access Journals (Sweden)

    Yuanyuan Yu

    2017-12-01

    Full Text Available Abstract Background Confounders can produce spurious associations between exposure and outcome in observational studies. For the majority of epidemiologists, adjusting for confounders with a logistic regression model is the habitual method, though it has problems in accuracy and precision. It is, therefore, important to highlight the problems of logistic regression and to search for an alternative method. Methods Four causal diagram models were defined to summarize confounding equivalence. Both theoretical proofs and simulation studies were performed to verify whether conditioning on different confounding equivalence sets had the same bias-reducing potential and then to select the optimum adjusting strategy, in which the logistic regression model and the inverse probability weighting based marginal structural model (IPW-based-MSM) were compared. The "do-calculus" was used to calculate the true causal effect of exposure on outcome; the bias and standard error were then used to evaluate the performances of different strategies. Results Adjusting for different sets of confounding equivalence, as judged by identical Markov boundaries, produced different bias-reducing potential in the logistic regression model. For the sets satisfying G-admissibility, adjusting for the set including all the confounders reduced the bias to the same level as adjusting for the set containing the parent nodes of the outcome, while the bias after adjusting for the parent nodes of exposure was not equivalent to them. In addition, all causal effect estimations through logistic regression were biased, although the estimation after adjusting for the parent nodes of exposure was nearest to the true causal effect. However, conditioning on different confounding equivalence sets had the same bias-reducing potential under IPW-based-MSM. Compared with logistic regression, the IPW-based-MSM could obtain unbiased causal effect estimation when the adjusted confounders satisfied G-admissibility and the optimal

  5. Using an Individual Procedure Score Before and After the Advanced Surgical Skills Exposure for Trauma Course Training to Benchmark a Hemorrhage-Control Performance Metric.

    Science.gov (United States)

    Mackenzie, Colin F; Garofalo, Evan; Shackelford, Stacy; Shalin, Valerie; Pugh, Kristy; Chen, Hegang; Puche, Adam; Pasley, Jason; Sarani, Babak; Henry, Sharon; Bowyer, Mark

    2015-01-01

    Test with an individual procedure score (IPS) to assess whether an unpreserved cadaver trauma training course, including upper and lower limb vascular exposure, improves correct identification of surgical landmarks and underlying anatomy, and shortens time to vascular control. Prospective study of performance of 3 vascular exposure and control procedures (axillary, brachial, and femoral arteries) using IPS metrics by 2 colocated and trained evaluators before and after training with the Advanced Surgical Skills Exposure for Trauma (ASSET) course. IPS, including identification of anatomical landmarks, incisions, underlying structures, and time to completion of each procedure, was compared before and after training using repeated-measurement models. Audio-video instrumented cadaver laboratory at University of Maryland School of Medicine. A total of 41 second- to sixth-year surgical residents from surgical programs throughout the Mid-Atlantic States who had not previously taken the ASSET course were enrolled; 40 completed the pre- and post-ASSET performance evaluations. After ASSET training, all components of IPS increased and time shortened for each of the 3 artery exposures. Procedure steps performed correctly increased 57%, anatomical knowledge increased 43%, and time from skin incision to passage of a vessel loop twice around the correct vessel decreased by a mean of 2.5 minutes. An overall vascular trauma readiness index, a comprehensive IPS score for the 3 procedures, increased 28% with ASSET training. Improved knowledge of surface landmarks and underlying anatomy is associated with increased IPS, faster procedures, more accurate incision placement, and successful vascular control. Structural recognition during specific procedural steps and anatomical knowledge were key points learned during the ASSET course. 
Such training may accelerate acquisition of specific trauma surgery skills to compensate for shortened training hours, infrequent exposure to major vascular injuries, or when just

  6. Dietary Phthalate Exposure in Pregnant Women and the Impact of Consumer Practices

    Directory of Open Access Journals (Sweden)

    Samantha E. Serrano

    2014-06-01

    Full Text Available Phthalates are ubiquitous endocrine-disrupting chemicals that are contaminants in food and contribute to significant dietary exposures. We examined associations between reported consumption of specific foods and beverages and first trimester urinary phthalate metabolite concentrations in 656 pregnant women within a multicenter cohort study, The Infant Development and Environment Study (TIDES), using multivariate regression analysis. We also examined whether reported use of ecofriendly and chemical-free products was associated with lower phthalate biomarker levels in comparison to not following such practices. Consumption of one additional serving of dairy per week was associated with decreases of 1% in the sum of di-2-ethylhexyl phthalate (DEHP) metabolite levels (95% CI: −2.0, −0.2). Further, participants who reported sometimes eating homegrown food had monoisobutyl phthalate (MiBP) levels that were 16.6% lower (95% CI: −29.5, −1.3) in comparison to participants in the rarely/never category. In contrast to rarely/never eating frozen fruits and vegetables, participants who reported sometimes following this practice had monobenzyl phthalate (MBzP) levels that were 21% higher (95% CI: 3.3, 41.7) than rarely/never respondents. Future study on prenatal dietary phthalate exposure and the role of consumer product choices in reducing such exposure is needed.

  7. Dietary Phthalate Exposure in Pregnant Women and the Impact of Consumer Practices

    Science.gov (United States)

    Serrano, Samantha E.; Karr, Catherine J.; Seixas, Noah S.; Nguyen, Ruby H. N.; Barrett, Emily S.; Janssen, Sarah; Redmon, Bruce; Swan, Shanna H.; Sathyanarayana, Sheela

    2014-01-01

    Phthalates are ubiquitous endocrine-disrupting chemicals that are contaminants in food and contribute to significant dietary exposures. We examined associations between reported consumption of specific foods and beverages and first trimester urinary phthalate metabolite concentrations in 656 pregnant women within a multicenter cohort study, The Infant Development and Environment Study (TIDES), using multivariate regression analysis. We also examined whether reported use of ecofriendly and chemical-free products was associated with lower phthalate biomarker levels in comparison to not following such practices. Consumption of one additional serving of dairy per week was associated with decreases of 1% in the sum of di-2-ethylhexyl phthalate (DEHP) metabolite levels (95% CI: −2.0, −0.2). Further, participants who reported sometimes eating homegrown food had monoisobutyl phthalate (MiBP) levels that were 16.6% lower (95% CI: −29.5, −1.3) in comparison to participants in the rarely/never category. In contrast to rarely/never eating frozen fruits and vegetables, participants who reported sometimes following this practice had monobenzyl phthalate (MBzP) levels that were 21% higher (95% CI: 3.3, 41.7) than rarely/never respondents. Future study on prenatal dietary phthalate exposure and the role of consumer product choices in reducing such exposure is needed. PMID:24927036

  8. Radiation beams characterization and implementation for study of lead equivalent individual protection device used in radiodiagnostic practices

    International Nuclear Information System (INIS)

    Pereira, Leslie Silva

    2004-01-01

    Protective shielding devices (IPC) must be used by occupationally exposed professionals, patients and volunteers in order to optimize the doses they receive due to radiological practices. International and national standards establish the methodology to be adopted for determining IPC attenuation. In this work, IPC were submitted to X-ray beams with known characteristics, standardized for the determination of their attenuation-equivalent thickness by comparison to an experimental lead attenuation curve. This comparison technique allowed a reliable estimate of the IPC attenuation-equivalent thickness in mm of lead. Thus, it was possible to verify the conformity between the attenuation-equivalent thickness determined experimentally and the thickness value indicated by the manufacturer. To carry out this work, it was necessary to implement the experimental setups stated in the specific standards, to study the original features of the X-ray beams, and to determine combined additional filters, so that the X-ray equipment used operates in compliance with standard IEC 61331-1. The radiation quality selected is characterized by a 100 kV voltage and a 0.25 mm copper total filtration. The implementation of this radiation quality was verified through its first and second HVL (half-value layer). Thus, a methodology in accordance with the international standards has been implemented in the laboratory. The results of the present work provide suitable and useful information about radiation beam features related to techniques for determining attenuation properties. Once the procedures for conformity evaluation of protection devices are implemented, it will be possible to carry out specific quality control tests, which will be helpful to manufacturers, customers, as well as authorities in the radiological protection and health areas. (author)
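
    The comparison-to-lead technique can be sketched numerically: interpolate the measured transmission of the device under test on a lead calibration curve. All numbers below are hypothetical illustrations, not measured data:

```python
import numpy as np

# Hypothetical calibration: transmission measured behind known lead thicknesses
# for the standardized beam quality (e.g. 100 kV with added Cu filtration).
pb_mm = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
transmission = np.array([1.0, 0.48, 0.23, 0.11, 0.053])

# Hypothetical measured transmission behind the protective apron under test:
t_apron = 0.30

# Interpolate on log-transmission, since attenuation is roughly exponential
# in thickness (np.interp requires increasing x, hence the reversal).
eq_thickness_mm = np.interp(np.log(t_apron),
                            np.log(transmission[::-1]), pb_mm[::-1])
# eq_thickness_mm is the apron's lead-equivalent thickness, to be compared
# against the value stamped by the manufacturer (e.g. "0.5 mm Pb").
```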

  9. Analysis of Skeletal Muscle Metrics as Predictors of Functional Task Performance

    Science.gov (United States)

    Ryder, Jeffrey W.; Buxton, Roxanne E.; Redd, Elizabeth; Scott-Pandorf, Melissa; Hackney, Kyle J.; Fiedler, James; Ploutz-Snyder, Robert J.; Bloomberg, Jacob J.; Ploutz-Snyder, Lori L.

    2010-01-01

    PURPOSE: The ability to predict task performance using physiological performance metrics is vital to ensure that astronauts can execute their jobs safely and effectively. This investigation used a weighted suit to evaluate task performance at various ratios of strength, power, and endurance to body weight. METHODS: Twenty subjects completed muscle performance tests and functional tasks representative of those that would be required of astronauts during planetary exploration (see table for specific tests/tasks). Subjects performed functional tasks while wearing a weighted suit with additional loads ranging from 0-120% of initial body weight. Performance metrics were time to completion for all tasks except hatch opening, which consisted of total work. Task performance metrics were plotted against muscle metrics normalized to "body weight" (subject weight + external load; BW) for each trial. Fractional polynomial regression was used to model the relationship between muscle and task performance. CONCLUSION: LPMIF/BW is the best predictor of performance for predominantly lower-body tasks that are ambulatory and of short duration. LPMIF/BW is a very practical predictor of occupational task performance as it is quick and relatively safe to perform. Accordingly, bench press work best predicts hatch-opening work performance.

  10. Standard Practice for Exposure of Solar Collector Cover Materials to Natural Weathering Under Conditions Simulating Stagnation Mode

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1992-01-01

    1.1 This practice covers a procedure for the exposure of solar collector cover materials to the natural weather environment at elevated temperatures that approximate stagnation conditions in solar collectors having a combined back and edge loss coefficient of less than 1.5 W/(m2 · °C). 1.2 This practice is suitable for exposure of both glass and plastic solar collector cover materials. Provisions are made for exposure of single and double cover assemblies to accommodate the need for exposure of both inner and outer solar collector cover materials. 1.3 This practice does not apply to cover materials for evacuated collectors, photovoltaic cells, flat-plate collectors having a combined back and edge loss coefficient greater than 1.5 W/(m2 · °C), or flat-plate collectors whose design incorporates means for limiting temperatures during stagnation. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard t...

  11. Predicting Adolescent Dating Violence Perpetration: Role of Exposure to Intimate Partner Violence and Parenting Practices.

    Science.gov (United States)

    Latzman, Natasha E; Vivolo-Kantor, Alana M; Holditch Niolon, Phyllis; Ghazarian, Sharon R

    2015-09-01

    Exposure to adult intimate partner violence (IPV) places youth at risk for a range of outcomes, including perpetration of adolescent dating violence (ADV). However, there is variability in the effect of IPV exposure, as many youth who are exposed to IPV do not go on to exhibit problems. Thus, research is needed to examine contextual factors, such as parenting practices, to more fully explain heterogeneity in outcomes and better predict ADV perpetration. The current research draws from a multisite study to investigate the predictive power of IPV exposure and parenting practices on subsequent ADV perpetration. Participants included 417 adolescents (48.7% female) drawn from middle schools in high-risk, urban communities. IPV exposure, two types of parenting practices (positive parenting/involvement and parental knowledge of their child's dating), and five types of ADV perpetration (threatening behaviors, verbal/emotional abuse, relational abuse, physical abuse, and sexual abuse) were assessed at baseline (2012) and approximately 5 months later (2013) via adolescent report. Analyses (conducted in 2015) used a structural equation modeling approach. Structural models indicated that IPV exposure was positively related only to relational abuse at follow-up. Further, adolescents who reported parents having less knowledge of dating partners were more likely to report perpetrating two types of ADV (physical and verbal/emotional abuse) at follow-up. Analyses did not demonstrate any significant interaction effects. Results fill a critical gap in understanding of important targets to prevent ADV in middle school and highlight the important role that parents may play in ADV prevention. Published by Elsevier Inc.

  12. Measuring population transmission risk for HIV: an alternative metric of exposure risk in men who have sex with men (MSM) in the US.

    Directory of Open Access Journals (Sweden)

    Colleen F Kelley

    Full Text Available Various metrics for HIV burden and treatment success [e.g. HIV prevalence, community viral load (CVL), population viral load (PVL), percent of HIV-positive persons with undetectable viral load] have important public health limitations for understanding disparities. Using data from an ongoing HIV incidence cohort of black and white men who have sex with men (MSM), we propose a new metric to measure the prevalence of those at risk of transmitting HIV and illustrate its value. MSM with plasma VL>400 copies/mL were defined as having 'transmission risk'. We calculated HIV prevalence, CVL, PVL, percent of HIV-positive with undetectable viral loads, and prevalence of plasma VL>400 copies/ml (%VL400) for black and white MSM. We used Monte Carlo simulation incorporating data on sexual mixing by race to estimate exposure of black and white HIV-negative MSM to a partner with transmission risk via unprotected anal intercourse (UAI). Of 709 MSM recruited, 42% (168/399) black and 14% (44/310) white MSM tested HIV-positive (p<.0001). No significant differences were seen in CVL, PVL, or percent of HIV positive with undetectable viral loads. The %VL400 was 25% (98/393) for black vs. 8% (25/310) for white MSM (p<.0001). Black MSM with 2 UAI partners were estimated to have a 40% probability (95% CI: 35%, 45%) of having ≥1 UAI partner with transmission risk vs. 20% for white MSM (CI: 15%, 24%). Despite similarities in other metrics, black MSM in our cohort are three times as likely as white MSM to have HIV transmission risk. With comparable risk behaviors, HIV-negative black MSM have a substantially higher likelihood of encountering a UAI partner at risk of transmitting HIV. Our results support increasing HIV testing, linkage to care, and antiretroviral treatment of HIV-positive MSM to reduce prevalence of those with transmission risk, particularly for black MSM.
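
    The Monte Carlo exposure estimate in this abstract can be sketched as follows. The %VL400 prevalences (25% black, 8% white) come from the abstract, but the racial mixing proportions below are hypothetical stand-ins for the cohort's sexual-mixing data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 200_000

# Transmission-risk prevalence (plasma VL > 400 copies/mL) by partner race,
# taken from the abstract.
P_RISK = {"black": 0.25, "white": 0.08}

def prob_any_risk_partner(own_mix_black, n_partners):
    """Monte Carlo: P(>=1 of n UAI partners has transmission risk)."""
    partner_black = rng.random((n_sim, n_partners)) < own_mix_black
    p_risk = np.where(partner_black, P_RISK["black"], P_RISK["white"])
    has_risk = rng.random((n_sim, n_partners)) < p_risk
    return has_risk.any(axis=1).mean()

# Hypothetical assortative mixing: 80% of partners are same-race.
p_black = prob_any_risk_partner(own_mix_black=0.8, n_partners=2)  # ~0.39
p_white = prob_any_risk_partner(own_mix_black=0.2, n_partners=2)  # ~0.21
```

With these assumed mixing fractions the simulation reproduces the roughly 40% vs. 20% contrast reported in the abstract, showing how equal behavior plus unequal network prevalence yields unequal exposure.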

  13. Development of a standardized transfusion ratio as a metric for evaluating dialysis facility anemia management practices.

    Science.gov (United States)

    Liu, Jiannong; Li, Suying; Gilbertson, David T; Monda, Keri L; Bradbury, Brian D; Collins, Allan J

    2014-10-01

    Because transfusion avoidance has been the cornerstone of anemia treatment for patients with kidney disease, direct measurement of red blood cell transfusion use to assess dialysis facility anemia management performance is reasonable. We aimed to explore methods for estimating facility-level standardized transfusion ratios (STfRs) to assess provider anemia treatment practices. Retrospective cohort study. Point prevalent US hemodialysis patients on January 1, 2009, with Medicare as primary payer and dialysis duration of 90 days or longer were included (n = 223,901). All dialysis facilities with eligible patients were included (n = 5,345). Dialysis facility assignment. Receiving a red blood cell transfusion in the inpatient or outpatient setting. We evaluated 3 approaches for estimating STfR: ratio of observed to expected numbers of transfusions (STfR(obs)), a Bayesian approach (STfR(Bayes)), and a modified version of the Bayesian approach (STfR(modBayes)). The overall national transfusion rate in 2009 was 23.2 per 100 patient-years. Our model for predicting the expected number of transfusions performed well. For large facilities, all 3 STfRs worked well. However, for small facilities, while the STfR(modBayes) worked well, STfR(obs) values demonstrated instability and the STfR(Bayes) may produce more bias. Administration of transfusions to dialysis patients reflects medical practice both within and outside the dialysis unit. Some transfusions may be deemed unavoidable and transfusion practices are subject to considerable regional variation. Development of an STfR metric is feasible and reasonable for assessing anemia treatment at dialysis facilities. The STfR(obs) is simple to calculate and works well for larger dialysis facilities. The STfR(modBayes) is more analytically complex, but facilitates comparisons across all dialysis facilities, including small facilities. Copyright © 2014 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
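
    The observed-to-expected ratio at the heart of STfR(obs) can be sketched directly; the per-patient counts and model-predicted expectations below are hypothetical:

```python
import numpy as np

# Hypothetical facility-level data: each patient has an observed transfusion
# count and an expected count predicted by a national rate model.
observed = np.array([0, 1, 0, 2, 0, 0, 1, 0])          # transfusions per patient
expected = np.array([0.2, 0.5, 0.1, 0.9, 0.3, 0.2, 0.6, 0.2])

# STfR_obs: total observed over total expected transfusions for the facility.
stfr_obs = observed.sum() / expected.sum()   # > 1 means more transfusions than expected
```

For small facilities, the abstract's Bayesian variants shrink this ratio toward 1 to stabilize it; the plain ratio above is the large-facility workhorse.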

  14. Disease and Health Inequalities Attributable to Air Pollutant Exposure in Detroit, Michigan

    Directory of Open Access Journals (Sweden)

    Sheena E. Martenies

    2017-10-01

    Full Text Available The environmental burden of disease is the mortality and morbidity attributable to exposures to air pollution and other stressors. The inequality metrics used in cumulative impact and environmental justice studies can be incorporated into environmental burden studies to better understand the health disparities of ambient air pollutant exposures. This study examines the diseases and health disparities attributable to air pollutants for the Detroit urban area. We apportion this burden to various groups of emission sources and pollutants, and show how the burden is distributed among demographic and socioeconomic subgroups. The analysis uses spatially-resolved estimates of exposures, baseline health rates, age-stratified populations, and demographic characteristics that serve as proxies for increased vulnerability, e.g., race/ethnicity and income. Based on current levels, exposures to fine particulate matter (PM2.5), ozone (O3), sulfur dioxide (SO2), and nitrogen dioxide (NO2) are responsible for more than 10,000 disability-adjusted life years (DALYs) per year, causing an annual monetized health impact of $6.5 billion. This burden is mainly driven by PM2.5 and O3 exposures, which cause 660 premature deaths each year among the 945,000 individuals in the study area. NO2 exposures, largely from traffic, are important for respiratory outcomes among older adults and children with asthma, e.g., 46% of air-pollution related asthma hospitalizations are due to NO2 exposures. Based on quantitative inequality metrics, the greatest inequality of health burdens results from industrial and traffic emissions. These metrics also show disproportionate burdens among Hispanic/Latino populations due to industrial emissions, and among low income populations due to traffic emissions. Attributable health burdens are a function of exposures, susceptibility and vulnerability (e.g., baseline incidence rates), and population density. Because of these dependencies, inequality

  15. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  16. Metoprolol Dose Equivalence in Adult Men and Women Based on Gender Differences: Pharmacokinetic Modeling and Simulations

    Directory of Open Access Journals (Sweden)

    Andy R. Eugene

    2016-11-01

    Full Text Available Recent meta-analyses and publications over the past 15 years have provided evidence showing that there are considerable gender differences in the pharmacokinetics of metoprolol. Throughout this time, there have not been any research articles proposing a gender-stratified dose adjustment resulting in an equivalent total drug exposure. Metoprolol pharmacokinetic data were obtained from a previous publication. Data were modeled by nonlinear mixed-effects modeling with the MONOLIX software package to quantify metoprolol concentration–time data. Gender-stratified dosing simulations were conducted to identify equivalent total drug exposure based on a 100 mg dose in adults. Based on the pharmacokinetic modeling and simulations, a 50 mg dose in adult women provides approximately the same metoprolol drug exposure as a 100 mg dose in adult men.
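
    The dose-halving conclusion follows from AUC = F·D/CL in a one-compartment oral model: if clearance is roughly halved, halving the dose restores the same total exposure. The sketch below is not the article's MONOLIX model; all parameter values are hypothetical, chosen only to mimic a twofold clearance difference:

```python
import numpy as np

def conc(t, dose_mg, ka, CL, V, F=0.5):
    """One-compartment oral model:
    C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)), ke = CL/V."""
    ke = CL / V
    return F * dose_mg * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def auc_trap(t, c):
    """Trapezoidal AUC over the sampled grid."""
    return float(np.sum((c[1:] + c[:-1]) * np.diff(t) / 2.0))

t = np.linspace(0.0, 96.0, 9601)  # hours

# Hypothetical parameters giving men roughly twice the clearance of women.
auc_men_100 = auc_trap(t, conc(t, 100, ka=1.0, CL=60.0, V=300.0))
auc_women_50 = auc_trap(t, conc(t, 50, ka=1.0, CL=30.0, V=300.0))
# Since AUC = F*D/CL, halving the dose offsets the halved clearance:
# the two total exposures come out approximately equal.
```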

  17. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin

  18. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  19. A theoretical study for the real-time assessment of external gamma exposure using equivalent-volume numerical integration

    International Nuclear Information System (INIS)

    Han, Moon Hee

    1995-02-01

    An approximate method for estimating external gamma dose due to an arbitrary distribution of radioactive material has been developed. For the assessment of external gamma dose, the space over which radioactive material is distributed has been assumed to be composed of hexagonal cells. The evaluation of the three-dimensional integration over this space is an extremely time-consuming task. Hence, a different approach has been used in this study: an equivalent-volume spherical approach, in which a regular hexahedron is modeled as an equivalent-volume sphere to simplify the integration. To justify the current approach, two case studies have been performed: a comparison with a point-source approximation and a comparison of external dose rate with Monte Carlo integration. These comparisons show that the current approach gives reasonable results in a physical sense. Computing times of the developed method and of Monte Carlo integration on a VAX system have been compared as a function of the number of hexagonal cells. This comparison shows that the CPU times of both methods are comparable for small numbers of cells, but for large numbers Monte Carlo integration needs much more computing time. The proposed method is shown to have an accuracy equivalent to the Monte Carlo method with the advantage of a much shorter calculation time. The method developed here was then used to evaluate early off-site consequences of a nuclear accident. An accident consequence assessment model has been integrated using a Gaussian puff model, which is used to obtain the distribution of radioactive material in the air and on the ground. For this work, the real meteorological data measured at the Kori site over 10 years (1976 - 1985) have been statistically analyzed to obtain site-specific conditions. The short-term external gamma exposures have been assessed for several site-specific meteorological conditions. The results show that the extent and the pattern of short-term external
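
    The equivalent-volume simplification can be illustrated by comparing a full volume integral of the geometric kernel over a source cell with the collapsed point/equivalent-sphere treatment. The cell size and detector distance below are hypothetical, and a cube stands in for the hexahedral cell:

```python
import numpy as np

rng = np.random.default_rng(2)

# A cubic source cell of side 1 m, detector 5 m from the cell centre (hypothetical).
side, d = 1.0, 5.0

# Equivalent-volume sphere radius for the cell: r = (3V / (4*pi))**(1/3).
volume = side ** 3
r_eq = (3.0 * volume / (4.0 * np.pi)) ** (1.0 / 3.0)

# Monte Carlo volume integral of the geometric kernel 1/(4*pi*r^2) over the cell...
pts = rng.uniform(-side / 2, side / 2, size=(200_000, 3))
pts[:, 0] += d   # shift the cell centre to distance d along x
kernel_mc = volume * np.mean(1.0 / (4.0 * np.pi * np.sum(pts ** 2, axis=1)))

# ...versus collapsing the cell to a point source at its centre.
kernel_point = volume / (4.0 * np.pi * d ** 2)
# At this distance the two agree to well under 1%, which is why replacing the
# cell-by-cell 3-D integration with an equivalent-sphere/point treatment is
# so much cheaper for comparable accuracy.
```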

  20. Evidence-based practice exposure and physiotherapy students' behaviour during clinical placements: a survey.

    Science.gov (United States)

    Olsen, Nina Rydland; Lygren, Hildegunn; Espehaug, Birgitte; Nortvedt, Monica Wammen; Bradley, Peter; Bjordal, Jan Magnus

    2014-12-01

    Physiotherapists are expected to practice in an evidence-based way. Evidence-based practice (EBP) should be an integral part of the curriculum to ensure use of the five EBP steps: asking clinical questions, searching for and appraising research evidence, integrating the evidence into clinical practice and evaluating this process. The aim of this study was to compare self-reported EBP behaviour, abilities and barriers during clinical placements reported by five cohorts of final year physiotherapy students with different EBP exposure across the 3-year bachelor programme. A cross-sectional study was conducted among five cohorts (2006-2010) of third year physiotherapy students at a University College in Norway. In total, 246 students were eligible for this study. To collect data, we used a questionnaire with 42 items related to EBP behaviour, ability and barriers. Associations were investigated using Spearman's rho (r). In total, 180 out of 246 third year physiotherapy students, who had recently completed a clinical placement, filled out the questionnaire (73%). The association between the level of EBP exposure and students' self-reported EBP behaviour, abilities and barriers was low for most items in the questionnaire. Statistically significant correlations were found for eight items, related to information need, question formulation, use of checklists, searching and perceived ability to search for and critically appraise research evidence. The strongest correlation was found between the level of EBP exposure and ability to critically appraise research evidence (r = 0.41). An association between the level of EBP exposure and physiotherapy students' EBP behaviour was found for elements such as asking and searching, ability to search for and critically appraise research evidence, and experience of critical appraisal as a barrier. Further research needs to explore strategies for EBP exposure throughout the curriculum, regarding content, timing, amount and type of training. Copyright © 2014 John Wiley & Sons

  1. Radiation exposures for DOE and DOE contractor employees Eighteenth annual report, 1985

    International Nuclear Information System (INIS)

    1986-12-01

    All US Department of Energy (DOE) and DOE contractors are required to submit occupational radiation exposure records to a central repository. The data required include a summary of whole-body exposures to ionizing radiation, a summary of internal depositions of radioactive materials above specified limits, and occupational exposure reports for terminating employees. This report is a summary of the data submitted by DOE and DOE contractors for 1985. A total of 95,806 DOE and DOE contractor employees were monitored for whole-body ionizing radiation exposures in 1985. In addition to the employees, 96,665 visitors were monitored. Of all employees monitored, 58.4% received a dose equivalent that was less than measurable, 39.8% a measurable exposure less than 1 rem, and 1.9% an exposure greater than 1 rem. One employee received a dose equivalent greater than 5 rem (8.66 rem). The exposure received by 91.9% of the visitors to DOE facilities was less than measurable. No visitors received a dose equivalent greater than 2 rem. The collective dose equivalent for DOE and DOE contractor employees was 8223 person-rem. The collective dose equivalent for visitors was 461 person-rem. These exposures are well below the DOE 5-rem/year radiation protection standard for whole-body exposures. Ten new cases of internal depositions were reported in 1985 that exceeded 50% of the pertinent annual dose-equivalent standard. Of these ten cases, eight occurred in a previous year and are reported now because recent revisions in the dose calculations established these cases as reportable depositions. Twenty-six cases reported during 1985 were considered to be the continued tracking of previous depositions. 5 figs., 32 tabs
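
    The report's headline numbers imply a small average dose per monitored worker. As a quick arithmetic check using the figures quoted in the abstract:

```python
# Figures from the 1985 DOE annual report abstract.
collective_person_rem = 8223      # collective dose equivalent, employees
monitored_employees = 95_806      # employees monitored for whole-body exposure

# Average dose equivalent per monitored employee.
avg_rem = collective_person_rem / monitored_employees  # ~0.086 rem/person
# This sits far below the 5 rem/year DOE whole-body protection standard.
```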

  2. On Information Metrics for Spatial Coding.

    Science.gov (United States)

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
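
    The bits-per-spike metric of Skaggs et al. that this abstract benchmarks against mutual information can be sketched directly; the binned occupancy probabilities and rate maps below are toy values:

```python
import numpy as np

def skaggs_info_per_spike(occupancy_p, rate_map):
    """Skaggs spatial information: I = sum_i p_i * (r_i / rbar) * log2(r_i / rbar),
    where p_i is occupancy probability and r_i the firing rate in bin i."""
    p = np.asarray(occupancy_p, dtype=float)
    r = np.asarray(rate_map, dtype=float)
    rbar = np.sum(p * r)          # overall mean firing rate
    nz = r > 0                    # 0 * log(0) -> 0 by convention
    return float(np.sum(p[nz] * (r[nz] / rbar) * np.log2(r[nz] / rbar)))

# A cell firing only in 1 of 4 equally occupied bins carries 2 bits/spike;
# a spatially uniform cell carries 0 bits/spike.
sharp = skaggs_info_per_spike([0.25] * 4, [8.0, 0.0, 0.0, 0.0])   # -> 2.0
flat = skaggs_info_per_spike([0.25] * 4, [2.0, 2.0, 2.0, 2.0])    # -> 0.0
```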

  3. Generalized Painleve-Gullstrand metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.

  4. Kerr metric in the deSitter background

    International Nuclear Information System (INIS)

    Vaidya, P.C.

    1984-01-01

    In addition to the Kerr metric with cosmological constant Λ, several other metrics are presented giving a Kerr-like solution of Einstein's equations in the background of the deSitter universe. A new metric of what may be termed a rotating deSitter space-time, devoid of matter but containing null fluid with twisting null rays, has been presented. This metric reduces to the standard deSitter metric when the twist in the rays vanishes. The Kerr metric in this background is the immediate generalization of Schwarzschild's exterior metric with cosmological constant. (author)

  5. A Study on intelligent measurement of nuclear explosion equivalent in atmosphere

    International Nuclear Information System (INIS)

    Wang Desheng; Wu Xiaohong

    1999-01-01

    Measurement of nuclear explosion equivalent in the atmosphere is an important subject for nuclear surveying. It is based on the relation between the nuclear explosion equivalent and the minimum-illuminance time of the light radiation from a nuclear explosion. A method of RC differential valley-time detection and mean-time taking is presented. The method, using a single-chip computer as the intelligent part, can realize intelligent measurement of the minimum-illuminance time with high reliability and low power consumption. This method provides a practical means for quick, accurate and reliable measurement of nuclear explosion equivalent in the atmosphere
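
    The yield inference can be sketched under an assumed power-law scaling between yield and the minimum-illuminance time, t_min = K * W**p; the constant and exponent below are hypothetical, and a simple argmin over a simulated trace stands in for the analog RC valley detector:

```python
import numpy as np

K, P = 0.029, 0.5   # hypothetical scaling: t_min (s) = K * W**P, W in kilotons

def yield_from_tmin(t_min_s):
    """Invert the assumed scaling t_min = K * W**P to recover yield W (kt)."""
    return (t_min_s / K) ** (1.0 / P)

# Simulated illuminance trace with its valley at the t_min of a 20 kt burst.
t = np.linspace(0.01, 1.0, 10_000)              # seconds
t_min_true = K * 20.0 ** P
illum = 1.0 + (t - t_min_true) ** 2             # toy valley-shaped trace

t_min_est = t[np.argmin(illum)]                 # valley detection (digital stand-in)
w_est = yield_from_tmin(t_min_est)              # ~20 kt
```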

  6. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is determined.

  7. Equivalence Testing as a Tool for Fatigue Risk Management in Aviation.

    Science.gov (United States)

    Wu, Lora J; Gander, Philippa H; van den Berg, Margo; Signal, T Leigh

    2018-04-01

    Many civilian aviation regulators favor evidence-based strategies that go beyond hours-of-service approaches for managing fatigue risk. Several countries now allow operations to be flown outside of flight and duty hour limitations, provided airlines demonstrate an alternative method of compliance that yields safety levels "at least equivalent to" the prescriptive regulations. Here we discuss equivalence testing in occupational fatigue risk management. We present suggested ratios/margins of practical equivalence when comparing operations inside and outside of prescriptive regulations for two common aviation safety performance indicators: total in-flight sleep duration and psychomotor vigilance task reaction speed. Suggested levels of practical equivalence, based on expertise coupled with evidence from field and laboratory studies, are ≤ 30 min in-flight sleep and ± 15% of reference response speed. Equivalence testing is illustrated in analyses of a within-subjects field study during an out-and-back long-range trip. During both sectors of their trip, 41 pilots were monitored via actigraphy, sleep diary, and top of descent psychomotor vigilance task. Pilots were assigned to take rest breaks in a standard lie-flat bunk on one sector and in a bunk tapered 9 from hip to foot on the other sector. Total in-flight sleep duration (134 ± 53 vs. 135 ± 55 min) and mean reaction speed at top of descent (3.94 ± 0.58 vs. 3.77 ± 0.58) were equivalent after rest in the full vs. tapered bunk. Equivalence testing is a complementary statistical approach to difference testing when comparing levels of fatigue and performance in occupational settings and can be applied in transportation policy decision making.Wu LJ, Gander PH, van den Berg M, Signal TL. Equivalence testing as a tool for fatigue risk management in aviation. Aerosp Med Hum Perform. 2018; 89(4):383-388.
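The TOST ("two one-sided tests") logic behind such equivalence decisions can be sketched as follows. The function and the numbers are illustrative, not the study's data or analysis code; the ±15% margin is applied to an assumed reference response speed of 4.0:

```python
def tost_equivalence(mean_diff, se_diff, lower, upper):
    """Two one-sided tests (TOST) at alpha = 0.05: declare practical
    equivalence if the 90% confidence interval for the difference lies
    entirely within the equivalence margins [lower, upper]."""
    z = 1.645  # one-sided 5% normal critical value
    ci_lo = mean_diff - z * se_diff
    ci_hi = mean_diff + z * se_diff
    return lower < ci_lo and ci_hi < upper

# Illustrative numbers: difference in mean response speed between bunk
# conditions, with +/-15% of an assumed reference speed of 4.0 as margin.
margin = 0.15 * 4.0
print(tost_equivalence(mean_diff=0.17, se_diff=0.09,
                       lower=-margin, upper=margin))  # → True
```

Note the asymmetry with ordinary difference testing: failing to find a difference is not evidence of equivalence, whereas TOST makes equivalence the explicit alternative hypothesis.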

  8. On the equivalence of vacuum equations of gauge quadratic theory of gravity and general relativity theory

    International Nuclear Information System (INIS)

    Zhitnikov, V.V.; Ponomarev, V.N.

    1986-01-01

    An attempt is made to compare solutions of the field equations corresponding to equations quadratic in the fields (g_μν, Γ^α_μν) in gauge gravitation theory (GGT) with solutions of general relativity theory. Without restriction to a concrete type of metric, only solutions for which the torsion vanishes are considered. The equivalence of the vacuum equations of the gauge quadratic theory of gravity and general relativity theory is proved using the Newman-Penrose formalism.

  9. Sample size for comparing negative binomial rates in noninferiority and equivalence trials with unequal follow-up times.

    Science.gov (United States)

    Tang, Yongqiang

    2017-05-25

    We derive the sample size formulae for comparing two negative binomial rates based on both the relative and absolute rate difference metrics in noninferiority and equivalence trials with unequal follow-up times, and establish an approximate relationship between the sample sizes required for the treatment comparison based on the two treatment effect metrics. The proposed method allows the dispersion parameter to vary by treatment groups. The accuracy of these methods is assessed by simulations. It is demonstrated that ignoring the between-subject variation in the follow-up time by setting the follow-up time for all individuals to be the mean follow-up time may greatly underestimate the required size, resulting in underpowered studies. Methods are provided for back-calculating the dispersion parameter based on the published summary results.
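A simplified sketch of this kind of calculation is given below, assuming equal follow-up for every subject (precisely the simplification the authors caution can underpower a study when follow-up actually varies). The per-subject variance of the log rate, 1/(t·μ) + k with dispersion k allowed to differ by arm, is a standard textbook form, not the paper's exact formula:

```python
import math

def nb_sample_size_noninf(rate_c, rate_t, disp_c, disp_t, t_follow,
                          margin_log_rr):
    """Per-arm sample size for a noninferiority comparison of two
    negative binomial event rates on the log rate-ratio scale, at
    one-sided alpha = 0.025 and 90% power. Per-subject variance of
    the log rate is 1/(t * mu) + dispersion, which may differ by arm."""
    z_alpha, z_beta = 1.96, 1.2816
    var = (1.0 / (t_follow * rate_c) + disp_c) + (1.0 / (t_follow * rate_t) + disp_t)
    effect = math.log(rate_t / rate_c) - margin_log_rr
    return math.ceil((z_alpha + z_beta) ** 2 * var / effect ** 2)

# Hypothetical trial: equal true rates of 0.8 events/yr, dispersion 0.5
# in both arms, 1 yr follow-up, noninferiority margin of a 1.3 rate ratio
n = nb_sample_size_noninf(0.8, 0.8, 0.5, 0.5, 1.0, math.log(1.3))
```

Replacing `t_follow` by the mean follow-up time when the true follow-up varies between subjects understates `var`, which is the underestimation the abstract warns about.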

  10. Temporal and Other Exposure Aspects of Residential Magnetic Fields Measurement in Relation to Acute Lymphoblastic Leukaemia in Children: The National Cancer Institute Children's Cancer Group Study (invited paper)

    International Nuclear Information System (INIS)

    Baris, D.; Linet, M.; Auvinen, A.; Kaune, W.T.; Wacholder, S.; Kleinerman, R.; Hatch, E.; Robison, L.; Niwa, S.; Haines, C.; Tarone, R.E.

    1999-01-01

    Case-control studies have used a variety of measurements to evaluate the relationship of children's exposure to magnetic fields (50 or 60 Hz) with childhood leukaemia and other childhood cancers. In the absence of knowledge about which exposure metrics may be biologically meaningful, studies during the past 10 years have often used time-weighted average (TWA) summaries of home measurements. Recently, other exposure metrics have been suggested, usually based on theoretical considerations or limited laboratory data. In this paper, the rationale and associated preliminary studies undertaken are described as well as feasibility and validity issues governing the choice of the primary magnetic field exposure assessment methods and summary metric used to estimate children's exposure in the National Cancer Institute/Children's Cancer Group (NCI/CCG) case-control study. Also provided are definitions and discussion of the strengths and weaknesses of the various exposure metrics used in exploratory analyses of the NCI/CCG measurement data. Exposure metrics evaluated include measures of central tendency (mean, median, 30th to 70th percentiles), peak exposures (90th and higher percentiles, peak values of the 24 h measurements), and measurements of short-term temporal variability (rate of change). This report describes correlations of the various metrics with the time-weighted average for the 24 h period (TWA-24-h). Most of the metrics were found to be positively and highly correlated with TWA-24-h, but lower correlations of TWA-24-h with peak exposure and with rate of change were observed. To examine further the relation between TWA and alternative metrics, similar exploratory analysis should be considered for existing data sets and for forthcoming measurement investigations of residential magnetic fields and childhood leukaemia. (author)
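The metric families compared in such analyses (central tendency, peak exposure, short-term temporal variability) can be computed from an equal-interval measurement series along these lines. This is an illustrative sketch, not the NCI/CCG analysis code; the percentile rule is deliberately simple:

```python
import statistics

def exposure_metrics(readings, interval_min=1.0):
    """Summaries of a 24 h magnetic-field series (equal-interval samples):
    central tendency, a peak-exposure metric, and short-term variability."""
    ordered = sorted(readings)
    n = len(ordered)
    p90 = ordered[min(n - 1, int(0.9 * n))]   # simple 90th-percentile (peak-type) metric
    rate_of_change = statistics.mean(          # mean absolute change per interval
        abs(b - a) / interval_min for a, b in zip(readings, readings[1:]))
    return {
        "twa": statistics.mean(readings),      # time-weighted average (TWA)
        "median": statistics.median(readings),
        "p90": p90,
        "rate_of_change": rate_of_change,
    }

m = exposure_metrics([0.1] * 10 + [0.5])  # flat field with one brief peak
```

The toy series illustrates the abstract's point: a brief peak moves the TWA only slightly and leaves the median and 90th percentile unchanged, while the rate-of-change metric responds strongly, so peak and variability metrics correlate less with TWA-24-h.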

  11. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    The class of metric spaces (X,d) known as small-determined spaces, introduced by Garrido and Jaramillo, are properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.

  12. Risk equivalent of exposure versus dose of radiation

    International Nuclear Information System (INIS)

    Bond, V.P.

    1986-01-01

    Radiation is perhaps unique among all agents of interest in the Health Sciences in that it alone is both a therapeutic agent for the control of cancer and an essentially ubiquitous environmental agent with a potential for increasing the cancer rate in human populations. Therapy of tumors is accomplished with high-level exposure (HLE) to radiation in order to effect control or a cure. Thus, it conforms to the concepts and approaches of pharmacology, toxicology, and therapeutic medicine. Only one function, that which relates the object-oriented and nonstochastic independent variable organ dose to its effect on a cancer or an organ, is needed to estimate the probability, P2, of a quantal response. Only P2 is needed because P1, the probability that the cancer slated for such treatment will receive some amount of the agent and be affected to some degree, is effectively unity. The health problem involving low-level exposure (LLE) to radiation, in contrast, is not at all analogous to those of pharmacology, toxicology, and medicine. Rather, it presents a public health problem in that it is a healthy population, albeit of cells, that is exposed in a radiation field composed of moving radiation particles with some attendant low-order carcinogenic or mutagenic risk. Thus, the concepts, quantities, and terminology applied to low-level radiation must be modified from their present orientation toward pharmacology, toxicology, medicine, and dose to conform to those of public health and accident statistics, in which both P1 and P2 for the exposed cells must be estimated.

  13. ACREM: A new air crew radiation exposure measuring system

    International Nuclear Information System (INIS)

    Beck, P.; Duftschmid, K.; Kerschbaumer, S.; Schmitzer, C.; Strachotinsky, C.; Grosskopf, A.; Winkler, N.

    1996-01-01

    Cosmic radiation was discovered in 1912 by the Austrian Nobel Laureate Victor F. Hess. Since Hess, numerous measurements of the radiation exposure from cosmic rays at different altitudes have been performed; however, this exposure had not been taken seriously from the standpoint of radiation protection. Today, with the rapid development of modern airplanes, an ever increasing number of civil aircraft fly at increasing altitudes for considerable time. Members of civil aircrews spend up to 1000 hours per year at cruising altitudes and are therefore subject to significant levels of radiation exposure. In 1990 the ICRP published report ICRP 60 with updated excess cancer risk estimates, which led to significantly higher risk coefficients for some radiation qualities. An increase of the radiation weighting factors for medium-energy neutron radiation raises the contribution of the neutron component to the equivalent dose by about 60%, as compared to the earlier values of ICRP 26. These higher risk coefficients led to the ICRP recommendation that cosmic radiation exposure in civil aviation should be taken into account as occupational exposure. Numerous recent exposure measurements aboard civil airliners in Germany, Sweden, the USA, and Russia show exposure levels in the range of 3-10 mSv/year. This is significantly more than the average annual dose of radiation workers (in Austria about 1.5 mSv/year). Up to now, no practicable and economic radiation monitoring system for routine application on board exists. A fairly simple and economic approach to a practical, active in-flight dosimeter for the assessment of individual crew exposure is discussed in this paper.

  14. Criteria and methods for estimating external effective dose equivalent from personnel monitoring results: EDE implementation guide. Final report

    International Nuclear Information System (INIS)

    Owen, D.

    1998-09-01

    Title 10, Part 20 of the Code of Federal Regulations requires that nuclear power plant licensees evaluate worker radiation exposure using a risk-based methodology termed the effective dose equivalent (EDE). EDE is a measure of radiation exposure that represents an individual's risk of stochastic injury from their exposure. EPRI has conducted research into how photons interact with the body. These results have been coupled with information on how the body's organs differ in their susceptibility to radiation injury to produce a methodology for assessing the effective dose equivalent. The research and the resultant methodology have been described in numerous technical reports, scientific journal articles, and technical meetings. EPRI is working with the Nuclear Energy Institute to have the EPRI effective dose equivalent methodology accepted by the Nuclear Regulatory Commission for use at US nuclear power plants. In order to further familiarize power plant personnel with the methodology, this report summarizes the EDE research and presents some simple guidelines for implementing the methodology.
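The weighted-sum definition of effective dose equivalent underlying such methodologies can be sketched as follows. The tissue weighting factors are the published ICRP 60 values; the function itself is an illustration of the definition, not the EPRI methodology:

```python
# ICRP 60 tissue weighting factors (they sum to 1.0)
W_T = {
    "gonads": 0.20, "bone_marrow": 0.12, "colon": 0.12, "lung": 0.12,
    "stomach": 0.12, "bladder": 0.05, "breast": 0.05, "liver": 0.05,
    "oesophagus": 0.05, "thyroid": 0.05, "skin": 0.01,
    "bone_surface": 0.01, "remainder": 0.05,
}

def effective_dose_equivalent(organ_doses_msv):
    """EDE = sum over tissues T of w_T * H_T, where H_T is the equivalent
    dose to tissue T in mSv; tissues absent from the input are treated
    as receiving zero dose."""
    return sum(W_T[t] * h for t, h in organ_doses_msv.items())
```

Because the weights sum to 1, a uniform whole-body dose gives an EDE equal to that dose, while a partial-body exposure (e.g. lung only) is discounted by the tissue's weight.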

  15. Attitudes, knowledge and practices of healthcare workers regarding occupational exposure of pulmonary tuberculosis

    Directory of Open Access Journals (Sweden)

    Lesley T. Bhebhe

    2014-10-01

    Objective: Following a high incidence of TB among HCWs at Maluti Adventist Hospital in Lesotho, a study was carried out to assess the knowledge, attitudes and practices of HCWs regarding healthcare-associated TB infection and infection controls. Methods: This was a cross-sectional study performed in June 2011; it involved HCWs at Maluti Adventist Hospital who were involved with patients and/or sputum. Stratified sampling of 140 HCWs was performed, of whom 129 (92.0%) took part. A self-administered, semi-structured questionnaire was used. Results: Most respondents (89.2%) had appropriate knowledge of transmission, diagnosis and prevention of TB; however, only 22.0% of the respondents knew the appropriate method of sputum collection. All of the respondents (100.0%) were motivated and willing to implement IPC measures. A significant proportion of participants (36.4%) reported poor infection control practices, with the majority of inappropriate practices being the administrative infection controls (> 80.0%). Only 38.8% of the participants reported using the appropriate N-95 respirator. Conclusion: Poor infection control practices regarding occupational TB exposure were demonstrated, the worst being the first-line administrative infection controls. Critical knowledge gaps were identified; however, there was encouraging willingness by HCWs to adapt to recommended infection control measures. Healthcare workers are inevitably exposed to TB, due to frequent interaction with patients with undiagnosed and potentially contagious TB. Implementation of infection prevention and control practices is critical whenever there is a possibility of exposure.

  16. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  17. Testing the equivalence principle on cosmological scales

    Science.gov (United States)

    Bonvin, Camille; Fleury, Pierre

    2018-05-01

    The equivalence principle, that is one of the main pillars of general relativity, is very well tested in the Solar system; however, its validity is more uncertain on cosmological scales, or when dark matter is concerned. This article shows that relativistic effects in the large-scale structure can be used to directly test whether dark matter satisfies Euler's equation, i.e. whether its free fall is characterised by geodesic motion, just like baryons and light. After having proposed a general parametrisation for deviations from Euler's equation, we perform Fisher-matrix forecasts for future surveys like DESI and the SKA, and show that such deviations can be constrained with a precision of order 10%. Deviations from Euler's equation cannot be tested directly with standard methods like redshift-space distortions and gravitational lensing, since these observables are not sensitive to the time component of the metric. Our analysis shows therefore that relativistic effects bring new and complementary constraints to alternative theories of gravity.

  18. Doses from radiation exposure

    International Nuclear Information System (INIS)

    Menzel, H-G.; Harrison, J.D.

    2012-01-01

    Practical implementation of the International Commission on Radiological Protection’s (ICRP) system of protection requires the availability of appropriate methods and data. The work of Committee 2 is concerned with the development of reference data and methods for the assessment of internal and external radiation exposure of workers and members of the public. This involves the development of reference biokinetic and dosimetric models, reference anatomical models of the human body, and reference anatomical and physiological data. Following ICRP’s 2007 Recommendations, Committee 2 has focused on the provision of new reference dose coefficients for external and internal exposure. As well as specifying changes to the radiation and tissue weighting factors used in the calculation of protection quantities, the 2007 Recommendations introduced the use of reference anatomical phantoms based on medical imaging data, requiring explicit sex averaging of male and female organ-equivalent doses in the calculation of effective dose. In preparation for the calculation of new dose coefficients, Committee 2 and its task groups have provided updated nuclear decay data (ICRP Publication 107) and adult reference computational phantoms (ICRP Publication 110). New dose coefficients for external exposures of workers are complete (ICRP Publication 116), and work is in progress on a series of reports on internal dose coefficients to workers from inhaled and ingested radionuclides. Reference phantoms for children will also be provided and used in the calculation of dose coefficients for public exposures. Committee 2 also has task groups on exposures to radiation in space and on the use of effective dose.

  19. Occupational radiation exposure in the French nuclear industry: impact of the 1990 ICRP recommendations

    International Nuclear Information System (INIS)

    Pages, P.; Hubert, P.

    1994-01-01

    The study addresses the issue of the impact the forthcoming regulations derived from ICRP 60 recommendations will have on radiological protection practices. A questionnaire has been sent to companies carrying out tasks involving exposures to ionizing radiation. 55 companies reported the exposures of their personnel (annual collective effective dose equivalent and distribution of individual doses). The reference year is 1991. Results were obtained for a total of 43789 workers, with a corresponding collective dose equivalent of 96 man.Sv and 1100 persons with individual dose in excess of 20 mSv (1800 in excess of 15 mSv). The major part of collective, as well as the higher individual exposures are found in subcontract companies involved in maintenance, cleaning and specialized tasks during reactor shutdown. Based on this inquiry, results have been extrapolated to the whole nuclear fuel cycle. 68000 workers are estimated to be exposed, with a total collective dose of 160 man.Sv. Among them 2200 workers would be exposed to dose equivalent in excess of 20 mSv, 3400 in excess of 15 mSv. Even if higher doses concern few people, they are associated with important tasks at particular steps of the fuel cycle. In the questionnaire, companies were asked for ways and means envisaged or already in use to keep these doses within present or tighter regulatory limits. Some account of the efforts to achieve this goal will be given

  20. On the relationship between metrics to compare greenhouse gases – the case of IGTP, GWP and SGTP

    Directory of Open Access Journals (Sweden)

    D. J. A. Johansson

    2012-11-01

    Metrics for comparing greenhouse gases are analyzed, with a particular focus on the integrated temperature change potential (IGTP), following a call from IPCC to investigate this metric. It is shown that the global warming potential (GWP) and IGTP are asymptotically equal as the time horizon approaches infinity, under the standard assumption of a constant background atmosphere. The difference between IGTP and GWP is estimated for different greenhouse gases using an upwelling-diffusion energy balance model with different assumptions on the climate sensitivity and the parameterization governing the rate of ocean heat uptake. It is found that GWP and IGTP differ by some 10% for CH4 (for a time horizon of less than 500 yr), and that the relative difference between GWP and IGTP is smaller for gases with a longer atmospheric lifetime. Further, it is found that the relative difference between IGTP and GWP increases with increasing rates of ocean heat uptake and increasing climate sensitivity, since these changes increase the inertia of the climate system. Furthermore, it is shown that IGTP is equivalent to the sustained global temperature change potential (SGTP) under the standard assumptions used when estimating GWPs. We conclude that while it matters little for abatement policy whether IGTP, SGTP or GWP is used when making trade-offs, it is more important to decide whether society should use a metric based on time-integrated effects such as GWP, a "snapshot" metric such as GTP, or metrics in which both economic and physical considerations are taken into account. Of equal importance is the question of how to choose the time horizon, regardless of the chosen metric. For both these overall questions, value judgments are needed.
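The relationship among an integrated-forcing metric (GWP-like), an integrated-temperature metric (IGTP-like) and a snapshot metric (GTP-like) can be illustrated with a one-box energy-balance model. All gas and climate parameters below are hypothetical, chosen only to show that the two integrated metrics track each other closely while the snapshot metric diverges for a short-lived gas:

```python
import math

def pulse_response(tau_gas, efficiency, horizon, dt=0.1, c_heat=8.0, lam=1.0):
    """Response to a unit pulse of a gas with exponential decay time
    tau_gas: returns (integrated forcing, integrated temperature, final
    temperature) over the horizon, using a one-box energy balance
    C dT/dt = RF - lambda*T stepped with forward Euler."""
    temp, int_rf, int_temp = 0.0, 0.0, 0.0
    for i in range(int(horizon / dt)):
        rf = efficiency * math.exp(-i * dt / tau_gas)  # forcing decays with the gas
        temp += dt * (rf - lam * temp) / c_heat        # energy-balance step
        int_rf += rf * dt
        int_temp += temp * dt
    return int_rf, int_temp, temp

# Hypothetical short-lived gas vs. a long-lived reference gas
rf_x, iT_x, T_x = pulse_response(tau_gas=12.0, efficiency=120.0, horizon=100)
rf_r, iT_r, T_r = pulse_response(tau_gas=150.0, efficiency=1.0, horizon=100)
print("GWP-like :", rf_x / rf_r)   # ratio of time-integrated forcings
print("IGTP-like:", iT_x / iT_r)   # ratio of time-integrated temperatures
print("GTP-like :", T_x / T_r)     # snapshot temperature ratio at the horizon
```

With these parameters the two integrated ratios differ by only a few percent, while the snapshot ratio is far smaller, because the short-lived gas's temperature signal has largely decayed by the horizon.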

  1. Disturbance metrics predict a wetland Vegetation Index of Biotic Integrity

    Science.gov (United States)

    Stapanian, Martin A.; Mack, John; Adams, Jean V.; Gara, Brian; Micacchion, Mick

    2013-01-01

    Indices of biological integrity of wetlands based on vascular plants (VIBIs) have been developed in many areas in the USA. Knowledge of the best predictors of VIBIs would enable management agencies to make better decisions regarding mitigation site selection and performance monitoring criteria. We use a novel statistical technique to develop predictive models for an established index of wetland vegetation integrity (Ohio VIBI), using as independent variables 20 indices and metrics of habitat quality, wetland disturbance, and buffer area land use from 149 wetlands in Ohio, USA. For emergent and forest wetlands, predictive models explained 61% and 54% of the variability, respectively, in Ohio VIBI scores. In both cases the most important predictor of Ohio VIBI score was a metric that assessed habitat alteration and development in the wetland. Of secondary importance as a predictor was a metric that assessed microtopography, interspersion, and quality of vegetation communities in the wetland. Metrics and indices assessing disturbance and land use of the buffer area were generally poor predictors of Ohio VIBI scores. Our results suggest that vegetation integrity of emergent and forest wetlands could be most directly enhanced by minimizing substrate and habitat disturbance within the wetland. Such efforts could include reducing or eliminating any practices that disturb the soil profile, such as nutrient enrichment from adjacent farm land, mowing, grazing, or cutting or removing woody plants.

  2. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu's theorem. (Author)
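The Banach contraction principle invoked here can be illustrated with a simple iteration sketch (illustrative, not from the paper): on a complete metric space, repeatedly applying a contraction converges to its unique fixed point.

```python
import math

def banach_fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x -> f(x); for a contraction on a complete metric space
    the Banach principle guarantees convergence to the unique fixed
    point from any starting point."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("no convergence; f may not be a contraction")

# cos maps [0, 1] into itself with |cos'(x)| <= sin(1) < 1, so it is a
# contraction there and the iteration finds the unique root of x = cos(x)
print(banach_fixed_point(math.cos, 1.0))  # converges to ~0.739085
```

Hu's theorem runs this logic in reverse: if every Banach contraction on every closed subspace has a fixed point, the space must already be complete.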

  3. Drill-specific head impact exposure in youth football practice.

    Science.gov (United States)

    Campolettano, Eamon T; Rowson, Steven; Duma, Stefan M

    2016-11-01

    OBJECTIVE Although 70% of football players in the United States are youth players (6-14 years old), most research on head impacts in football has focused on high school, collegiate, or professional populations. The objective of this study was to identify the specific activities associated with high-magnitude (acceleration > 40g) head impacts in youth football practices. METHODS A total of 34 players (mean age 9.9 ± 0.6 years) on 2 youth teams were equipped with helmet-mounted accelerometer arrays that recorded head accelerations associated with impacts in practices and games. Videos of practices and games were used to verify all head impacts and identify specific drills associated with each head impact. RESULTS A total of 6813 impacts were recorded, of which 408 had accelerations exceeding 40g (6.0%). For each type of practice drill, impact rates were computed that accounted for the length of time that teams spent on each drill. The tackling drill King of the Circle had the highest impact rate (95% CI 25.6-68.3 impacts/hr). Impact rates for tackling drills (those conducted without a blocker [95% CI 14.7-21.9 impacts/hr] and those with a blocker [95% CI 10.5-23.1 impacts/hr]) did not differ from game impact rates (95% CI 14.2-21.6 impacts/hr). Tackling drills were observed to have a greater proportion (between 40% and 50%) of impacts exceeding 60g than games (25%). The teams in this study participated in tackling or blocking drills for only 22% of their overall practice times, but these drills were responsible for 86% of all practice impacts exceeding 40g. CONCLUSIONS In youth football, high-magnitude impacts occur more often in practices than games, and some practice drills are associated with higher impact rates and accelerations than others. To mitigate high-magnitude head impact exposure in youth football, practices should be modified to decrease the time spent in drills with high impact rates, potentially eliminating a drill such as King of the Circle.
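Exposure-adjusted rates of the kind reported above can be computed by treating verified impact counts as Poisson. This sketch uses a large-sample approximation and hypothetical numbers, not the study's data or its (unstated) CI method:

```python
import math

def impact_rate_ci(n_impacts, exposure_hours, z=1.96):
    """Impacts per hour with a large-sample 95% CI, treating the
    impact count as Poisson: rate +/- z * sqrt(n) / T."""
    rate = n_impacts / exposure_hours
    half_width = z * math.sqrt(n_impacts) / exposure_hours
    return rate, max(0.0, rate - half_width), rate + half_width

# Hypothetical drill: 120 verified impacts over 6.7 team-hours
rate, lo, hi = impact_rate_ci(120, 6.7)
```

Normalizing by time-on-drill is what makes short but intense drills comparable to full games; raw counts alone would understate their risk.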

  4. Statistical versus Musical Significance: Commentary on Leigh VanHandel's 'National Metrical Types in Nineteenth Century Art Song'

    Directory of Open Access Journals (Sweden)

    Justin London

    2010-01-01

    In "National Metrical Types in Nineteenth Century Art Song" Leigh Van Handel gives a sympathetic critique of William Rothstein's claim that in western classical music of the late 18th and 19th centuries there are discernible differences in the phrasing and metrical practice of German versus French and Italian composers. This commentary (a) examines just what Rothstein means in terms of his proposed metrical typology, (b) questions Van Handel on how she has applied it to a purely melodic framework, (c) amplifies Van Handel's critique of Rothstein, and then (d) concludes with a rumination on the reach of quantitative (i.e., statistically-driven) versus qualitative claims regarding such things as "national metrical types."

  5. Effects of Ethanol Exposure during Distinct Periods of Brain Development on Hippocampal Synaptic Plasticity

    Directory of Open Access Journals (Sweden)

    Brian R. Christie

    2013-07-01

    Fetal alcohol spectrum disorders occur when a mother drinks during pregnancy and can greatly influence synaptic plasticity and cognition in the offspring. In this study we determined whether there are periods during brain development that are more susceptible to the effects of ethanol exposure on hippocampal synaptic plasticity. In particular, we evaluated how the ability to elicit long-term potentiation (LTP) in the hippocampal dentate gyrus (DG) was affected in young adult rats that were exposed to ethanol during either the 1st, 2nd, or 3rd trimester equivalent. As expected, the effects of ethanol on young adult DG LTP were less severe when exposure was limited to a particular trimester equivalent than when exposure occurred throughout gestation. In males, ethanol exposure during the 1st, 2nd or 3rd trimester equivalent did not significantly reduce LTP in the DG. In females, ethanol exposure during either the 1st or 2nd trimester equivalent did not impact LTP in early adulthood, but following exposure during the 3rd trimester equivalent alone, LTP was significantly increased in the female DG. These results further exemplify the disparate effects between the male and female brain in the ability to elicit LTP following perinatal ethanol exposure (PNEE).

  6. 16 CFR 680.23 - Contents of opt-out notice; consolidated and equivalent notices.

    Science.gov (United States)

    2010-01-01

    ... equivalent notices. 680.23 Section 680.23 Commercial Practices FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT AFFILIATE MARKETING § 680.23 Contents of opt-out notice; consolidated and equivalent notices... receiving marketing than is required by this part, the requirements of this section may be satisfied by...

  7. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases: conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper; they may provide a starting point for other large engineering organizations seeking to institute a performance measurement system. To facilitate this effort, a team consisting of customers and Engineering staff members was chartered to assist in the development of the metrics system and to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different from the development of any other type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  8. Equivalent Lagrangians

    International Nuclear Information System (INIS)

    Hojman, S.

    1982-01-01

    We present a review of the inverse problem of the Calculus of Variations, emphasizing the ambiguities which appear due to the existence of equivalent Lagrangians for a given classical system. In particular, we analyze the properties of equivalent Lagrangians in the multidimensional case, we study the conditions for the existence of a variational principle for (second as well as first order) equations of motion and their solutions, we consider the inverse problem of the Calculus of Variations for singular systems, we state the ambiguities which emerge in the relationship between symmetries and conserved quantities in the case of equivalent Lagrangians, we discuss the problems which appear in trying to quantize classical systems which have different equivalent Lagrangians, we describe the situation which arises in the study of equivalent Lagrangians in field theory and finally, we present some unsolved problems and discussion topics related to the content of this article. (author)

  9. Metrics to assess ecological condition, change, and impacts in sandy beach ecosystems.

    Science.gov (United States)

    Schlacher, Thomas A; Schoeman, David S; Jones, Alan R; Dugan, Jenifer E; Hubbard, David M; Defeo, Omar; Peterson, Charles H; Weston, Michael A; Maslo, Brooke; Olds, Andrew D; Scapini, Felicita; Nel, Ronel; Harris, Linda R; Lucrezi, Serena; Lastra, Mariano; Huijbers, Chantal M; Connolly, Rod M

    2014-11-01

    Complexity is increasingly the hallmark of environmental management practices on sandy shorelines. This arises primarily from meeting growing public demands (e.g., real estate, recreation) whilst reconciling economic demands with expectations of coastal users who have modern conservation ethics. Ideally, shoreline management is underpinned by empirical data, but selecting ecologically meaningful metrics to accurately measure the condition of systems, and the ecological effects of human activities, is a complex task. Here we construct a framework for metric selection, considering six categories of issues that authorities commonly address: erosion; habitat loss; recreation; fishing; pollution (litter and chemical contaminants); and wildlife conservation. Possible metrics were scored in terms of their ability to reflect environmental change, and against criteria that are widely used for judging the performance of ecological indicators (i.e., sensitivity, practicability, costs, and public appeal). From this analysis, four types of broadly applicable metrics that also performed very well against the indicator criteria emerged: 1.) traits of bird populations and assemblages (e.g., abundance, diversity, distributions, habitat use); 2.) breeding/reproductive performance sensu lato (especially relevant for birds and turtles nesting on beaches and in dunes, but equally applicable to invertebrates and plants); 3.) population parameters and distributions of vertebrates associated primarily with dunes and the supralittoral beach zone (traditionally focused on birds and turtles, but expandable to mammals); 4.) compound measurements of the abundance/cover/biomass of biota (plants, invertebrates, vertebrates) at both the population and assemblage level. Local constraints (i.e., the absence of birds in highly degraded urban settings or lack of dunes on bluff-backed beaches) and particular issues may require alternatives. Metrics - if selected and applied correctly - provide

  10. Temporal and Other Exposure Aspects of Residential Magnetic Fields Measurement in Relation to Acute Lymphoblastic Leukaemia in Children: The National Cancer Institute Children's Cancer Group Study (invited paper)

    Energy Technology Data Exchange (ETDEWEB)

    Baris, D.; Linet, M.; Auvinen, A.; Kaune, W.T.; Wacholder, S.; Kleinerman, R.; Hatch, E.; Robison, L.; Niwa, S.; Haines, C.; Tarone, R.E

    1999-07-01

    Case-control studies have used a variety of measurements to evaluate the relationship of children's exposure to magnetic fields (50 or 60 Hz) with childhood leukaemia and other childhood cancers. In the absence of knowledge about which exposure metrics may be biologically meaningful, studies during the past 10 years have often used time-weighted average (TWA) summaries of home measurements. Recently, other exposure metrics have been suggested, usually based on theoretical considerations or limited laboratory data. In this paper, the rationale and associated preliminary studies undertaken are described as well as feasibility and validity issues governing the choice of the primary magnetic field exposure assessment methods and summary metric used to estimate children's exposure in the National Cancer Institute/Children's Cancer Group (NCI/CCG) case-control study. Also provided are definitions and discussion of the strengths and weaknesses of the various exposure metrics used in exploratory analyses of the NCI/CCG measurement data. Exposure metrics evaluated include measures of central tendency (mean, median, 30th to 70th percentiles), peak exposures (90th and higher percentiles, peak values of the 24 h measurements), and measurements of short-term temporal variability (rate of change). This report describes correlations of the various metrics with the time-weighted average for the 24 h period (TWA-24-h). Most of the metrics were found to be positively and highly correlated with TWA-24-h, but lower correlations of TWA-24-h with peak exposure and with rate of change were observed. To examine further the relation between TWA and alternative metrics, similar exploratory analysis should be considered for existing data sets and for forthcoming measurement investigations of residential magnetic fields and childhood leukaemia. (author)
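
The three families of summary metrics discussed above (central tendency, peak exposure, and short-term temporal variability) can be sketched for a generic series of equally spaced field samples. This is a hypothetical illustration: the function names, the nearest-rank percentile rule, and the sample values are assumptions, not the NCI/CCG analysis code.

```python
def twa(samples):
    """Time-weighted average for equally spaced samples (central tendency)."""
    return sum(samples) / len(samples)

def percentile(samples, p):
    """Nearest-rank percentile (p in 0..100), a simple peak-exposure metric."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, round(p / 100 * (len(ordered) - 1))))
    return ordered[k]

def rate_of_change(samples):
    """Mean absolute difference between successive samples,
    a crude short-term temporal-variability metric."""
    diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    return sum(diffs) / len(diffs)

# Invented magnetic-field readings in microtesla, one per sampling interval.
fields = [0.1, 0.12, 0.11, 0.3, 0.28, 0.1, 0.09, 0.1]
print(twa(fields))             # TWA-24-h analogue
print(percentile(fields, 90))  # 90th-percentile peak metric
print(rate_of_change(fields))  # temporal variability
```

Computing all three from the same series makes it easy to inspect how strongly the peak and rate-of-change metrics track the TWA, which is the correlation question the record addresses.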

  11. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is increasingly central to the organisation. It is therefore essential to measure the brand's health, performance and development. Selecting the right brand metrics, however, is a challenge. An enormous number of metrics competes for brand managers' attention. But which

  12. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) in the case of the relation between a privacy protector and an information gatherer. The aims with such metrics are: to allow assessment and comparison of different user scenarios and their differences; for

  13. The application of simple metrics in the assessment of glycaemic variability.

    Science.gov (United States)

    Monnier, L; Colette, C; Owens, D R

    2018-03-06

    The assessment of glycaemic variability (GV) remains a subject of debate with many indices proposed to represent either short-term (acute glucose fluctuations) or long-term GV (variations of HbA1c). For the assessment of short-term within-day GV, the coefficient of variation for glucose (%CV), defined as the standard deviation divided by the 24-h mean glucose concentration, is easy to perform; a threshold of 36%, recently adopted by the international consensus on use of continuous glucose monitoring, separates stable from labile glycaemic states. More complex metrics such as the Low Blood Glucose Index (LBGI) and High Blood Glucose Index (HBGI) allow the risks of hypoglycaemic and hyperglycaemic episodes, respectively, to be assessed, although in clinical practice their application is limited by the need for more complex computation. This also applies to other indices of short-term intraday GV, including the mean amplitude of glycemic excursions (MAGE), Schlichtkrull's M-value and CONGA. GV is important clinically, as exaggerated glucose fluctuations are associated with an enhanced risk of adverse cardiovascular outcomes, due primarily to hypoglycaemia. In contrast, there is at present no compelling evidence that elevated short-term GV is an independent risk factor for microvascular complications of diabetes. Concerning long-term GV, there are numerous studies supporting its association with an enhanced risk of cardiovascular events. However, this association raises the question of whether the impact of long-term variability is simply the consequence of repeated exposure to short-term GV or to ambient chronic hyperglycaemia. The renewed emphasis on glucose monitoring with the introduction of continuous glucose monitoring technologies can benefit from the introduction and application of simple metrics for describing GV, along with supporting recommendations. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
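
The simple metric the record advocates, the glucose coefficient of variation (%CV = 100 × SD / mean) with its 36% stability threshold, is easy to sketch. The function names and sample glucose series below are invented for illustration.

```python
from statistics import mean, stdev

def percent_cv(glucose):
    """%CV of a series of glucose readings (any consistent unit)."""
    return 100.0 * stdev(glucose) / mean(glucose)

def is_labile(glucose, threshold=36.0):
    """True if the series exceeds the consensus 36% %CV threshold."""
    return percent_cv(glucose) > threshold

stable = [5.2, 5.6, 6.1, 5.8, 5.4, 6.0]    # mmol/L, tight control
labile = [3.1, 9.8, 4.2, 12.5, 3.6, 10.9]  # wide swings
print(percent_cv(stable), is_labile(stable))  # well under 36%
print(percent_cv(labile), is_labile(labile))  # well over 36%
```

The contrast with LBGI, HBGI, MAGE or CONGA is the point: %CV needs only a mean and a standard deviation, which is why the record favours it for routine practice.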

  14. Annual Surveillance Summary: Clostridium difficile Infections in the Military Health System (MHS), 2016

    Science.gov (United States)

    2017-06-01

    The frequency is based on the demographic value of the index incident episode. Data Source: NMCPHC HL7-formatted CHCS microbiology and...pharmacy data to assess prescription practices and the Standard Inpatient Data Record (SIDR) to determine healthcare -associated exposures. CDI...Metrics Table 4 presents two different metrics for describing CDI rates for healthcare -associated exposures. The admission prevalence metric

  15. Assessing the importance of different exposure metrics and time-activity data to predict 24-H personal PM2.5 exposures.

    Science.gov (United States)

    Chang, Li-Te; Koutrakis, Petros; Catalano, Paul J; Suh, Helen H

    Personal PM(2.5) data from two recent exposure studies, the Scripted Activity Study and the Older Adults Study, were used to develop models predicting 24-h personal PM(2.5) exposures. Both studies were conducted concurrently in the summer of 1998 and the winter of 1999 in Baltimore, MD. In the Scripted Activity Study, 1-h personal PM(2.5) exposures were measured. Data were used to identify significant factors affecting personal exposures and to develop 1-h personal exposure models for five different microenvironments. By incorporating the time-activity diary data, these models were then combined to develop a time-weighted microenvironmental personal model (model M1AD) to predict the 24-h PM(2.5) exposures measured for individuals in the Older Adults Study. Twenty-four-hour time-weighted models were also developed using 1-h ambient PM(2.5) levels and time-activity data (model A1AD) or using 24-h ambient PM(2.5) levels and time-activity data (model A24AD). The performance of these three models was compared to that using 24-h ambient concentrations alone (model A24). Results showed that factors affecting 1-h personal PM(2.5) exposures included air conditioning status and the presence of environmental tobacco smoke (ETS) for indoor microenvironments, consistent with previous studies. ETS was identified as a significant contributor to measured 24-h personal PM(2.5) exposures: staying in an ETS-exposed microenvironment for 1 h elevated 24-h personal PM(2.5) exposures by approximately 4 µg/m³ on average. Cooking and washing activities were identified in the winter as significant contributors to 24-h personal exposures as well, increasing 24-h personal PM(2.5) exposures by about 4 and 5 µg/m³ per hour of activity, respectively. The ability of the three microenvironmental personal exposure models to estimate 24-h personal PM(2.5) exposures was generally comparable to and consistently greater than that of model A24.
Results indicated that using time-activity data with 1
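
The time-weighted microenvironmental idea behind model M1AD can be sketched as a weighted average of microenvironment concentrations plus fixed activity increments. All concentrations and increment values below are illustrative placeholders, not the fitted coefficients from the study.

```python
def time_weighted_exposure(diary, increments=()):
    """Estimate a 24-h personal PM2.5 exposure (µg/m³).

    diary: list of (hours, concentration_ugm3) pairs, one per
           microenvironment, covering exactly 24 h.
    increments: flat additions to the 24-h average, e.g. per hour
                of ETS exposure or cooking.
    """
    hours = sum(h for h, _ in diary)
    if hours != 24:
        raise ValueError("diary must cover 24 h")
    base = sum(h * c for h, c in diary) / 24.0
    return base + sum(increments)

diary = [(14, 12.0),  # home indoors
         (8, 10.0),   # office
         (2, 25.0)]   # outdoors / transit
# One hour in an ETS-exposed microenvironment modelled as a +4 µg/m³ increment.
print(time_weighted_exposure(diary, increments=(4.0,)))  # → about 16.4 µg/m³
```

This is the structural reason such models can outperform model A24: the ambient concentration alone cannot see the diary terms or the activity increments.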

  16. Chinese Pediatrician Attitudes and Practices Regarding Child Exposure to Secondhand Smoke (SHS) and Clinical Efforts against SHS Exposure

    Directory of Open Access Journals (Sweden)

    Kaiyong Huang

    2015-05-01

    Background: Secondhand Smoke (SHS) exposure is a leading cause of childhood illness and premature death. Pediatricians play an important role in helping parents to quit smoking and reducing children’s SHS exposure. This study examined Chinese pediatricians’ attitudes and practices regarding children’s exposure to SHS and clinical efforts against SHS exposure. Methods: A cross-sectional survey of pediatricians was conducted in thirteen conveniently selected hospitals in southern China, during September to December 2013. Five hundred and four pediatricians completed self-administered questionnaires with a response rate of 92%. χ2 tests were used to compare categorical variables differences between smokers and non-smokers and other categorical variables. Results: Pediatricians thought that the key barriers to encouraging parents to quit smoking were: lack of professional training (94%), lack of time (84%), and resistance to discussions about smoking (77%). 94% of the pediatricians agreed that smoking in enclosed public places should be prohibited and more than 70% agreed that smoking should not be allowed in any indoor places and in cars. Most of the pediatricians thought that their current knowledge on helping people to quit smoking and SHS exposure reduction counseling was insufficient. Conclusions: Many Chinese pediatricians did not have adequate knowledge about smoking and SHS, and many lacked confidence about giving cessation or SHS exposure reduction counseling to smoking parents. Lack of professional training and time were the most important barriers to help parents quit smoking among the Chinese pediatricians. Intensified efforts are called for to provide the necessary professional training and increase pediatricians’ participation in the training.

  17. Chinese Pediatrician Attitudes and Practices Regarding Child Exposure to Secondhand Smoke (SHS) and Clinical Efforts against SHS Exposure

    Science.gov (United States)

    Huang, Kaiyong; Abdullah, Abu S.; Huo, Haiying; Liao, Jing; Yang, Li; Zhang, Zhiyong; Chen, Hailian; Nong, Guangmin; Winickoff, Jonathan P.

    2015-01-01

    Background: Secondhand Smoke (SHS) exposure is a leading cause of childhood illness and premature death. Pediatricians play an important role in helping parents to quit smoking and reducing children’s SHS exposure. This study examined Chinese pediatricians’ attitudes and practices regarding children’s exposure to SHS and clinical efforts against SHS exposure. Methods: A cross-sectional survey of pediatricians was conducted in thirteen conveniently selected hospitals in southern China, during September to December 2013. Five hundred and four pediatricians completed self-administered questionnaires with a response rate of 92%. χ2 tests were used to compare categorical variables differences between smokers and non-smokers and other categorical variables. Results: Pediatricians thought that the key barriers to encouraging parents to quit smoking were: lack of professional training (94%), lack of time (84%), resistance to discussions about smoking (77%). 94% of the pediatricians agreed that smoking in enclosed public places should be prohibited and more than 70% agreed that smoking should not be allowed in any indoor places and in cars. Most of the pediatricians thought that their current knowledge on helping people to quit smoking and SHS exposure reduction counseling was insufficient. Conclusions: Many Chinese pediatricians did not have adequate knowledge about smoking and SHS, and many lacked confidence about giving cessation or SHS exposure reduction counseling to smoking parents. Lack of professional training and time were the most important barriers to help parents quit smoking among the Chinese pediatricians. Intensified efforts are called for to provide the necessary professional training and increase pediatricians’ participation in the training. PMID:26006117

  18. Food allergy: practical approach on education and accidental exposure prevention.

    Science.gov (United States)

    Pádua, I; Moreira, A; Moreira, P; Barros, R

    2016-09-01

    Food allergies are a growing problem and currently the primary treatment of food allergy is avoidance of culprit foods. However, given the lack of information and education and also the ubiquitous nature of allergens, accidental exposures to food allergens are not uncommon. The fear of potential fatal reactions and the need of a proper avoidance leads in most of the cases to the limitation of leisure and social activities. This review aims to be a practical approach on education and accidental exposure prevention regarding activities like shopping, eating out, and travelling. The recommendations are focused especially on proper reading of food labels and the management of the disease, namely in restaurants and airplanes, concerning cross-contact and communication with other stakeholders. The implementation of effective tools is essential to manage food allergy outside home, avoid serious allergic reactions and minimize the disease's impact on individuals' quality of life.

  19. Exposure to cosmic radiation: a developing major problem in radiation protection

    International Nuclear Information System (INIS)

    Lowder, W.M.; Hajnal, F.

    1992-01-01

    Cosmic radiation at ground altitudes is usually a relatively minor contributor to human radiation exposure, producing a global collective dose equivalent that is about 10 percent of the total from all natural sources. However, more than a million people living at high altitudes receive annual dose equivalents in excess of 5 mSv. In recent years, there has been increasing concern about the exposure of aircraft flight crews and passengers, for whom annual dose equivalents of up to several mSv have been estimated. Recent EML results indicate the presence of an important high-energy neutron component at jet aircraft altitudes, perhaps producing dose equivalents of the order of 0.1 mSv/h at high latitudes. Finally, space agencies have long been concerned with the potential exposures of astronauts, especially from the rare massive solar flare events. As more people venture into space, this source of human radiation exposure will become increasingly important. Available data on these aspects of cosmic radiation exposure will be reviewed, along with current and anticipated future research activities that may yield an improved assessment of the problem. The question of how such exposures might be controlled will be addressed, but not answered. (author)

  20. Are exposure index values consistent in clinical practice? A multi-manufacturer investigation

    International Nuclear Information System (INIS)

    Butler, M. L.; Rainford, L.; Last, J.; Brennan, P. C.

    2010-01-01

    The advent of digital radiography poses the risk of unnoticed increases in patient dose. Manufacturers have responded to this by offering an exposure index (EI) value to the clinician. Whilst the EI value is a measure of the air kerma at the detector surface, it has been recommended by international agencies as a method of monitoring radiation dose to the patient. Recent studies by the group have shown that EI values are being used in clinical practice to monitor radiation dose and assess image quality. This study aims to compare the clinical consistency of the EI value in computed radiography (CR) and direct digital radiography (DR) systems. An anthropomorphic phantom was used to simulate four common radiographic examinations: skull, pelvis, chest and hand. These examinations were chosen as they provide contrasting exposure parameters, image detail and radiation dose measurements. Systems from four manufacturers were used for comparison: Agfa-Gevaert CR, Carestream CR, Philips Digital Diagnost DR and Siemens DR. For each examination, the phantom was placed in the optimal position and exposure parameters were chosen in accordance with European guidelines and clinical practice. Multiple exposures were taken and the EI recorded. All exposure parameters and clinical conditions remained constant throughout. For both DR systems, the EI values remained consistent throughout. No significant change was noted in any examination. In both CR systems, there were noteworthy fluctuations in the EI values for all examinations. The largest for the Agfa system was a variation of 1.88-2.21 for the skull examination. This represents to the clinician a doubling of detector dose, despite all exposure parameters remaining constant. In the Kodak system, the largest fluctuation was seen for the chest examination, where the EI ranged from 2560 to 2660, representing an increase of approximately 30% in radiation dose, despite consistent parameters.
The fluctuations seen with the CR systems are most likely
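
The dose interpretations quoted above follow from the logarithmic EI scales those systems use: Agfa's lgM is a base-10 logarithm of dose (so +0.3 is roughly a doubling), and the Kodak/Carestream EI adds 1000 per decade of exposure (so +300 is roughly a doubling). A minimal sketch, assuming those standard scalings:

```python
def agfa_dose_ratio(lgm_low, lgm_high):
    """Relative detector dose implied by a change in Agfa lgM."""
    return 10 ** (lgm_high - lgm_low)

def kodak_dose_ratio(ei_low, ei_high):
    """Relative detector dose implied by a change in Kodak EI."""
    return 10 ** ((ei_high - ei_low) / 1000.0)

print(agfa_dose_ratio(1.88, 2.21))   # ~2.1x: the "doubling" in the skull data
print(kodak_dose_ratio(2560, 2660))  # ~1.26x: the ~30% chest-dose increase
```

Running the study's own numbers through these scalings reproduces its conclusions, which is a useful sanity check when comparing EI drift across manufacturers.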

  1. Correlation between practice location as a surrogate for UV exposure and practice patterns to prevent corneal haze after photorefractive keratectomy (PRK).

    Science.gov (United States)

    Al-Sharif, Eman M; Stone, Donald U

    2016-01-01

    PRK is a refractive surgery that reshapes the corneal surface by excimer laser photoablation to correct refractive errors. The effect of increased ultraviolet (UV) exposure on promoting post-PRK corneal haze has been reported in the literature; however, information is lacking regarding the effect of ambient UV exposure on physician practice patterns. The aim of this study was to evaluate the effect of ophthalmologists' practice location on their reported practice patterns to prevent post-PRK corneal haze. A cross-sectional observational study was conducted through an online survey sent to ophthalmologists performing PRK. The survey recorded the primary city of practice, from which the two independent variables, latitude and average annual sunshine days, were determined. It also measured the frequency of use of postoperative preventive interventions (dependent variables), which are as follows: intraoperative Mitomycin-C, oral vitamin C, sunglasses, topical corticosteroids, topical cyclosporine, oral tetracyclines and amniotic membrane graft. Fifty-one ophthalmologists completed the survey. The practice locations' mean latitude was 36.4 degrees north, and sunshine days accounted for 60% of the days of the year on average. There was no significant relation between latitude/average annual sunshine days and usual post-PRK prophylactic treatments (P > 0.05). The commonest protective maneuvers were sunglasses (78%), prolonged topical corticosteroids (57%), Mitomycin-C (39%) and oral vitamin C (37%). We found no significant difference in ophthalmologists' practice patterns to prevent post-PRK corneal haze in relation to practice location latitude and average sunshine days. Moreover, the results demonstrated that the most widely used postoperative measures to prevent post-PRK haze are sunglasses, Mitomycin-C, topical corticosteroids, and oral Vitamin C.

  2. An Assessment on Cu-Equivalent Image of Digital Intraoral Radiography

    International Nuclear Information System (INIS)

    Kim, Jae Duk

    1999-01-01

    Geometrically standardized dental radiographs were taken. We prepared a digital Cu-equivalent image analyzing system for quantitative assessment of mandibular bone. Images of radiographs were digitized by means of a Quick scanner and a Macintosh personal computer, with NIH Image used as the analysis software. A step wedge composed of 10 steps of 0.1 mm copper foil in thickness was used as the reference material. This study evaluated the effects of the number of copper-wedge steps adopted for calculating the equation, kVp and exposure time on the coefficient of determination (r^2) of the equation for conversion to a Cu-equivalent image, and on the coefficient of variation and Cu-equivalent value (mm) measured at each copper step and at the alveolar bone of the mandible. The results were as follows: 1. The coefficients of determination (r^2) of 10 conversion equations ranged from 0.9996 to 0.9973 (mean = 0.9988) under 70 kVp and 0.16 s exposure. The equation with the highest r^2 was Y = 4.75614612 - 0.06300524x + 0.00032367x^2 - 0.00000060x^3. 2. The value of r^2 became lower when the equation was calculated from the copper step wedge including the 1.0 mm step; when the 0 mm step was included, r^2 showed variability. 3. The coefficient of variation was 0.11 and 0.20 at the 0.2 and 0.1 mm copper steps, respectively; those of the other steps up to 0.9 mm ranged from 0.06 to 0.09 in mean value. 4. The mean Cu-equivalent value of alveolar bone was 0.14 ± 0.02 mm under optimal exposure. The values were lower than the mean under exposures over 0.20 s at 60 kVp and over 0.16 s at 70 kVp. 5. Under the exposure condition of 60 kVp and 0.16 s, the coefficient of variation was 0.03 and 0.05 at the 0.3 and 0.2 mm copper steps, respectively. The value of r^2 was over 0.9991 for both 9 and 10 copper steps. The Cu-equivalent value and coefficient of variation at alveolar bone were 0.14 ± 0.01 mm and 0.07, respectively. In summary, a clinical application of this system
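
The reported conversion equation is a cubic polynomial, which can be evaluated stably with Horner's rule. Interpreting x as the measured image value and Y as the copper-equivalent thickness (mm) is an assumption here; the abstract does not spell out the variables or units.

```python
# Coefficients a0..a3 of the best-fitting equation quoted in the record.
COEFFS = (4.75614612, -0.06300524, 0.00032367, -0.00000060)

def cu_equivalent(x):
    """Evaluate Y = a0 + a1*x + a2*x^2 + a3*x^3 via Horner's rule."""
    a0, a1, a2, a3 = COEFFS
    return ((a3 * x + a2) * x + a1) * x + a0

print(cu_equivalent(0))    # the constant term, 4.75614612
print(cu_equivalent(100))  # a mid-range input, for illustration
```

Horner's rule uses three multiplications and three additions per evaluation, which matters little here but is the idiomatic way to evaluate fitted polynomials.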

  3. Chemical composition dependence of exposure buildup factors for some polymers

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Tejbir [Department of Physics, S.D.D.I.E.T., Barwala, District Panchkula, Haryana 134 118 (India)], E-mail: tejbir.s@rediffmail.com; Kumar, Naresh [Department of Physics, Lovely Professional University, Phagwara 144 402 (India)], E-mail: naresh20dhiman@yahoo.com; Singh, Parjit S. [Department of Physics, Punjabi University, Patiala 147 002 (India)], E-mail: dr_parjit@hotmail.com

    2009-01-15

    Exposure buildup factors for some polymers, such as poly-acrylo-nitrile (PAN), poly-methyl-acrylate (PMA), poly-vinyl-chloride (PVC), synthetic rubber (SR) and tetra-fluoro-ethylene (Teflon), have been computed using the G.P. fitting method in the energy range of 0.015-15.0 MeV, up to a penetration depth of 40 mean free paths (mfp). The variation of exposure buildup factors for all the selected polymers with incident photon energy at fixed penetration depths has been studied, with particular emphasis on the chemical composition (equivalent atomic number) of the selected polymers. It has been observed that for the lower penetration depths (below 10 mfp), the exposure buildup factor decreases with increasing equivalent atomic number of the selected polymers at all incident photon energies. However, at a penetration depth of 10 mfp and incident photon energies above 3 MeV, the exposure buildup factor becomes almost independent of the equivalent atomic number of the selected polymers. Further, beyond a fixed penetration depth of 15 mfp and above an incident photon energy of 3 MeV, a reversal in the trend has been observed, i.e., the exposure buildup factor increases with increasing equivalent atomic number.
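
The G.P. (geometric-progression) fitting form referenced here is, in its standard ANSI/ANS-6.4.3 shape, B(x) = 1 + (b−1)(K^x − 1)/(K − 1) for K ≠ 1 and B(x) = 1 + (b−1)x for K = 1, with K itself depending on depth x through five fitting parameters. The parameter values below are placeholders, not coefficients for any of the listed polymers.

```python
import math

def gp_buildup(x, b, c, a, Xk, d):
    """Buildup factor at penetration depth x (mean free paths),
    from G-P fitting parameters (b, c, a, Xk, d)."""
    K = c * x ** a + d * (math.tanh(x / Xk - 2) - math.tanh(-2)) / (1 - math.tanh(-2))
    if abs(K - 1) < 1e-9:
        return 1 + (b - 1) * x          # limiting linear form when K = 1
    return 1 + (b - 1) * (K ** x - 1) / (K - 1)

# Placeholder parameters for illustration only.
print(gp_buildup(10, b=2.0, c=1.3, a=-0.05, Xk=15.0, d=0.01))
```

In practice the five parameters are tabulated per material and photon energy; interpolating them for a compound's equivalent atomic number is what lets the method cover mixtures such as these polymers.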

  4. Chemical composition dependence of exposure buildup factors for some polymers

    International Nuclear Information System (INIS)

    Singh, Tejbir; Kumar, Naresh; Singh, Parjit S.

    2009-01-01

    Exposure buildup factors for some polymers, such as poly-acrylo-nitrile (PAN), poly-methyl-acrylate (PMA), poly-vinyl-chloride (PVC), synthetic rubber (SR) and tetra-fluoro-ethylene (Teflon), have been computed using the G.P. fitting method in the energy range of 0.015-15.0 MeV, up to a penetration depth of 40 mean free paths (mfp). The variation of exposure buildup factors for all the selected polymers with incident photon energy at fixed penetration depths has been studied, with particular emphasis on the chemical composition (equivalent atomic number) of the selected polymers. It has been observed that for the lower penetration depths (below 10 mfp), the exposure buildup factor decreases with increasing equivalent atomic number of the selected polymers at all incident photon energies. However, at a penetration depth of 10 mfp and incident photon energies above 3 MeV, the exposure buildup factor becomes almost independent of the equivalent atomic number of the selected polymers. Further, beyond a fixed penetration depth of 15 mfp and above an incident photon energy of 3 MeV, a reversal in the trend has been observed, i.e., the exposure buildup factor increases with increasing equivalent atomic number

  5. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  6. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  7. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.
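
The distribution-divergence term DTML minimizes at its top layer can be illustrated in miniature with the maximum mean discrepancy (MMD) under a linear kernel, i.e. the squared distance between the source and target mean embeddings. This is a toy sketch of that one term, not the paper's full objective; the embeddings are random stand-ins.

```python
import random

def mmd_linear(source, target):
    """Squared distance between mean embeddings of two domains
    (MMD with a linear kernel)."""
    dim = len(source[0])
    mean_s = [sum(v[i] for v in source) / len(source) for i in range(dim)]
    mean_t = [sum(v[i] for v in target) / len(target) for i in range(dim)]
    return sum((a - b) ** 2 for a, b in zip(mean_s, mean_t))

random.seed(0)
source = [[random.gauss(0.0, 1.0) for _ in range(4)] for _ in range(50)]
target = [[random.gauss(0.5, 1.0) for _ in range(4)] for _ in range(50)]
print(mmd_linear(source, target))  # > 0: the domains differ
print(mmd_linear(source, source))  # 0.0: identical domains
```

Driving this quantity toward zero while separating classes is the intuition behind transferring discriminative knowledge from a labeled source domain to an unlabeled target domain.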

  8. Energy functionals for Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Headrick, M; Nassar, A

    2013-01-01

    We identify a set of "energy" functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the "algebraic" metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem

  9. Low-level human equivalent gestational lead exposure produces sex-specific motor and coordination abnormalities and late-onset obesity in year-old mice.

    Science.gov (United States)

    Leasure, J Leigh; Giddabasappa, Anand; Chaney, Shawntay; Johnson, Jerry E; Pothakos, Konstantinos; Lau, Yuen Sum; Fox, Donald A

    2008-03-01

    Low-level developmental lead exposure is linked to cognitive and neurological disorders in children. However, the long-term effects of gestational lead exposure (GLE) have received little attention. Our goals were to establish a murine model of human-equivalent GLE and to determine dose-response effects on body weight, motor functions, and dopamine neurochemistry in year-old offspring. We exposed female C57BL/6 mice to water containing 0, 27 (low), 55 (moderate), or 109 ppm (high) of lead from 2 weeks prior to mating, throughout gestation, and until postnatal day 10 (PN10). Maternal and litter measures, blood lead concentrations ([BPb]), and body weights were obtained throughout the experiment. Locomotor behavior in the absence and presence of amphetamine, running wheel activity, rotarod performance, and dopamine utilization were examined in year-old mice. Male GLE mice developed late-onset obesity. Similarly, we observed male-specific decreased spontaneous motor activity, increased amphetamine-induced motor activity, and decreased rotarod performance in year-old GLE mice. Levels of dopamine and its major metabolite were altered in year-old male mice, although only forebrain utilization increased. GLE-induced alterations were consistently larger in low-dose GLE mice. Our novel results show that GLE produced permanent male-specific deficits. The nonmonotonic dose-dependent responses showed that low-level GLE produced the most adverse effects. These data reinforce the idea that lifetime measures of dose-response toxicant exposure should be a component of the neurotoxic risk assessment process.

  10. Current practices for maintaining occupational exposures ALARA at low-level waste disposal sites

    International Nuclear Information System (INIS)

    Hadlock, D.E.; Herrington, W.N.; Hooker, C.D.; Murphy, D.W.; Gilchrist, R.L.

    1983-12-01

    The United States Nuclear Regulatory Commission contracted with Pacific Northwest Laboratory (PNL) to provide technical assistance in establishing operational guidelines, with respect to radiation control programs and methods of minimizing occupational radiation exposure, at Low-Level Waste (LLW) disposal sites. The PNL, through site visits, evaluated operations at LLW disposal sites to determine the adequacy of current practices in maintaining occupational exposures as low as is reasonably achievable (ALARA). The data sought included the specifics of: ALARA programs, training programs, external exposure control, internal exposure control, respiratory protection, surveillance, radioactive waste management, facilities and equipment, and external dose analysis. The results of the study indicated the following: The Radiation Protection and ALARA programs at the three commercial LLW disposal sites were observed to be adequate in scope and content compared to similar programs at other types of nuclear facilities. However, it should be noted that there were many areas that could be improved upon to help ensure the health and safety of occupationally exposed individuals

  11. Current practices for maintaining occupational exposures ALARA at low-level waste disposal sites

    Energy Technology Data Exchange (ETDEWEB)

    Hadlock, D.E.; Herrington, W.N.; Hooker, C.D.; Murphy, D.W.; Gilchrist, R.L.

    1983-12-01

    The United States Nuclear Regulatory Commission contracted with Pacific Northwest Laboratory (PNL) to provide technical assistance in establishing operational guidelines, with respect to radiation control programs and methods of minimizing occupational radiation exposure, at Low-Level Waste (LLW) disposal sites. The PNL, through site visits, evaluated operations at LLW disposal sites to determine the adequacy of current practices in maintaining occupational exposures as low as is reasonably achievable (ALARA). The data sought included the specifics of: ALARA programs, training programs, external exposure control, internal exposure control, respiratory protection, surveillance, radioactive waste management, facilities and equipment, and external dose analysis. The results of the study indicated the following: The Radiation Protection and ALARA programs at the three commercial LLW disposal sites were observed to be adequate in scope and content compared to similar programs at other types of nuclear facilities. However, it should be noted that there were many areas that could be improved upon to help ensure the health and safety of occupationally exposed individuals.

  12. Application of maximum radiation exposure values and monitoring of radiation exposure

    International Nuclear Information System (INIS)

    1993-01-01

    According to Section 32 of the Radiation Act (592/91), the Finnish Centre for Radiation and Nuclear Safety gives instructions concerning the monitoring of radiation exposure and the application of dose limits in Finland. The guide presents the principles to be applied in calculating equivalent and effective doses, together with detailed instructions on the application of the maximum exposure values for radiation work and for natural radiation, and on the monitoring of exposures. Quantities and units for assessing radiation exposure are presented in the appendix of the guide

  13. US exposure to multiple landscape stressors and climate change

    Science.gov (United States)

    Becky K. Kerns; John B. Kim; Jeffrey D. Kline; Michelle A. Day

    2016-01-01

    We examined landscape exposure to wildfire potential, insects and disease risk, and urban and exurban development for the conterminous US (CONUS). Our analysis relied on spatial data used by federal agencies to evaluate these stressors nationally. We combined stressor data with a climate change exposure metric to identify when temperature is likely to depart from...

  14. Equivalent electricity storage capacity of domestic thermostatically controlled loads

    International Nuclear Information System (INIS)

    Sossan, Fabrizio

    2017-01-01

    A method is developed to quantify the equivalent storage capacity inherent in the operation of thermostatically controlled loads (TCLs). Equivalent storage capacity is defined as the amount of power and electricity consumption that can be deferred or anticipated in time with respect to the baseline consumption (i.e., when no demand-side event occurs) without violating temperature limits. The analysis is carried out for four common domestic TCLs: an electric space heating system, a freezer, a fridge, and an electric water heater. They are simulated by applying grey-box thermal models identified from measurements, which describe the heat transfer of the considered TCLs as a function of electric power consumption and environmental conditions. To represent typical TCL operating conditions, Monte Carlo simulations are performed in which model inputs and parameters are sampled from relevant statistical distributions. The analysis provides a way to compare flexible demand against competing storage technologies and is intended as a tool for system planners to assess the potential of TCLs to support electrical grid operation. In the paper, the storage capacity per unit of capital investment cost of the selected TCLs is compared with that of two grid-connected battery storage systems (a 720 kVA/500 kWh lithium-ion unit and a 15 kVA/120 kWh vanadium redox flow unit). - Highlights: • The equivalent storage capacity of domestic TCLs is quantified • A comparison with battery-based storage technologies is performed • We derive metrics for system planners to plan storage in power system networks • Rule-of-thumb cost indicators for flexible demand and battery-based storage
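    The grey-box idea behind this kind of analysis can be illustrated with a first-order (1R1C) thermal model of a fridge. All parameters below are invented for illustration and do not come from the paper; the sketch estimates how long consumption can be deferred (compressor held off) before the upper temperature limit is violated, which is the core of the equivalent-storage definition above.

    ```python
    # Hypothetical 1R1C fridge model; every constant here is illustrative.
    R = 100.0   # thermal resistance, K/kW
    C = 0.1     # thermal capacitance, kWh/K
    ETA = 3.0   # coefficient of performance (thermal kW removed per electrical kW)
    P_ON = 0.1  # electrical power when the compressor runs, kW
    T_AMB = 20.0              # ambient temperature, degC
    T_MIN, T_MAX = 2.0, 5.0   # thermostat deadband, degC
    DT = 1.0 / 60.0           # simulation time step, hours

    def step(T, on):
        """One Euler step of dT/dt = (T_amb - T)/(R*C) - on * ETA * P_ON / C."""
        dTdt = (T_AMB - T) / (R * C) - (ETA * P_ON / C if on else 0.0)
        return T + DT * dTdt

    def deferrable_hours(T0=T_MIN):
        """Hours the compressor can stay off (starting cold) before T_MAX is hit."""
        T, hours = T0, 0.0
        while T < T_MAX:
            T = step(T, on=False)
            hours += DT
        return hours
    ```

    Multiplying the deferrable duration by the baseline duty cycle and rated power gives a rough deferred-energy figure, i.e. the "equivalent storage" comparable with a battery's kWh rating.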

  15. Application of A150-plastic equivalent gases in microdosimetric measurements

    International Nuclear Information System (INIS)

    DeLuca, P.M. Jr.; Higgins, P.D.; Pearson, D.W.; Schell, M.; Attix, F.H.

    1981-01-01

    Neutron dosimetry measurements with ionization chambers, for the most part, employ tissue-equivalent plastic-walled cavities (Shonka A150) filled with either air or a methane-based tissue-like gas. The atomic compositions of TE-gas and A150 plastic are not matched and are quite dissimilar from muscle. Awschalom and Attix (1980) partially resolved the problem by formulating a novel A150-plastic equivalent gas. This establishes a homogeneous wall-gas cavity dosimeter for neutron measurements and confines the necessary corrections to the application of kerma ratios. In this report, we present measurements with two A150-plastic equivalent gases in a low-pressure spherical proportional counter. Gas gains and alpha-particle resolutions were determined. For these A150 mixtures, as well as a methane-based TE-gas and an Ar-CO2 mixture, we report measurements of event-size distributions from exposure to a beam of 14.8 MeV neutrons

  16. Putting health metrics into practice: using the disability-adjusted life year for strategic decision making.

    Science.gov (United States)

    Longfield, Kim; Smith, Brian; Gray, Rob; Ngamkitpaiboon, Lek; Vielot, Nadja

    2013-01-01

    Implementing organizations are under pressure to be accountable for performance, but many health impact metrics have limitations for priority setting: they do not permit comparisons across different interventions or health areas. In response, Population Services International (PSI) adopted the disability-adjusted life year (DALY) averted as its bottom-line performance metric. While international standards exist for calculating DALYs to determine burden of disease (BOD), PSI's use of DALYs averted is novel: it uses DALYs averted to assess and compare the health impact of its country programs and to understand the effectiveness of a portfolio of interventions. This paper describes how the adoption of DALYs averted influenced organizational strategy and presents the advantages and constraints of using the metric. Health impact data from 2001 to 2011 were analyzed by program area and geographic region to measure PSI's performance against its goal of doubling health impact between 2007 and 2011. Analyzing 10 years of data permitted comparison with previous years' performance. A case study of PSI's Asia and Eastern European (A/EE) region, and of PSI/Laos, illustrates how the adoption of DALYs averted affected strategic decision making. Between 2007 and 2011, PSI's programs averted twice as many DALYs as in 2002-2006. Most DALYs averted were within malaria, followed by HIV/AIDS and family planning (FP). The performance of PSI's A/EE region relative to other regions declined with the switch to DALYs averted. As a result, the region made a strategic shift to align its work with countries' BOD. In PSI/Laos, this redirection led to better-targeted programs and an approximately 50% gain in DALYs averted from 2009 to 2011. PSI's adoption of DALYs averted shifted the organization's strategic direction away from product sales and toward BOD. Now, many strategic decisions are based on "BOD relevance," the share of the BOD that interventions can potentially address.
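    For reference, a DALY combines years of life lost (YLL) and years lived with disability (YLD). A minimal sketch of the standard undiscounted formulation follows; the function names and the numbers in the usage note are our own illustration, not PSI's accounting.

    ```python
    def yll(deaths, life_expectancy_at_death):
        """Years of life lost: deaths times standard life expectancy at age of death."""
        return deaths * life_expectancy_at_death

    def yld(incident_cases, disability_weight, duration_years):
        """Years lived with disability (incidence-based, no discounting or age weights)."""
        return incident_cases * disability_weight * duration_years

    def dalys(deaths, le, cases, dw, dur):
        """Total DALYs = YLL + YLD."""
        return yll(deaths, le) + yld(cases, dw, dur)
    ```

    For example, 10 deaths at 30 years of remaining life expectancy plus 100 cases with disability weight 0.2 lasting 5 years yields 300 + 100 = 400 DALYs; "DALYs averted" compares this quantity with and without an intervention.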

  17. Personal power-frequency magnetic field exposure in women recruited at an infertility clinic: association with physical activity and temporal variability.

    Science.gov (United States)

    Lewis, Ryan C; Hauser, Russ; Wang, Lu; Kavet, Robert; Meeker, John D

    2016-03-01

    Current epidemiologic approaches to studying exposure to power-frequency magnetic fields and the risk of miscarriage are potentially biased by lack of attention to the relationship of exposure with physical activity and to within-individual variability in exposure over time. This analysis examines these two issues using data from a longitudinal pilot study of 40 women recruited from an infertility clinic, each of whom contributed data for up to three 24-h periods separated by a median of 3.6 weeks. Physical activity was positively associated with peak exposure metrics. Higher physical activity within environments did not necessarily lead to higher peak exposures, suggesting that movement between, rather than within, environments increases one's probability of encountering a high-field source. Peak metrics were more variable over time than central-tendency metrics. Future epidemiology studies of peak exposure metrics should adjust for physical activity and collect more than 1 d of exposure measurement to reduce bias.
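    The contrast between peak and central-tendency metrics discussed above can be illustrated on a sampled recording. The function below is a sketch; the metric names and the 2 mG threshold are illustrative choices, not the study's definitions.

    ```python
    import numpy as np

    def exposure_metrics(series_mG):
        """Summary metrics for a personal magnetic-field time series (milligauss)."""
        x = np.asarray(series_mG, dtype=float)
        return {
            "TWA": x.mean(),              # time-weighted average (central tendency)
            "P95": np.percentile(x, 95),  # upper-percentile metric
            "peak": x.max(),              # peak metric
            "frac_above_2mG": (x > 2.0).mean(),  # fraction of time above a cutoff
        }
    ```

    On repeated 24-h recordings from the same person, the peak and upper-percentile entries typically fluctuate far more between days than the TWA, which is the within-individual variability issue the abstract raises.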

  18. Splenectomy in a rural surgical practice | Alufohai | Nigerian Journal ...

    African Journals Online (AJOL)

    No abstract. Nigerian Journal of Clinical Practice Vol. 9 (1) 2006: pp. 81-83.

  19. Monte-Carlo and multi-exposure assessment for the derivation of criteria for disinfection byproducts and volatile organic compounds in drinking water: Allocation factors and liter-equivalents per day.

    Science.gov (United States)

    Akiyama, Megumi; Matsui, Yoshihiko; Kido, Junki; Matsushita, Taku; Shirasaki, Nobutaka

    2018-06-01

    The probability distributions of total potential doses of disinfection byproducts and volatile organic compounds via ingestion, inhalation, and dermal exposure were estimated with Monte Carlo simulations, after conducting physiologically based pharmacokinetic model simulations to take into account the differences in availability among the three exposure routes. If the criterion that the 95th percentile estimate equals the TDI (tolerable daily intake) is regarded as protecting the majority of a population, the drinking water criteria would be 140 (trichloromethane), 66 (bromodichloromethane), 157 (dibromochloromethane), 203 (tribromomethane), 140 (dichloroacetic acid), 78 (trichloroacetic acid), 6.55 (trichloroethylene, TCE), and 22 μg/L (perchloroethylene). The TCE criterion was lower than the Japanese Drinking Water Quality Standard (10 μg/L); the latter would allow the intake of 20% of the population to exceed the TDI. Indirect inhalation via evaporation from water, especially in bathrooms, was the major route of exposure to compounds other than haloacetic acids (HAAs) and accounted for 1.2-9 liter-equivalents/day for the median-exposure subpopulation. The ingestion of food was a major indirect route of exposure to HAAs. Contributions of direct water intake were not very different for trihalomethanes (30-45% of TDIs) and HAAs (45-52% of TDIs).
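    A stripped-down version of the Monte Carlo step described above might look as follows. All distributions and parameters here are invented placeholders, not the paper's inputs, and the pharmacokinetic availability stage is omitted; the sketch only shows how a criterion follows from setting the 95th-percentile dose equal to the TDI.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Illustrative lognormal liter-equivalents/day for each route (placeholders).
    ingestion  = rng.lognormal(mean=np.log(1.5), sigma=0.4, size=N)
    inhalation = rng.lognormal(mean=np.log(3.0), sigma=0.6, size=N)
    dermal     = rng.lognormal(mean=np.log(0.3), sigma=0.5, size=N)
    liter_eq = ingestion + inhalation + dermal          # total L-eq/day

    body_weight = rng.normal(60.0, 10.0, size=N).clip(30, None)  # kg, illustrative

    def criterion_ug_per_L(tdi_ug_per_kg_day):
        """Water concentration at which the 95th-percentile dose equals the TDI."""
        dose_per_conc = liter_eq / body_weight  # (ug/kg/day) per (ug/L in water)
        return tdi_ug_per_kg_day / np.percentile(dose_per_conc, 95)
    ```

    Because the dose is linear in the water concentration, the derived criterion scales linearly with the TDI; the route-specific liter-equivalents play the role of the 1.2-9 L-eq/day figures reported in the abstract.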

  20. Estimating raw material equivalents on a macro-level: comparison of multi-regional input-output analysis and hybrid LCI-IO.

    Science.gov (United States)

    Schoer, Karl; Wood, Richard; Arto, Iñaki; Weinzettel, Jan

    2013-12-17

    The mass of material consumed by a population has become a useful proxy for measuring environmental pressure. The "raw material equivalents" (RME) metric of material consumption addresses the issue of including the full supply chain (including imports) when calculating national or product-level material impacts. The RME calculation suffers from data availability, however, as quantitative data on production practices along the full supply chain (in different regions) are required. Hence, the RME is currently estimated by three main approaches: (1) assuming domestic technology in foreign economies, (2) utilizing region-specific life-cycle inventories (in a hybrid framework), and (3) utilizing multi-regional input-output (MRIO) analysis to explicitly cover all regions of the supply chain. The first approach has been shown to give inaccurate results, so this paper focuses on the benefits and costs of the latter two. We analyze results from two key (MRIO and hybrid) projects modeling raw material equivalents, adjusting the models in a stepwise manner in order to quantify the effects of individual conceptual elements. We attempt to isolate the "MRIO gap", which denotes the quantitative impact of calculating the RME of imports by an MRIO approach instead of the hybrid model, focusing on the RME of EU external-trade imports. While the models give quantitatively similar results, differences become more pronounced when tracking more detailed material flows. We assess the advantages and disadvantages of the two approaches and look at ways to further harmonize data and approaches.
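    The RME logic rests on standard Leontief input-output algebra. A toy sketch follows; the 4-sector table, final demand, and extraction intensities are invented, not taken from either project, but the structure shows how extraction embodied in final demand is computed and why it conserves total extraction.

    ```python
    import numpy as np

    # Toy 4-sector MRIO table (e.g. 2 regions x 2 sectors); all numbers illustrative.
    A = np.array([[0.10, 0.20, 0.05, 0.00],
                  [0.10, 0.10, 0.10, 0.05],
                  [0.00, 0.05, 0.10, 0.20],
                  [0.05, 0.00, 0.10, 0.10]])   # technical coefficients
    y = np.array([100.0, 50.0, 80.0, 40.0])    # final demand by sector
    m = np.array([0.5, 0.1, 0.3, 0.2])         # raw material extraction per unit output

    L = np.linalg.inv(np.eye(4) - A)           # Leontief inverse (I - A)^-1
    x = L @ y                                  # gross output required to serve y
    intensity = m @ L                          # embodied extraction per unit final demand
    rme_by_sector = intensity * y              # RME attributed to each final-demand sector
    ```

    The accounting identity that sector-allocated RMEs sum to total extraction (m·x) is what both the MRIO and hybrid approaches must respect; the "MRIO gap" concerns how that total is attributed across imported supply chains.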

  1. Hydration and transparency of the rabbit cornea irradiated with UVB-doses of 0.25 J/cm(2) and 0.5 J/cm(2) compared with equivalent UVB radiation exposure reaching the human cornea from sunlight.

    Science.gov (United States)

    Cejka, Cestmír; Ardan, Taras; Sirc, Jakub; Michálek, Jiří; Beneš, Jiří; Brůnová, Blanka; Rosina, Jozef

    2011-07-01

    Exposure of the cornea to UV radiation from sunlight evokes an intraocular inflammation, photokeratitis. Photokeratitis is caused by UVB radiation and is accompanied by changes in corneal hydration and light absorption. The aim of this study was to examine the effect of two UVB doses on corneal optics in rabbits and to compare these doses with the equivalent exposure of UVB radiation reaching the human cornea from sunlight. Rabbit corneas were irradiated with a daily UVB dose of 0.25 J/cm(2) or 0.5 J/cm(2) for 4 days. One day after the final irradiation the rabbits were sacrificed and corneal light absorption was measured using our spectrophotometric method. Corneal hydration was examined with an ultrasonic pachymeter on every experimental day before the irradiation procedure and on the last day before sacrificing the animals. Changes in corneal optics appeared after repeated exposure of the cornea to a UVB dose of 0.25 J/cm(2) and massively increased after repeated exposure to a UVB dose of 0.5 J/cm(2). The first significant changes in corneal hydration appeared after a single exposure of the cornea to a UVB dose of 0.25 J/cm(2). Changes in corneal hydration appeared after exposure of the rabbit cornea to a single UVB dose equivalent to 2.6 hours of solar UVB radiation reaching the human cornea, as measured by UVB sensors embedded in the eyes of mannequin heads facing the sun on a beach at noon in July. Repeated exposure of the rabbit cornea to the same UVB dose evoked profound changes in corneal optics. Although the comparison of experimental and outdoor conditions is only approximate, the results in rabbits point to the danger to the human eye from UVB radiation when short stays in sunlight are repeated for several consecutive days without UV protection.

  2. Regge calculus from discontinuous metrics

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2003-01-01

    Regge calculus is considered as a particular case of a more general system in which the link lengths of two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as one described by a metric that is discontinuous on the faces. In the superspace of all discontinuous metrics, the Regge calculus metrics form a hypersurface defined by continuity conditions. A quantum theory of the discontinuous-metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous-metric measure by inserting a δ-function-like phase factor. The requirement that the continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that the factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts

  3. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  4. Public health accreditation and metrics for ethics: a case study on environmental health and community engagement.

    Science.gov (United States)

    Bernheim, Ruth Gaare; Stefanak, Matthew; Brandenburg, Terry; Pannone, Aaron; Melnick, Alan

    2013-01-01

    As public health departments around the country undergo accreditation using the Public Health Accreditation Board standards, the process provides a new opportunity to integrate ethics metrics into day-to-day public health practice. While the accreditation standards do not explicitly address ethics, ethical tools and considerations can enrich the accreditation process by helping health departments and their communities understand what ethical principles underlie the accreditation standards and how to use metrics based on these ethical principles to support decision making in public health practice. We provide a crosswalk between a public health essential service, Public Health Accreditation Board community engagement domain standards, and the relevant ethical principles in the Public Health Code of Ethics (Code). A case study illustrates how the accreditation standards and the ethical principles in the Code together can enhance the practice of engaging the community in decision making in the local health department.

  5. On the equivalence of inertial and gravitational mass of extended bodies in metric theories of gravity

    International Nuclear Information System (INIS)

    Denisov, V.I.; Logunov, A.A.; Mestvirishvili, M.A.; Chugreev, Yu.V.

    1985-01-01

    It is shown that in any metric theory of gravitation possessing conservation laws for the energy-momentum of matter and gravitational field taken together, the center of mass of an extended body does not move along a geodesic of the Riemannian space-time. During its orbital motion, the center of mass of the extended body oscillates about a supporting geodesic. Application of the general formulas obtained to the Sun-Earth system, using experimental results on the location of the Moon together with other experiments, shows with a high accuracy of 10⁻¹⁰ that the ratio of the passive gravitational mass of the Earth to its inertial mass does not equal 1, differing from it by about 10⁻⁸. In its orbital motion the Earth oscillates about the supporting geodesic with a period of 1 hour and an amplitude of not less than 10⁻² cm. The deviation of the motion of the Earth's center of mass from geodesic motion could be detected in a corresponding experiment of post-Newtonian accuracy

  6. Relevance of protection quantities in medical exposures

    International Nuclear Information System (INIS)

    Pradhan, A.S.

    2008-01-01

    The International Commission on Radiological Protection (ICRP) continues to classify exposures to radiation in three categories: (1) occupational exposure, (2) public exposure, and (3) medical exposure. Protection quantities are primarily meant for regulatory purposes in radiological protection, for controlling and limiting stochastic risks in occupational and public exposures. They rest on two basic assumptions: (1) a linear no-threshold dose-effect relationship (LNT) at low doses, and (2) long-term additivity of low doses. Medical exposures are predominantly delivered to individuals (patients) undergoing diagnostic examinations, interventional procedures, and radiation therapy, but also include individuals caring for or comforting patients incurring exposure, and volunteers in biomedical research programmes. Radiation protection is as relevant to medical exposures as to occupational and public exposures, except that the dose limits set for the latter are not applicable to medical exposure; instead, reference levels and dose constraints are recommended for diagnostic and interventional medical procedures. In medical institutions, both occupational and medical exposures take place. Since the doses in diagnostic examinations are low, the protection quantities are not only often used in such cases but have even been extended to estimate the number of cancer deaths due to such practices. One of the striking features of the new ICRP recommendations has been to elaborate the concepts of the dosimetric quantities. The limitations of the protection quantities (the equivalent dose H_T = Σ_R w_R·D_(T,R) and the effective dose E = Σ_T w_T·H_T = Σ_T Σ_R w_T·w_R·D_(T,R)) have been brought out, and this has raised great concern and initiated debates on the use of these quantities in medical exposures. Consequently, ICRP has set up a task group to provide more details and recommendations. It has therefore become important to draw the attention of the medical physics community to these developments.
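    The protection quantities above follow mechanically from the ICRP weighting-factor definitions. As a minimal sketch, the snippet below uses a small subset of the ICRP 103 radiation and tissue weighting factors; the dictionary and function names are our own, and a real calculation would include the full tissue list (whose weights sum to 1).

    ```python
    # Subset of ICRP 103 weighting factors (illustrative; not the complete tables).
    W_R = {"photon": 1.0, "electron": 1.0, "alpha": 20.0}     # radiation weights
    W_T = {"lung": 0.12, "stomach": 0.12, "thyroid": 0.04}    # tissue weights (subset)

    def equivalent_dose(absorbed_mGy_by_radiation):
        """H_T in mSv: sum over radiation types R of w_R * D_(T,R) (D in mGy)."""
        return sum(W_R[r] * d for r, d in absorbed_mGy_by_radiation.items())

    def effective_dose(absorbed_by_tissue):
        """E in mSv from {tissue: {radiation: mGy}}; unlisted tissues are ignored."""
        return sum(W_T[t] * equivalent_dose(dmap)
                   for t, dmap in absorbed_by_tissue.items())
    ```

    For example, 1 mGy of photons plus 0.1 mGy of alpha particles to a tissue gives H_T = 1.0 + 20·0.1 = 3.0 mSv, and 10 mGy of photons to the lung alone contributes 0.12 · 10 = 1.2 mSv to E.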

  7. Standard practice of calibration of force-measuring instruments for verifying the force indication of testing machines

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2006-01-01

    1.1 The purpose of this practice is to specify procedures for the calibration of force-measuring instruments. Procedures are included for the following types of instruments: 1.1.1 Elastic force-measuring instruments, and 1.1.2 Force-multiplying systems, such as balances and small platform scales. Note 1: Verification by deadweight loading is also an acceptable method of verifying the force indication of a testing machine. Tolerances for weights for this purpose are given in Practices E 4; methods for calibration of the weights are given in NIST Technical Note 577, Methods of Calibrating Weights for Piston Gages. 1.2 The values stated in SI units are to be regarded as the standard. Other metric and inch-pound values are regarded as equivalent when required. 1.3 This practice is intended for the calibration of static force-measuring instruments. It is not applicable to dynamic or high-speed force calibrations, nor can the results of calibrations performed in accordance with this practice be assumed valid for...

  8. A study on assessment of bone mass from aluminum-equivalent image by digital imaging system

    International Nuclear Information System (INIS)

    Kim, Jin Soo; Kim, Jae Duck; Choi, Eui Hwan

    1997-01-01

    The purpose of this study was to evaluate a method for the quantitative assessment of bone mass from the aluminum-equivalent value of hydroxyapatite, using a digital imaging system consisting of a Power Macintosh 7200/120, a 15-inch color monitor, and a GT-9000 scanner with a transparency unit. After an aluminum-equivalent image was made from the correlation between aluminum thickness and grey scale, the accuracy of the conversion from aluminum-equivalent value to mass was evaluated. Measured bone mass was compared with bone mass converted from the aluminum-equivalent value of a hydroxyapatite block, using the correlation formula between the aluminum-equivalent value of the hydroxyapatite block and hydroxyapatite mass. The results of this study were as follows: 1. The correlation between aluminum thickness and grey level for obtaining the aluminum-equivalent image was highly positive (r² = 0.99). Masses converted from aluminum-equivalent values were very similar to measured masses; statistically, there was no significant difference between them (P < 0.05). 2. The correlation between hydroxyapatite aluminum-equivalent value and hydroxyapatite mass was linear (r² = 0.95). 3. Masses converted from the aluminum-equivalent values of 3 dry mandible segments were similar to measured masses. The difference between the exposure directions was not statistically significant (P < 0.05).
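    The thickness-to-grey-level calibration in result 1 is a simple linear regression. The sketch below uses invented step-wedge readings (the paper's actual data are not reproduced here) to show the round trip from a measured grey level back to an aluminum-equivalent thickness; a second regression of the same form would then map hydroxyapatite aluminum-equivalents to mass.

    ```python
    import numpy as np

    # Hypothetical step-wedge calibration: mean grey level vs aluminum thickness (mm).
    al_mm = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    grey  = np.array([60.0, 95.0, 128.0, 160.0, 193.0])  # illustrative readings

    # Fit grey = a * thickness + b (the paper reports r^2 = 0.99 for this relation).
    a, b = np.polyfit(al_mm, grey, 1)

    def al_equivalent_mm(grey_level):
        """Convert a measured grey level to aluminum-equivalent thickness (mm)."""
        return (grey_level - b) / a
    ```

    Any region of interest in the digitized radiograph can then be expressed in aluminum-equivalent units, making densities comparable across exposures once the wedge is imaged alongside the specimen.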

  9. Radiation exposures for DOE and DOE contractor employees, 1989

    International Nuclear Information System (INIS)

    Smith, M.H.; Eschbach, P.A.; Harty, R.; Millet, W.H.; Scholes, V.A.

    1992-12-01

    All US Department of Energy (DOE) and DOE contractor organizations are required to submit occupational radiation exposure records to a central depository. For 1989, data were required for all employees who had to be monitored in accordance with DOE Order 5480.11 and for all visitors who had a positive exposure. The required data included the external penetrating whole-body dose equivalent, the shallow dose equivalent, and a summary of internal depositions of radioactive material above specified limits. Data on the exposed individuals included each individual's age, sex, and occupational category. This report is a summary of the external penetrating whole-body dose equivalents and shallow dose equivalents reported by DOE and DOE contractors for the calendar year 1989. A total of 90,882 DOE and DOE contractor employees were monitored for whole-body ionizing radiation exposure during 1989. This represents 53.6% of all DOE and DOE contractor employees and is an increase (4.3%) over the number of monitored employees for 1988. In addition to the employees, 12,643 visitors were monitored

  10. Quality Markers in Cardiology. Main Markers to Measure Quality of Results (Outcomes) and Quality Measures Related to Better Results in Clinical Practice (Performance Metrics). INCARDIO (Indicadores de Calidad en Unidades Asistenciales del Área del Corazón): A SEC/SECTCV Consensus Position Paper.

    Science.gov (United States)

    López-Sendón, José; González-Juanatey, José Ramón; Pinto, Fausto; Cuenca Castillo, José; Badimón, Lina; Dalmau, Regina; González Torrecilla, Esteban; López-Mínguez, José Ramón; Maceira, Alicia M; Pascual-Figal, Domingo; Pomar Moya-Prats, José Luis; Sionis, Alessandro; Zamorano, José Luis

    2015-11-01

    Cardiology practice requires complex organization that impacts overall outcomes and may differ substantially among hospitals and communities. The aim of this consensus document is to define quality markers in cardiology, including markers to measure the quality of results (outcomes metrics) and quality measures related to better results in clinical practice (performance metrics). The document is mainly intended for the Spanish health care system and may serve as a basis for similar documents in other countries.

  11. Equivalence between short-time biphasic and incompressible elastic material responses.

    Science.gov (United States)

    Ateshian, Gerard A; Ellis, Benjamin J; Weiss, Jeffrey A

    2007-06-01

    Porous-permeable tissues have often been modeled using porous media theories such as the biphasic theory. This study examines the equivalence of the short-time biphasic and incompressible elastic responses for arbitrary deformations and constitutive relations from first principles. This equivalence is illustrated in problems of unconfined compression of a disk, and of articular contact under finite deformation, using two different constitutive relations for the solid matrix of cartilage, one of which accounts for the large disparity observed between the tensile and compressive moduli in this tissue. Demonstrating this equivalence under general conditions provides a rationale for using available finite element codes for incompressible elastic materials as a practical substitute for biphasic analyses, so long as only the short-time biphasic response is sought. In practice, an incompressible elastic analysis is representative of a biphasic analysis over the short-term response Δt ≪ Δx²/(C·K), where Δx is a characteristic dimension, C is the elasticity tensor, and K is the hydraulic permeability tensor of the solid matrix. Certain notes of caution are provided with regard to implementation issues, particularly when finite element formulations of incompressible elasticity employ an uncoupled strain energy function consisting of additive deviatoric and volumetric components.

  12. Metric dimensional reduction at singularities with implications to Quantum Gravity

    International Nuclear Information System (INIS)

    Stoica, Ovidiu Cristinel

    2014-01-01

    A series of old and recent theoretical observations suggests that the quantization of gravity would be feasible, and some problems of Quantum Field Theory would go away, if spacetime somehow underwent a dimensional reduction at high energy scales. But an identification of the deep mechanism causing this dimensional reduction would still be desirable. The main contribution of this article is to show that dimensional reduction effects are due to General Relativity at singularities, and do not need to be postulated ad hoc. Recent advances in understanding the geometry of singularities do not require modification of General Relativity, being just non-singular extensions of its mathematics to the limit cases. They turn out to work fine for some known types of cosmological singularities (black holes and FLRW Big-Bang), allowing a choice of the fundamental geometric invariants and physical quantities which remain regular. The resulting equations are equivalent to the standard ones outside the singularities. One consequence of this mathematical approach to the singularities in General Relativity is a special, (geo)metric type of dimensional reduction: at singularities, the metric tensor becomes degenerate in certain spacetime directions, and some properties of the fields become independent of those directions. Effectively, it is as if one or more dimensions of spacetime simply vanish at singularities. This suggests that it is worth exploring the possibility that the geometry of singularities leads naturally to the spontaneous dimensional reduction needed by Quantum Gravity. - Highlights: • The singularities we introduce are described by finite geometric/physical objects. • Our singularities are accompanied by dimensional reduction effects. • They affect the metric, the measure, the topology, the gravitational DOF (Weyl = 0). • Effects proposed in other approaches to Quantum Gravity are obtained naturally. • The geometric dimensional reduction obtained

  13. Formulating a coastal zone health metric for landuse impact management in urban coastal zones.

    Science.gov (United States)

    Anilkumar, P P; Varghese, Koshy; Ganesh, L S

    2010-11-01

    The need for integrated coastal zone management (ICZM) arises often due to inadequate or inappropriate landuse planning practices and policies, especially in urban coastal zones, which are more complex due to the larger number of components, their critical dimensions, attributes and interactions. A survey of literature shows that there is no holistic metric for assessing the impacts of landuse planning on the health of a coastal zone. Thus there is a need to define such a metric. The proposed metric, CHI (Coastal zone Health Indicator), developed on the basis of coastal system sustainability, attempts to gauge the health status of any coastal zone. It is formulated and modeled through an expert survey and pertains to the characteristic components of coastal zones, their critical dimensions, and relevant attributes. The proposed metric is applied to two urban coastal zones and validated. It can be used for more coast-friendly and sustainable landuse planning/masterplan preparation and thereby for the better management of landuse impacts on coastal zones. Copyright 2010 Elsevier Ltd. All rights reserved.

  14. Conditions needed to give meaning to rad-equivalence principle

    International Nuclear Information System (INIS)

    Latarjet, R.

    1980-01-01

    To legislate on mutagenic chemical pollution the problem to be faced is similar to that tackled about 30 years ago regarding pollution by ionizing radiations. It would be useful to benefit from the work of these 30 years by establishing equivalences, if possible, between chemical mutagens and radiations. Inevitable mutagenic pollutions are considered here, especially those associated with fuel based energy production. As with radiations the legislation must derive from a compromise between the harmful and beneficial effects of the polluting system. When deciding on tolerance doses it is necessary to safeguard the biosphere without inflicting excessive restrictions on industry and on the economy. The present article discusses the conditions needed to give meaning to the notion of rad-equivalence. Some examples of already established equivalences are given, together with the first practical consequences which emerge.

  15. Anti-Authoritarian Metrics: Recursivity as a strategy for post-capitalism

    Directory of Open Access Journals (Sweden)

    David Adam Banks

    2016-12-01

    This essay proposes that those seeking to build counter-power institutions and communities learn to think in terms of what I call “recursivity.” Recursivity is an anti-authoritarian metric that helps bring about a sensitivity to feedback loops at multiple levels of organization. I begin by describing how technological systems and the socio-economic order co-constitute one another around efficiency metrics. I then go on to define recursivity as social conditions that contain within them all of the parts and practices for their maturation and expansion, and show how organizations that demonstrate recursivity, like the historical English commons, have been marginalized or destroyed altogether. Finally, I show how the ownership of property is inherently antithetical to the closed loops of recursivity. All of this is bookended by a study of urban planning’s recursive beginnings.

  16. Simulated Response of a Tissue-equivalent Proportional Counter on the Surface of Mars.

    Science.gov (United States)

    Northum, Jeremy D; Guetersloh, Stephen B; Braby, Leslie A; Ford, John R

    2015-10-01

    Uncertainties persist regarding the assessment of the carcinogenic risk associated with galactic cosmic ray (GCR) exposure during a mission to Mars. The GCR spectrum peaks in the range of 300 MeV n⁻¹ to 700 MeV n⁻¹ and comprises elemental ions from H to Ni. While Fe ions represent only 0.03% of the GCR spectrum in terms of particle abundance, they are responsible for nearly 30% of the dose equivalent in free space. Because of this, radiation biology studies focusing on understanding the biological effects of GCR exposure generally use Fe ions. Acting as a thin shield, the Martian atmosphere alters the GCR spectrum in a manner that significantly reduces the importance of Fe ions. Additionally, albedo particles emanating from the regolith complicate the radiation environment. The present study uses the Monte Carlo code FLUKA to simulate the response of a tissue-equivalent proportional counter on the surface of Mars to produce dosimetry quantities and microdosimetry distributions. The dose equivalent rate on the surface of Mars was found to be 0.18 Sv y⁻¹ with an average quality factor of 2.9 and a dose mean lineal energy of 18.4 keV μm⁻¹. Additionally, albedo neutrons were found to account for 25% of the dose equivalent. It is anticipated that these data will provide relevant starting points for use in future risk assessment and mission planning studies.
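
    The reported quantities above are tied together by the standard microdosimetric relation H = Q̄ · D. A minimal sketch of that arithmetic (an illustration of the definition only, not part of the paper's FLUKA workflow; variable names are invented):

```python
# Hedged sketch: the dose equivalent H (Sv) is the absorbed dose D (Gy)
# weighted by the mean quality factor Q_bar. Using the abstract's reported
# H = 0.18 Sv/y and Q_bar = 2.9, we can back out the implied absorbed
# dose rate on the Martian surface.

def dose_equivalent(absorbed_dose_gy, mean_quality_factor):
    """Dose equivalent (Sv) = absorbed dose (Gy) x mean quality factor."""
    return absorbed_dose_gy * mean_quality_factor

H = 0.18            # Sv/y, reported dose equivalent rate
Q_bar = 2.9         # reported average quality factor
D = H / Q_bar       # implied absorbed dose rate, roughly 0.062 Gy/y
print(f"Implied absorbed dose rate: {D:.3f} Gy/y")
```
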

  17. Cosmology of hybrid metric-Palatini f(X)-gravity

    International Nuclear Information System (INIS)

    Capozziello, Salvatore; Harko, Tiberiu; Koivisto, Tomi S.; Lobo, Francisco S.N.; Olmo, Gonzalo J.

    2013-01-01

    A new class of modified theories of gravity, consisting of the superposition of the metric Einstein-Hilbert Lagrangian with an f(R) term constructed à la Palatini, was proposed recently. The dynamically equivalent scalar-tensor representation of the model was also formulated, and it was shown that even if the scalar field is very light, the theory passes the Solar System observational constraints. Therefore the model predicts the existence of a long-range scalar field, modifying the cosmological and galactic dynamics. An explicit model that passes the local tests and leads to cosmic acceleration was also obtained. In the present work, it is shown that the theory can also be formulated in terms of the quantity X ≡ κ²T + R, where T and R are the traces of the stress-energy and Ricci tensors, respectively. The variable X represents the deviation with respect to the field equation trace of general relativity. The cosmological applications of this hybrid metric-Palatini gravitational theory are also explored, and cosmological solutions coming from the scalar-tensor representation of f(X)-gravity are presented. Criteria to obtain cosmic acceleration are discussed and the field equations are analyzed as a dynamical system. Several classes of dynamical cosmological solutions, depending on the functional form of the effective scalar field potential, describing both accelerating and decelerating Universes, are explicitly obtained. Furthermore, the cosmological perturbation equations are derived and applied to uncover the nature of the propagating scalar degree of freedom and the signatures these models predict in the large-scale structure

  18. The use of the effective dose equivalent, H_E, as a risk parameter in computed tomography

    International Nuclear Information System (INIS)

    Huda, W.; Sandison, G.A.

    1986-01-01

    This note employs the concept of the effective dose equivalent, H_E, to overcome the problems of comparing the non-uniform radiation doses encountered in CT examinations with the whole-body dose-equivalent limits imposed for non-medical exposures for members of the public (5 mSv/year), or with the risks from familiar everyday activities such as smoking cigarettes or driving cars. (U.K.)

  19. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
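
    The effect the author attributes to the geometric mean can be seen numerically. A small sketch (the query times are invented for illustration, not TPC-D data):

```python
# The geometric mean damps outliers that the arithmetic mean exposes,
# which is why a single pathological query can hide behind it.
import math

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    # exp of the mean of logs; requires strictly positive inputs.
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

one_slow = [1.0, 1.0, 1.0, 100.0]  # three fast queries, one very slow one

# Arithmetic mean jumps to 25.75; geometric mean only to 100**0.25 ~ 3.16.
print(arithmetic_mean(one_slow), geometric_mean(one_slow))
```

    With the arithmetic mean the slow query dominates the score, which is the behavior a decision-support benchmark arguably wants.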

  20. Synonymy in the translation equivalent paradigms of a standard ...

    African Journals Online (AJOL)

    The norm in current canonical translation dictionaries with Afrikaans and English as the treated language pair is an undiscriminated grouping of partially synonymous translation equivalents. These are separated by commas as sole markers of synonymy. Lexicographers should reject this practice and embrace the view that ...

  1. Robustness of climate metrics under climate policy ambiguity

    International Nuclear Information System (INIS)

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH₄ and CO₂, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH₄ with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets
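
    The role a metric value plays is that of a weighting factor turning a multi-gas inventory into a single CO₂-equivalent number. A minimal sketch (the gas masses and the two candidate CH₄ metric values, roughly GWP-100 vs GWP-20, are illustrative assumptions, not figures from the paper):

```python
# A metric value converts each gas's emissions into CO2-equivalents;
# changing the metric changes the apparent weight of CH4 mitigation.

def co2_equivalent(emissions, metric_values):
    """Weight each gas's mass by its metric value and sum."""
    return sum(mass * metric_values[gas] for gas, mass in emissions.items())

inventory = {"CO2": 100.0, "CH4": 2.0}   # arbitrary mass units, illustrative

low  = co2_equivalent(inventory, {"CO2": 1.0, "CH4": 28.0})   # 156.0
high = co2_equivalent(inventory, {"CO2": 1.0, "CH4": 84.0})   # 268.0
print(low, high)
```

    The same physical inventory scores very differently under the two metrics, which is the ambiguity the paper's robustness analysis addresses.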

  2. Metrical connection in space-time, Newton's and Hubble's laws

    International Nuclear Information System (INIS)

    Maeder, A.

    1978-01-01

    The theory of gravitation in general relativity is not scale invariant. Here, we follow Dirac's proposition of a scale invariant theory of gravitation (i.e. a theory in which the equations keep their form when a transformation of scale is made). We examine some concepts of Weyl's geometry, like the metrical connection, the scale transformations and invariance, and we discuss their consequences for the equation of the geodetic motion and for its Newtonian limit. Under general conditions, we show that the only non-vanishing component of the coefficient of metrical connection may be identified with Hubble's constant. In this framework, the equivalent to the Newtonian approximation for the equation of motion contains an additional acceleration term H dr/dt, which produces an expansion of gravitational systems. The velocity of this expansion is shown to increase linearly with the distance between interacting objects. The relative importance of this new expansion term to the Newtonian one varies like (2ρ_c/ρ)^(1/2), where ρ_c is the critical density of the Einstein-de Sitter model and ρ is the mean density of the considered gravitational configuration. Thus, this 'generalized expansion' is important essentially for systems of mean density not too much above the critical density. Finally, our main conclusion is that in the integrable Weyl geometry, Hubble's law - like Newton's law - would appear as an intrinsic property of gravitation, being only the most visible manifestation of a general effect characterizing the gravitational interaction. (orig.)

  3. Partial rectangular metric spaces and fixed point theorems.

    Science.gov (United States)

    Shukla, Satish

    2014-01-01

    The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasitype contraction in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.
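
    Partial rectangular metrics generalize partial metric spaces, whose defining feature is that the self-distance p(x, x) need not be zero. The flavor of such axioms can be illustrated with a brute-force numeric check of the classical partial metric p(x, y) = max(x, y) on the non-negative reals (a sketch for intuition, not code from the paper):

```python
# Brute-force check of the partial metric axioms over a finite point set:
# (1) small self-distance, (2) symmetry, (3) the modified triangle
# inequality p(x,y) <= p(x,z) + p(z,y) - p(z,z), (4) indistinguishability.
from itertools import product

def p(x, y):
    return max(x, y)  # classical partial metric on [0, inf)

def is_partial_metric(points, p, eps=1e-12):
    for x, y, z in product(points, repeat=3):
        if p(x, x) > p(x, y) + eps:                       # small self-distance
            return False
        if abs(p(x, y) - p(y, x)) > eps:                  # symmetry
            return False
        if p(x, y) > p(x, z) + p(z, y) - p(z, z) + eps:   # modified triangle
            return False
    for x, y in product(points, repeat=2):                # indistinguishability
        if x != y and abs(p(x, x) - p(x, y)) < eps and abs(p(y, y) - p(x, y)) < eps:
            return False
    return True

print(is_partial_metric([0.0, 0.5, 1.0, 2.0, 3.5], p))  # True
```

    Note that p(1.0, 1.0) = 1.0 ≠ 0: nonzero self-distance is exactly what distinguishes partial (rectangular) metrics from ordinary ones.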

  4. A Kerr-NUT metric

    International Nuclear Information System (INIS)

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates the usual Kerr metric is expressed in form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying R_ik = σξ_iξ_k, ξ_iξ^i = 0, and (iii) the associated Kerr solution satisfying R_ik = 0. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D 7:3590 (1973)). Solutions (ii) and (iii) gave line elements which have the axis of symmetry as a singular line. (author)

  5. Construct equivalence and latent means analysis of health behaviors between male and female middle school students.

    Science.gov (United States)

    Park, Jeong Mo; Han, Ae Kyung; Cho, Yoon Hee

    2011-12-01

    The purpose of this study was to investigate the construct equivalence of the five general factors (subjective health, eating habits, physical activities, sedentary lifestyle, and sleeping behaviors) and to compare the latent means between male and female middle school students in Incheon, Korea. The 2008 Korean Youth Risk Behavior Survey data was used for analysis. Multigroup confirmatory factor analysis was performed to test whether the scale has configural, metric, and scalar invariance across gender. Configural invariance, metric invariance, and factor invariance were satisfied for latent means analysis (LMA) between genders. Male and female students were significantly different in LMA of all factors. Male students reported better subjective health, consumed more fast food and carbonated drinks, participated in more physical activities, showed less sedentary behavior, and enjoyed better quality of sleep than female students. Health providers should consider gender differences when they develop and deliver health promotion programs aimed at adolescents. Copyright © 2011. Published by Elsevier B.V.

  6. SU-G-BRB-16: Vulnerabilities in the Gamma Metric

    International Nuclear Information System (INIS)

    Neal, B; Siebers, J

    2016-01-01

    Purpose: To explore vulnerabilities in the gamma index metric that undermine its wide use as a radiation therapy quality assurance tool. Methods: 2D test field pairs (images) are created specifically to achieve high gamma passing rates, but to also include gross errors by exploiting the distance-to-agreement and percent-passing components of the metric. The first set has no requirement of clinical practicality, but is intended to expose vulnerabilities. The second set exposes clinically realistic vulnerabilities. To circumvent limitations inherent to user-specific tuning of prediction algorithms to match measurements, digital test cases are manually constructed, thereby mimicking high-quality image prediction. Results: With a 3 mm distance-to-agreement metric, changing field size by ±6 mm results in a gamma passing rate over 99%. For a uniform field, a lattice of passing points spaced 5 mm apart results in a passing rate of 100%. Exploiting the percent-passing component, a 10×10 cm² field can have a 95% passing rate when an 8 cm² = 2.8×2.8 cm² highly out-of-tolerance (e.g. zero dose) square is missing from the comparison image. For clinically realistic vulnerabilities, an arc plan for which a 2D image is created can have a >95% passing rate solely due to agreement in the lateral spillage, with the failing 5% in the critical target region. A field with an integrated boost (e.g. whole brain plus small metastases) could neglect the metastases entirely, yet still pass with a 95% threshold. All the failure modes described would be visually apparent on a gamma-map image. Conclusion: The %gamma<1 metric has significant vulnerabilities. High passing rates can obscure critical faults in hypothetical and delivered radiation doses. Great caution should be used with gamma as a QA metric; users should inspect the gamma-map. Visual analysis of gamma-maps may be impractical for cine acquisition.
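
    The distance-to-agreement vulnerability can be reproduced with a toy one-dimensional gamma computation (a simplified sketch; clinical gamma analysis is 2D/3D, and the dose grids here are invented, not the paper's test fields):

```python
# Minimal 1D gamma index: for each reference point, take the minimum over
# evaluated points of sqrt((distance/DTA)^2 + (dose_diff/tol)^2).
# A 2 mm shift of a field edge passes everywhere under 3 mm / 3%.
import math

def gamma_passing_rate(ref, evl, spacing_mm, dta_mm=3.0, dose_tol=0.03):
    """Fraction of reference points with gamma <= 1."""
    n_pass = 0
    for i, d_ref in enumerate(ref):
        gamma = min(
            math.sqrt(((i - j) * spacing_mm / dta_mm) ** 2
                      + ((d_evl - d_ref) / dose_tol) ** 2)
            for j, d_evl in enumerate(evl)
        )
        if gamma <= 1.0:
            n_pass += 1
    return n_pass / len(ref)

ref     = [1.0 if x < 50 else 0.0 for x in range(100)]  # field edge at 50 mm
shifted = [1.0 if x < 52 else 0.0 for x in range(100)]  # edge moved 2 mm

print(gamma_passing_rate(ref, shifted, spacing_mm=1.0))  # 1.0: all points pass
```

    Every point near the shifted edge finds an agreeing dose within the 3 mm search radius, so a real geometric error is invisible to the passing rate.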

  7. SU-G-BRB-16: Vulnerabilities in the Gamma Metric

    Energy Technology Data Exchange (ETDEWEB)

    Neal, B; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To explore vulnerabilities in the gamma index metric that undermine its wide use as a radiation therapy quality assurance tool. Methods: 2D test field pairs (images) are created specifically to achieve high gamma passing rates, but to also include gross errors by exploiting the distance-to-agreement and percent-passing components of the metric. The first set has no requirement of clinical practicality, but is intended to expose vulnerabilities. The second set exposes clinically realistic vulnerabilities. To circumvent limitations inherent to user-specific tuning of prediction algorithms to match measurements, digital test cases are manually constructed, thereby mimicking high-quality image prediction. Results: With a 3 mm distance-to-agreement metric, changing field size by ±6 mm results in a gamma passing rate over 99%. For a uniform field, a lattice of passing points spaced 5 mm apart results in a passing rate of 100%. Exploiting the percent-passing component, a 10×10 cm² field can have a 95% passing rate when an 8 cm² = 2.8×2.8 cm² highly out-of-tolerance (e.g. zero dose) square is missing from the comparison image. For clinically realistic vulnerabilities, an arc plan for which a 2D image is created can have a >95% passing rate solely due to agreement in the lateral spillage, with the failing 5% in the critical target region. A field with an integrated boost (e.g. whole brain plus small metastases) could neglect the metastases entirely, yet still pass with a 95% threshold. All the failure modes described would be visually apparent on a gamma-map image. Conclusion: The %gamma<1 metric has significant vulnerabilities. High passing rates can obscure critical faults in hypothetical and delivered radiation doses. Great caution should be used with gamma as a QA metric; users should inspect the gamma-map. Visual analysis of gamma-maps may be impractical for cine acquisition.

  8. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3)-invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2^(4n) in SO(n) supergravity.

  9. Ability to Discriminate Between Sustainable and Unsustainable Heat Stress Exposures-Part 1: WBGT Exposure Limits.

    Science.gov (United States)

    Garzón-Villalba, Ximena P; Wu, Yougui; Ashley, Candi D; Bernard, Thomas E

    2017-07-01

    Heat stress exposure limits based on wet-bulb globe temperature (WBGT) were designed to limit exposures to those that could be sustained for an 8-h day using limited data from Lind in the 1960s. In general, Sustainable exposures are heat stress levels at which thermal equilibrium can be achieved, and Unsustainable exposures occur when there is a steady increase in core temperature. This paper addresses the ability of the ACGIH® Threshold Limit Value (TLV®) to differentiate between Sustainable and Unsustainable heat exposures, to propose alternative occupational exposure limits, and ask whether an adjustment for body surface area improves the exposure decision. Two progressive heat stress studies provided data on 176 trials with 352 pairs of Sustainable and Unsustainable exposures over a range of relative humidities and metabolic rates using 29 participants wearing woven cotton clothing. To assess the discrimination ability of the TLV, the exposure metric was the difference between the observed WBGT and the TLV adjusted for metabolic rate. Conditional logistic regression models and receiver operating characteristic curves (ROC) along with ROC's area under the curve (AUC) were used. Four alternative models for an occupational exposure limit were also developed and compared to the TLV. For the TLV, the odds ratio (OR) for Unsustainable was 2.5 per 1°C-WBGT [confidence interval (CI) 2.12-2.88]. The AUC for the TLV was 0.85 (CI 0.81-0.89). For the alternative models, the ORs were also about 2.5/°C-WBGT, with AUCs between 0.84 and 0.88, which were significantly different from the TLV's AUC but have little practical difference. This study (1) confirmed that the TLV is appropriate for heat stress screening; (2) demonstrated the TLV's discrimination accuracy with an ROC AUC of 0.85; and (3) established the OR of 2.5/°C-WBGT for unsustainable exposures. The TLV has high sensitivity, but its specificity is very low, which is protective. There were no important
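
    The reported odds ratio can be read as the multiplicative slope of the logistic model: each 1 °C by which the observed WBGT exceeds the metabolic-rate-adjusted TLV multiplies the odds of an Unsustainable exposure by 2.5. A small sketch (the intercept below is an arbitrary illustrative value; only the slope comes from the reported OR):

```python
# Logistic-odds sketch: with slope beta = ln(2.5), the odds of an
# Unsustainable exposure multiply by 2.5 per additional degree C-WBGT
# above the exposure limit. Intercept is assumed, not from the study.
import math

BETA = math.log(2.5)   # slope implied by the reported odds ratio
INTERCEPT = 0.0        # illustrative assumption

def odds_unsustainable(delta_wbgt_c):
    """Odds of Unsustainable at a given WBGT excess over the limit (degC)."""
    return math.exp(INTERCEPT + BETA * delta_wbgt_c)

# Recover the odds ratio per 1 degC-WBGT from the model: 2.5.
print(odds_unsustainable(2.0) / odds_unsustainable(1.0))
```
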

  10. A conception of practical application of the ICRP Publ. 60

    International Nuclear Information System (INIS)

    Numakunai, Takao

    1999-01-01

    The report of views on the practical application of ICRP Publ. 60 in Japanese regulations and its proposed technical guideline were published by the Advisory Committee on radiation protection in June 1998 and April 1999, respectively. This paper describes the summary of the above reports and the essential conception for their actual application. The following items are summarized: the change of technical terms, such as the use of "dose" in place of dose equivalent; dose limits in occupational exposure (the effective dose limit not to exceed 100 mSv/5 y and 50 mSv/y); dose limits in women's occupational exposure (not to exceed 5 mSv/3 months); the working area (the controlled area); public dose limits with consideration for medical exposure; exposure from natural sources of radiation; exposure of volunteers and nursing persons; occupational health services for radiation workers; emergency exposure (100 mSv; 300 mSv for the lens and 1 Sv for skin); intervention for the public in emergency exposure; documentation; and the system for radiation control. It is expected that suitable institutions and groups will develop the guideline through examination of the reports. (K.H.)

  11. A conception of practical application of the ICRP Publ. 60

    Energy Technology Data Exchange (ETDEWEB)

    Numakunai, Takao [Inst. of Radiation Measurements, Tokai, Ibaraki (Japan)

    1999-09-01

    The report of views on the practical application of ICRP Publ. 60 in Japanese regulations and its proposed technical guideline were published by the Advisory Committee on radiation protection in June 1998 and April 1999, respectively. This paper describes the summary of the above reports and the essential conception for their actual application. The following items are summarized: the change of technical terms, such as the use of "dose" in place of dose equivalent; dose limits in occupational exposure (the effective dose limit not to exceed 100 mSv/5 y and 50 mSv/y); dose limits in women's occupational exposure (not to exceed 5 mSv/3 months); the working area (the controlled area); public dose limits with consideration for medical exposure; exposure from natural sources of radiation; exposure of volunteers and nursing persons; occupational health services for radiation workers; emergency exposure (100 mSv; 300 mSv for the lens and 1 Sv for skin); intervention for the public in emergency exposure; documentation; and the system for radiation control. It is expected that suitable institutions and groups will develop the guideline through examination of the reports. (K.H.)

  12. NeuronMetrics: software for semi-automated processing of cultured neuron images.

    Science.gov (United States)

    Narro, Martha L; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L

    2007-03-23

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of approximately 60 2D images is 1.0-2.5 h, from a folder of images to a table of numeric data. NeuronMetrics' output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery.
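
    The "territory" parameter described above is the area of the convex polygon bounding the skeleton and cell body. That geometric step can be sketched with a standard monotone-chain convex hull plus the shoelace formula (a stand-alone illustration, not NeuronMetrics' actual ImageJ implementation; the point set is invented):

```python
# Convex hull (Andrew's monotone chain) followed by the shoelace formula
# gives the area of the convex polygon bounding a set of 2D points,
# analogous to the "territory" measure of a neurite skeleton.

def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]  # counter-clockwise hull vertices

def polygon_area(poly):
    # Shoelace formula over the ordered hull vertices.
    n = len(poly)
    s = sum(poly[i][0]*poly[(i+1) % n][1] - poly[(i+1) % n][0]*poly[i][1]
            for i in range(n))
    return abs(s) / 2.0

skeleton = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1)]  # toy pixel coordinates
print(polygon_area(convex_hull(skeleton)))  # 12.0 (interior point ignored)
```
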

  13. 16 CFR 1510.3 - Requirements.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Requirements. 1510.3 Section 1510.3 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FEDERAL HAZARDOUS SUBSTANCES ACT REGULATIONS... purposes, the English measurements shall be used. Metric equivalents are included for convenience.) Rattles...

  14. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  15. Balanced metrics for vector bundles and polarised manifolds

    DEFF Research Database (Denmark)

    Garcia Fernandez, Mario; Ross, Julius

    2012-01-01

    We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two...

  16. Guideline values for skin decontamination measures based on nuclide-specific dose equivalent rate factors

    International Nuclear Information System (INIS)

    Pfob, H.; Heinemann, G.

    1992-01-01

    Dose equivalent rate factors for various radionuclides are now available for determining the skin dose caused by skin contamination. These dose equivalent rate factors take into account all contributions from the types of radiation emitted. However, no limits for skin decontamination measures have yet been specified or determined, although radiological protection practice requires at least guideline values in order to prevent the unsuitable or detrimental measures that are observed quite often. New calculations of dose equivalent rate factors for the skin now make the recommendation of guideline values possible. (author)

  17. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  18. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  19. Metrics, Media and Advertisers: Discussing Relationship

    Directory of Open Access Journals (Sweden)

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    Full Text Available This study investigates how Brazilian advertisers are adapting to new media and their attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and their metrics were celebrated as innovations that would increase the overall efficiency of advertising campaigns. By 2011, this perception had changed: the profusion of metrics in new media, once seen as an advantage, had started to compromise their ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics.

  20. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...

  1. Quantitative and qualitative research across cultures and languages: cultural metrics and their application.

    Science.gov (United States)

    Wagner, Wolfgang; Hansen, Karolina; Kronberger, Nicole

    2014-12-01

    Growing globalisation of the world draws attention to cultural differences between people from different countries or from different cultures within countries. Notwithstanding the diversity of people's worldviews, current cross-cultural research still faces the challenge of how to avoid ethnocentrism; comparing Western-driven phenomena with like variables across countries without checking their conceptual equivalence is clearly problematic. In the present article we argue that simple comparison of measurements (in the quantitative domain) or of semantic interpretations (in the qualitative domain) across cultures easily leads to inadequate results. Questionnaire items or text produced in interviews or via open-ended questions have culturally laden meanings and cannot be mapped onto the same semantic metric. We call the culture-specific space and relationship between variables or meanings a 'cultural metric', that is, a set of notions that are inter-related and that mutually specify each other's meaning. We illustrate the problems and their possible solutions with examples from quantitative and qualitative research. The suggested methods allow researchers to respect the semantic space of notions in cultures and language groups, so that the resulting similarities or differences between cultures can be better understood and interpreted.

  2. Determination of dose equivalent and risk in thorium cycle

    International Nuclear Information System (INIS)

    Ney, C.L.V.N.

    1988-01-01

    This report presents calculations of dose equivalent and risk using the dosimetric model described in Publication 30 of the International Commission on Radiological Protection. The data were obtained for workers of the thorium cycle employed at the Praia and Santo Amaro facilities by assessing the quantity and concentration of thorium in the air. The samples and the number of measurements were established through design-of-experiments techniques, and the results were evaluated with the aid of analysis of variance. The estimates of dose equivalent for internal and external radiation exposure, and the associated risk, were compared with the maximum recommended limits. The results indicate the existence of operational areas whose values were above those limits, requiring an improvement in procedures and services in order to meet the requirements of radiological protection. (author)

  3. Multimetric indices: How many metrics?

    Science.gov (United States)

    Multimetric indices (MMI’s) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...

  4. Radiation Exposure Reduction to Brachytherapy Staff By Using Remote Afterloading

    International Nuclear Information System (INIS)

    Attalla, E.M.

    2005-01-01

    The radiation exposures to the personnel staff from patients with brachytherapy implants in a brachytherapy service were reviewed. Exposures to the brachytherapy personnel, as determined by thermoluminescence dosimeter (TLD) monitors, indicate a four-fold reduction in exposures after the implementation of remote afterloading devices. Quarterly TLD monitor data for seven quarters prior to the use of remote afterloading devices demonstrate an average projected annual dose equivalent to the brachytherapy staff of 2543 μSv. After the implementation of the remote afterloading devices, the quarterly TLD monitor data indicate an average dose equivalent per person of 153 μSv. This is a 76% reduction in exposure to brachytherapy personnel with the use of these devices.

  5. The exposure assessment of Rn-222 gas in the atmosphere(II)

    International Nuclear Information System (INIS)

    Ha, Chung Wo; Chang, Si Young; Seo, Kyung Won; Yoon, Yeo Chang; Kim, Jang Lyul; Yoon, Suk Chul; Chung, Rae Ik; Kim, Jong Soo; Park, Young Woong

    1991-01-01

    Dose assessment for inhalation exposure to indoor ²²²Rn daughters in 12 residential areas in Korea has been performed using long-term averaged radon concentrations measured with passive CR-39 radon cups. A simple mathematical lung dosimetry model based on ICRP-30 was derived to estimate the indoor radon daughter exposure. The long-term average indoor ²²²Rn concentrations and corresponding equilibrium equivalent radon concentrations (EEC_Rn) in the 12 areas ranged from 33.82 to 61.42 Bq·m⁻³ (median: 48.90 Bq·m⁻³) and from 13.53 to 24.57 Bq·m⁻³ (median: 19.55 Bq·m⁻³), respectively. Reference dose conversion functions for evaluating regional lung dose and effective dose equivalent for unit exposure to EEC_Rn have been derived for an adult. The effective dose equivalent conversion factor was estimated to be 1.07 × 10⁻⁵ mSv/Bq·h·m⁻³, which agrees well with the values recommended in the ICRP and UNSCEAR reports. The annual average dose equivalents (H) to the tracheo-bronchial and pulmonary regions of the lung, and to the total lung, from exposure to the measured EEC_Rn were estimated to be 17.52 mSv·y⁻¹, 3.35 mSv·y⁻¹ and 20.90 mSv·y⁻¹, respectively, and the resulting effective dose equivalent (H_E) was estimated to be 1.25 mSv·y⁻¹, which is almost 50% of the natural radiation exposure of 2.40 mSv·y⁻¹ reported by UNSCEAR. (Author)
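    The dose arithmetic in this record can be reproduced with a short sketch. The conversion factor and median EEC are taken from the abstract; the indoor occupancy fraction (about 0.68 of the year) is an assumed value chosen for illustration, not a figure stated in the record:

    ```python
    # Annual effective dose equivalent from indoor radon daughters.
    # Conversion factor and median EEC are from the record above;
    # the indoor occupancy fraction (~0.68) is an assumption.
    CONVERSION_FACTOR = 1.07e-5        # mSv per (Bq*h/m^3)
    EEC_MEDIAN = 19.55                 # Bq/m^3, median equilibrium equivalent conc.
    OCCUPANCY_HOURS = 0.68 * 8760      # assumed hours spent indoors per year

    annual_dose = CONVERSION_FACTOR * EEC_MEDIAN * OCCUPANCY_HOURS
    print(round(annual_dose, 2))       # close to the reported 1.25 mSv/y
    ```

    With this occupancy assumption the product reproduces the reported annual effective dose equivalent of about 1.25 mSv·y⁻¹.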

  6. Organic solvent exposure and depressive symptoms among licensed pesticide applicators in the Agricultural Health Study.

    Science.gov (United States)

    Siegel, Miriam; Starks, Sarah E; Sanderson, Wayne T; Kamel, Freya; Hoppin, Jane A; Gerr, Fred

    2017-11-01

    Although organic solvents are often used in agricultural operations, the neurotoxic effects of solvent exposure have not been extensively studied among farmers. The current analysis examined associations between questionnaire-based metrics of organic solvent exposure and depressive symptoms among farmers. Results from 692 male Agricultural Health Study participants were analyzed. Solvent type and exposure duration were assessed by questionnaire. An "ever-use" variable and years-of-use categories were constructed for exposure to gasoline, paint/lacquer thinner, petroleum distillates, and any solvent. Depressive symptoms were ascertained with the Center for Epidemiologic Studies Depression Scale (CES-D); scores were analyzed separately as continuous (0-60) and dichotomized outcomes. Statistically significant associations with continuous CES-D score (p < 0.05) were observed for petroleum distillates, including short duration of petroleum distillate exposure. Although nearly all associations were positive, fewer statistically significant associations were observed between metrics of solvent exposure and the dichotomized CES-D variable. Solvent exposures were associated with depressive symptoms among farmers. Efforts to limit exposure to organic solvents may reduce the risk of depressive symptoms among farmers.

  7. Residential traffic exposure and children's emergency department presentation for asthma: a spatial study

    Directory of Open Access Journals (Sweden)

    Pereira Gavin

    2009-11-01

    Full Text Available Abstract Background There is increasing evidence that residential proximity to roadways is associated with an elevated risk of asthma exacerbation. However, there is no consensus on the distance at which these health effects diminish to background levels. Therefore the optimal, clinically relevant measure of exposure remains uncertain. Using four spatially defined exposure metrics, we evaluated the association between residential proximity to roadways and emergency department (ED) presentation for asthma in Perth, Western Australia. Method The study population consisted of 1809 children aged between 0 and 19 years who had presented at an ED between 2002 and 2006 and were resident in a south-west metropolitan area of Perth traversed by major motorways. We used a 1:2 matched case-control study with gastroenteritis and upper limb injury as the control conditions. To estimate exposure to traffic emissions, we used 4 contrasting methods and 2 independently derived sources of traffic data (video-monitored traffic counts and those obtained from the state government road authority). The following estimates of traffic exposure were compared: (1) a point pattern method, (2) a distance-weighted traffic exposure method, (3) a simple distance method and (4) a road length method. Results Risk estimates were sensitive to socio-economic gradients and the type of exposure method that was applied. Unexpectedly, a range of apparent protective effects were observed for some exposure metrics. The kernel density measure demonstrated more than a 2-fold (OR 2.51, 95% CI 2.00 - 3.15) increased risk of asthma ED presentation for the high exposure group compared to the low exposure group. Conclusion We assessed exposure using traffic data from 2 independent sources and compared the results of 4 different exposure metric types. The results indicate that traffic congestion may be one of the most important aspects of traffic-related exposures, despite being overlooked in many...
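    A kernel-density style of exposure metric, of the kind compared in this record, can be sketched as follows. The Gaussian kernel, the 150 m bandwidth, and the coordinate/traffic-count inputs are illustrative assumptions; the study's exact kernel specification is not given in the abstract:

    ```python
    import math

    def kernel_density_exposure(home, roads, bandwidth=150.0):
        """Gaussian-kernel-weighted sum of traffic counts around a residence.

        home:  (x, y) residence coordinates in metres
        roads: list of (x, y, traffic_count) points along nearby roads
        Nearby heavy traffic dominates; distant roads contribute little.
        """
        total = 0.0
        for x, y, count in roads:
            d2 = (home[0] - x) ** 2 + (home[1] - y) ** 2
            total += count * math.exp(-d2 / (2.0 * bandwidth ** 2))
        return total

    # A road point at the residence contributes its full count; one a
    # bandwidth away is down-weighted by exp(-0.5).
    print(kernel_density_exposure((0.0, 0.0), [(0.0, 0.0, 100.0), (150.0, 0.0, 50.0)]))
    ```

    Exposure groups (e.g. the "high exposure" tertile in the record) can then be formed by ranking residences on this continuous score.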

  8. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.

  9. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  10. Common Metrics for Human-Robot Interaction

    Science.gov (United States)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  11. Optimising work practices to minimise the radiation exposure of PET radiopharmacists

    International Nuclear Information System (INIS)

    Hickson, K.; Chan, G.; O'Keefe, G.; Young, K.; Tochon-Danguy, H.; Poniger, S.; Scott, A.

    2010-01-01

    Full text: The recent installation of a new medical cyclotron at Austin Health has given justification to install an automatic radiopharmaceutical dispenser. We aimed to evaluate the effectiveness of the automatic radiopharmaceutical dispenser on the radiation exposure of the PET radiopharmacist. The radiation measurements performed can be divided into two distinct examinations: a survey of the ambient radiation levels and an estimation of the personal radiation dose to the radiopharmacist. Shielding around the automatic dispenser was modified and radiation levels were then compared pre and post optimisation. Using real-time monitoring methods, the yearly projected radiation dose to the radiopharmacist for FDG production was found to be approximately 4.1 mSv per year for whole body exposure, with a dose of 221 mSv per year to the hands. The radiation dose burden from all duties was recorded using TLDs and was found to be 4.5 and 321 mSv for whole body and hand radiation doses, respectively. Since the implementation of an automatic radiopharmaceutical dose dispenser, radiation exposure recorded to the hands by TLD measurements has fallen by 39%. Further optimisation has seen the ambient radiation levels fall by 15%. Conclusion: It has been shown that by reviewing work practices, radiation exposure continues to remain below the radiation dose constraints required by law. Continuing optimisation and review ensures that radiation exposure is kept as low as reasonably achievable. (author)

  12. Protective aprons in imaging departments: manufacturer stated lead equivalence values require validation

    International Nuclear Information System (INIS)

    Finnerty, M.; Brennan, P.C.

    2005-01-01

    The composition of protective aprons worn by X-ray personnel to shield against secondary radiation is changing. Lead is being replaced by either lead-free or composite (lead with other high atomic numbered elements) materials. These newer aprons are categorised by manufacturers in terms of lead equivalent values, but it is unclear how these stated values compare with actual lead equivalent values. In this work, the actual lead equivalence of 41 protective aprons from four manufacturers, all specified as having 0.25 mm lead equivalence, was investigated with transmission experiments at 70 and 100 kVp. All aprons were in current use. The aprons were screened for defects, and age, weight and design were recorded along with details of associated quality assurance (QA). Of the 41 protective aprons examined for actual lead equivalence, 73% were outside tolerance levels, with actual levels in some aprons demonstrating less than half of the nominal values. The lack of compatibility between actual and nominal lead equivalent values was demonstrated by aprons from three of the four manufacturers investigated. The areas of the defects found on screening of the protective aprons were within recommendations. The results highlight the need for acceptance and ongoing checks of protective aprons to ensure that radiation exposure of imaging personnel is kept to a minimum. (orig.)
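    A transmission measurement of the kind described here maps directly to an "actual" lead equivalence. The sketch below assumes simple exponential (narrow-beam) attenuation with an illustrative effective attenuation coefficient for lead; real broad-beam apron assessments use tabulated transmission curves rather than a single coefficient:

    ```python
    import math

    # Assumed effective linear attenuation coefficient of lead (per mm)
    # at the beam quality of interest - an illustrative value only.
    MU_LEAD_PER_MM = 6.0

    def lead_equivalence_mm(measured_transmission, mu=MU_LEAD_PER_MM):
        """Lead thickness whose exponential transmission matches the apron's."""
        return -math.log(measured_transmission) / mu

    # An apron transmitting exp(-1.5) ~ 22.3% of the beam behaves like
    # 1.5 / 6.0 = 0.25 mm of lead under this model.
    print(round(lead_equivalence_mm(math.exp(-1.5)), 2))
    ```

    Comparing this measured equivalence against the manufacturer's nominal 0.25 mm rating is the kind of acceptance check the record recommends.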

  13. Metric to quantify white matter damage on brain magnetic resonance images

    International Nuclear Information System (INIS)

    Valdes Hernandez, Maria del C.; Munoz Maniega, Susana; Anblagan, Devasuda; Bastin, Mark E.; Wardlaw, Joanna M.; Chappell, Francesca M.; Morris, Zoe; Sakka, Eleni; Dickie, David Alexander; Royle, Natalie A.; Armitage, Paul A.; Deary, Ian J.

    2017-01-01

    Quantitative assessment of white matter hyperintensities (WMH) on structural Magnetic Resonance Imaging (MRI) is challenging. It is important to harmonise results from different software tools, considering not only the volume but also the signal intensity. Here we propose and evaluate a metric of white matter (WM) damage that addresses this need. We obtained WMH and normal-appearing white matter (NAWM) volumes from brain structural MRI from community-dwelling older individuals and stroke patients enrolled in three different studies, using two automatic methods followed by manual editing by two to four observers blind to each other. We calculated the average intensity values on brain structural fluid-attenuated inversion recovery (FLAIR) MRI for the NAWM and WMH. The white matter damage metric is calculated as the proportion of WMH in brain tissue weighted by the relative image contrast of the WMH to NAWM. The new metric was evaluated using tissue microstructure parameters and visual ratings of small vessel disease burden and WMH: the Fazekas score for WMH burden and the Prins scale for WMH change. The correlation between the WM damage metric and the visual rating scores (Spearman ρ ≥ 0.74, p < 0.0001) was stronger than that between the visual scores and WMH volume (ρ = 0.72, p < 0.0001). The repeatability of the WM damage metric was better than that of WM volume (average median difference between measurements 3.26% (IQR 2.76%) and 5.88% (IQR 5.32%), respectively). The follow-up WM damage was highly related to total Prins score even when adjusted for baseline WM damage (ANCOVA, p < 0.0001), which was not always the case for WMH volume, as total Prins was highly associated with the change in the intense WMH volume (p = 0.0079, increase of 4.42 ml per unit change in total Prins, 95% CI [1.17, 7.67]), but not with the change in the less intense, subtle WMH, which determined the volumetric change. The new metric is practical and simple to calculate. It is robust to variations in image processing methods and scanning protocols, and sensitive to subtle and severe white...
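    The metric as described, the WMH proportion of brain tissue weighted by the WMH-to-NAWM contrast, can be written out directly. This is a sketch of the stated definition; the paper's exact normalisation may differ:

    ```python
    def wm_damage(wmh_vol, brain_vol, mean_wmh, mean_nawm):
        """White matter damage metric: proportion of WMH in brain tissue
        weighted by the relative FLAIR contrast of WMH to NAWM.
        Volumes in the same units; intensities are mean FLAIR values."""
        proportion = wmh_vol / brain_vol
        contrast = mean_wmh / mean_nawm        # > 1 when WMH are brighter
        return proportion * contrast

    # e.g. 20 ml of WMH in a 1200 ml brain, WMH 1.5x brighter than NAWM
    print(round(wm_damage(20.0, 1200.0, 150.0, 100.0), 6))
    ```

    Because the contrast term scales the volume fraction, two scans with equal WMH volume but different lesion intensity yield different damage scores, which is the harmonisation the record aims for.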

  14. Narrowing the Gap Between QoS Metrics and Web QoE Using Above-the-fold Metrics

    OpenAIRE

    da Hora, Diego Neves; Asrese, Alemnew; Christophides, Vassilis; Teixeira, Renata; Rossi, Dario

    2018-01-01

    International audience; Page load time (PLT) is still the most common application Quality of Service (QoS) metric to estimate the Quality of Experience (QoE) of Web users. Yet, recent literature abounds with proposals for alternative metrics (e.g., Above The Fold, SpeedIndex and variants) that aim at better estimating user QoE. The main purpose of this work is thus to thoroughly investigate a mapping between established and recently proposed objective metrics and user QoE. We obtain ground tr...

  15. "Body Practices--Exposure and Effect of a Sporting Culture?" "Stories from Three Australian Swimmers"

    Science.gov (United States)

    McMahon, Jenny; Penney, Dawn; Dinan-Thompson, Maree

    2012-01-01

    This paper contributes to sport, sociology and the body literature by exploring the "exposure and effect" of culture, in particular bodily practices placed on three adolescent swimmers immersed in the Australian swimming culture using an ethnographic framework. The research reported is particularly notable as it addresses two distinct…

  16. Wetland habitat disturbance best predicts metrics of an amphibian index of biotic integrity

    Science.gov (United States)

    Stapanian, Martin A.; Micacchion, Mick; Adams, Jean V.

    2015-01-01

    Regression and classification trees were used to identify the best predictors of the five component metrics of the Ohio Amphibian Index of Biotic Integrity (AmphIBI) in 54 wetlands in Ohio, USA. Of the 17 wetland- and surrounding landscape-scale variables considered, the best predictor for all AmphIBI metrics was habitat alteration and development within the wetland. The results were qualitatively similar to the best predictors for a wetland vegetation index of biotic integrity, suggesting that similar management practices (e.g., reducing or eliminating nutrient enrichment from agriculture, mowing, grazing, logging, and removing down woody debris) within the boundaries of the wetland can be applied to effectively increase the quality of wetland vegetation and amphibian communities.

  17. The risk equivalent of an exposure to-, versus a dose of radiation

    International Nuclear Information System (INIS)

    Bond, V.P.

    1986-01-01

    The long-term potential carcinogenic effects of low-level exposure (LLE) are addressed. The principal point discussed is the linear, no-threshold dose-response curve. That the linear no-threshold, or proportional, relationship is widely used is evident in the way values for cancer risk coefficients are expressed: in terms of new cases, per million persons exposed, per year, per unit exposure or dose. This implies that the underlying relationship is proportional, i.e., ''linear, without threshold''. 12 refs., 9 figs., 1 tab

  18. Ionizing radiation as a source of both occupational and public exposure. Is there any difference between them? Conclusions for radiation protection practice

    International Nuclear Information System (INIS)

    Vassilev, G.

    2000-01-01

    The assessment of the radiation risk from both natural and occupational exposure is discussed, taking into account values for Bulgaria at the end of the 20th century. The natural background exposure in the country averages 2.3 mSv/a. The external exposure in different regions varies within a range of ±25%. The radon concentration in dwellings varies over a wide range, from 2.5 to 250 Bq/m³, with a geometric mean of 22 Bq/m³ (equilibrium equivalent concentration). Thus the natural exposure varies from 1.0 to 5 mSv/a. The occupational exposure in different fields is as follows: medicine, 1.0 mSv/a; science and education, 0.9 mSv/a; NPP workers, 2.0 mSv/a. Taking into account that in this way the risk equalizes for each population group, the following conclusions are made: the necessity of accounting for individual natural background exposure when occupational risk is assessed; the building of an adequate national system for recording and limiting exposure from medical use of ionizing radiation; improvement of the system for limiting radon exposure; re-examination of dosimetry control for occupational exposure; and reconsideration of the social compensation for radiation risk

  19. Factor structure of the Tomimatsu-Sato metrics

    International Nuclear Information System (INIS)

    Perjes, Z.

    1989-02-01

    Based on an earlier result stating that δ = 3 Tomimatsu-Sato (TS) metrics can be factored over the field of integers, an analogous representation for higher TS metrics was sought. It is shown that the factoring property of TS metrics follows from the structure of special Hankel determinants. A set of linear algebraic equations determining the factors was defined, and the factors of the first five TS metrics were tabulated, together with their primitive factors. (R.P.) 4 refs.; 2 tabs

  20. Determination of dose equivalent with tissue-equivalent proportional counters

    International Nuclear Information System (INIS)

    Dietze, G.; Schuhmacher, H.; Menzel, H.G.

    1989-01-01

    Low pressure tissue-equivalent proportional counters (TEPC) are instruments based on the cavity chamber principle and provide spectral information on the energy loss of single charged particles crossing the cavity. Hence such detectors measure absorbed dose or kerma and are able to provide estimates on radiation quality. During recent years TEPC based instruments have been developed for radiation protection applications in photon and neutron fields. This was mainly based on the expectation that the energy dependence of their dose equivalent response is smaller than that of other instruments in use. Recently, such instruments have been investigated by intercomparison measurements in various neutron and photon fields. Although their principles of measurements are more closely related to the definition of dose equivalent quantities than those of other existing dosemeters, there are distinct differences and limitations with respect to the irradiation geometry and the determination of the quality factor. The application of such instruments for measuring ambient dose equivalent is discussed. (author)

  1. ST-intuitionistic fuzzy metric space with properties

    Science.gov (United States)

    Arora, Sahil; Kumar, Tanuj

    2017-07-01

    In this paper, we define the ST-intuitionistic fuzzy metric space, and the notions of convergence and of completeness of Cauchy sequences are studied. Further, we prove some properties of ST-intuitionistic fuzzy metric spaces. Finally, we introduce the concept of a symmetric ST-intuitionistic fuzzy metric space.

  2. Examination of types of exposure and management methods for nurses in interventional radiology

    International Nuclear Information System (INIS)

    Mori, Hiroshige; Fujii, Tomonori; Koshida, Kichiro; Ichikawa, Katsuhiro

    2007-01-01

    Although a large number of studies have been done on exposure of operators and doctors during interventional radiology (IVR), there have been very few reports on nurses. This study was carried out to clarify the exposure situation for nurses, and provides examples of how to estimate and manage it. We measured space dose-rate distributions with an ionization survey meter, and personal exposure dose with small fluorescent glass dosimeters (Dose Ace). The experimental results disclosed that there tended to be two types of exposure depending on the task performed. The head and neck (collar level) were associated with the highest exposure dose in nurses assisting operators. Alternatively, the knees showed the highest exposure dose in nurses observing and assisting the patient. When an estimate of skin equivalent dose at the knees is needed, it can be calculated from the value measured at the collar level. Furthermore, in estimating exposure dose, the directional and energy characteristics of personal dosimeters should be adequately considered. For radiation management, a circular protective sheet can be placed around the patient's lower area and a protective screen near the patient's head, and basic and practical education can be given. We concluded that these measures are highly useful for the personal monitoring of nurses engaged in IVR. (author)

  3. Equivalent statistics and data interpretation.

    Science.gov (United States)

    Francis, Gregory

    2017-08-01

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
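    The equivalence claim for the two-sample t test can be illustrated concretely: with known sample sizes, the t statistic and Cohen's d are deterministic transforms of one another, so each carries exactly the same information. A minimal sketch using the standard pooled-SD conversion (the function names are mine, not the paper's):

    ```python
    import math

    def t_to_d(t, n1, n2):
        """Cohen's d implied by a two-sample t statistic (pooled SD)."""
        return t * math.sqrt(1.0 / n1 + 1.0 / n2)

    def d_to_t(d, n1, n2):
        """Invert the mapping: given the sample sizes, d determines t exactly."""
        return d / math.sqrt(1.0 / n1 + 1.0 / n2)

    t, n1, n2 = 2.5, 20, 20
    d = t_to_d(t, n1, n2)
    assert abs(d_to_t(d, n1, n2) - t) < 1e-12  # lossless round trip
    print(round(d, 4))
    ```

    The same invertibility holds for the p value and, given a prior specification, for the JZS Bayes factor, which is the sense in which the paper calls these statistics equivalent.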

  4. Pragmatic security metrics applying metametrics to information security

    CERN Document Server

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics.Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  5. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    Directory of Open Access Journals (Sweden)

    Amzal Billy

    2011-02-01

    Full Text Available Abstract Background Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results Statistical methods are described for the assessment of the difference between a genetically modified (GM plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set.
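The distinction between difference and equivalence testing described above can be sketched with a single confidence interval checked against both zero and a pair of equivalence limits. This is a generic TOST-style illustration, not the paper's mixed-model procedure with three equivalence tests and four classes; the limits and the z value are assumptions for the sketch.

```python
def classify(diff, se, eq_lo, eq_hi, z=1.96):
    """Classify an observed GM-vs-reference difference.

    diff, se     -- estimated difference and its standard error
    eq_lo, eq_hi -- equivalence limits derived from reference varieties
    Returns (significantly_different, equivalence_verdict).
    """
    lo, hi = diff - z * se, diff + z * se
    different = lo > 0 or hi < 0              # CI excludes zero
    if eq_lo <= lo and hi <= eq_hi:
        verdict = "equivalent"                # CI entirely inside limits
    elif hi < eq_lo or lo > eq_hi:
        verdict = "not equivalent"            # CI entirely outside limits
    else:
        verdict = "inconclusive"              # CI straddles a limit
    return different, verdict
```

Note how a result can be statistically different from zero yet still equivalent (a small, precisely estimated difference), which is exactly the practical distinction the paper draws.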

  6. Defining a Progress Metric for CERT RMM Improvement

    Science.gov (United States)

    2017-09-14

    REV-03.18.2016.0 Defining a Progress Metric for CERT-RMM Improvement Gregory Crabb Nader Mehravari David Tobar September 2017 TECHNICAL ...fendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware...implementation metric would be the percentage of users who have received anti-phishing training. • Effectiveness/efficiency metrics measure whether

  7. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Many software and IT projects fail to meet their objectives for a variety of causes, among which poor project management carries considerable weight. For projects to succeed, lessons learned have to be used, historical data collected, and metrics and indicators computed and compared with past projects to keep failures from recurring. This paper presents some metrics that can be used for IT project management.

  8. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements for a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify the areas in which improvement would increase competitiveness the most and enable a more efficient transition to mass customization.

  9. Metrical Phonology: German Sound System.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  10. Construction of Einstein-Sasaki metrics in D≥7

    International Nuclear Information System (INIS)

    Lue, H.; Pope, C. N.; Vazquez-Poritz, J. F.

    2007-01-01

    We construct explicit Einstein-Kaehler metrics in all even dimensions D=2n+4≥6, in terms of a 2n-dimensional Einstein-Kaehler base metric. These are cohomogeneity-2 metrics which have the new feature of including a NUT-type parameter, or gravomagnetic charge, in addition to mass and rotation parameters. Using a canonical construction, these metrics all yield Einstein-Sasaki metrics in dimensions D=2n+5≥7. As is commonly the case in this type of construction, for suitable choices of the free parameters the Einstein-Sasaki metrics can extend smoothly onto complete and nonsingular manifolds, even though the underlying Einstein-Kaehler metric has conical singularities. We discuss some explicit examples in the case of seven-dimensional Einstein-Sasaki spaces. These new spaces can provide supersymmetric backgrounds in M theory, which play a role in the AdS4/CFT3 correspondence

  11. A Metric on Phylogenetic Tree Shapes.

    Science.gov (United States)

    Colijn, C; Plazzotta, G

    2018-01-01

    The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
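As a sketch of how a recursive characterization can underpin such a metric, the following assigns every rooted binary tree shape a unique integer label (a leaf maps to 1; an internal node whose children carry labels a ≥ b maps to a(a−1)/2 + b + 1), in the spirit of the labeling this line of work builds on. This is an illustrative reimplementation under that assumed scheme, not the authors' code, and a real metric would compare vectors of these labels rather than single integers.

```python
def shape_label(tree):
    """Unique integer label for a rooted binary tree shape.

    Trees are nested 2-tuples; a leaf is the empty tuple ().
    Two trees get the same label iff they have the same shape.
    """
    if tree == ():
        return 1
    a, b = sorted((shape_label(tree[0]), shape_label(tree[1])), reverse=True)
    # Pairing function: distinct unordered label pairs (a, b) with a >= b
    # map to distinct integers, so the labeling is a bijection on shapes.
    return a * (a - 1) // 2 + b + 1
```

For example, a cherry ((), ()) gets label 2, and the three-leaf caterpillar (((), ()), ()) gets label 3; because the labeling is injective on shapes, any distance between label collections distinguishes differently shaped trees.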

  12. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements with a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any can be implemented in their software assurance life cycle process.

  13. Personal dose equivalent conversion coefficients for electrons to 1 GeV.

    Science.gov (United States)

    Veinot, K G; Hertel, N E

    2012-04-01

    In a previous paper, conversion coefficients for the personal dose equivalent, Hp(d), for photons were reported. This note reports values for electrons calculated using similar techniques. The personal dose equivalent is the quantity used to approximate the protection quantity effective dose when performing personal dosemeter calibrations, and in practice the personal dose equivalent is determined using a 30×30×15 cm slab-type phantom. Conversion coefficients to 1 GeV have been calculated for Hp(10), Hp(3) and Hp(0.07) in the recommended slab phantom. Although the conversion coefficients were determined for discrete incident energies, analytical fits of the conversion coefficients over the energy range are provided, using a similar formulation as in the photon results previously reported. The conversion coefficients for the personal dose equivalent are compared with the appropriate protection quantity, calculated according to the recommendations of the latest International Commission on Radiological Protection guidance. Effects of eyewear on Hp(3) are also discussed.

  14. Assessment of Human Exposure to ENMs.

    Science.gov (United States)

    Jiménez, Araceli Sánchez; van Tongeren, Martie

    2017-01-01

    Human exposure assessment of engineered nanomaterials (ENMs) is hampered, among other factors, by the difficulty of differentiating ENMs from other nanomaterials (incidental to processes or naturally occurring) and the lack of a single metric that can be used for health risk assessment. It is important that the exposure assessment is carried out throughout the entire life-cycle, as releases can occur at different stages of the product life-cycle, from the synthesis and manufacture of the nano-enabled product (occupational exposure) to the professional and consumer use of the nano-enabled product (consumer exposure) and at the end of life. Occupational exposure surveys should follow a tiered approach, increasing in complexity in the instruments used and sampling strategy applied at higher tiers, in order to tailor the exposure assessment to the specific materials used and workplace exposure scenarios and to reduce uncertainty in the assessment of exposure. Assessment of consumer exposure and of releases from end-of-life processes currently relies on release testing of nano-enabled products in laboratory settings.

  15. Statistics and assessment of individual dose from occupational exposure in nuclear industry (1985-1990)

    International Nuclear Information System (INIS)

    Wang Yunfang; Yang Lianzhen

    1993-01-01

    The summary and main results of individual dose monitoring (1985-1990) from occupational exposure in nuclear industry are presented. The statistical results show that the annual collective dose equivalents from external exposure to workers in six plants and institutes in 1985-1990 are 29.88, 26.95, 19.16, 14.26, 9.08 and 9.22 man·Sv, respectively. The annual average dose equivalents are 4.98, 4.66, 3.65, 2.79, 2.40 and 2.27 mSv, respectively. The general situation for individual dose monitoring from internal exposure is briefly introduced. The internal exposure dose from uranium, plutonium and tritium in some facilities is given. The annual average committed effective dose equivalents are less than 5.0 mSv. The individual dose monitoring results for occupational exposure from uranium mining are depicted. The individual dose monitoring data are analysed preliminarily.

  16. Prioritizing Chemicals and Data Requirements for Screening-Level Exposure and Risk Assessment

    Science.gov (United States)

    Brown, Trevor N.; Wania, Frank; Breivik, Knut; McLachlan, Michael S.

    2012-01-01

    Background: Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. Objectives: We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. Methods: We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Results: Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Conclusions: Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner. PMID:23008278

  17. Prioritizing chemicals and data requirements for screening-level exposure and risk assessment.

    Science.gov (United States)

    Arnot, Jon A; Brown, Trevor N; Wania, Frank; Breivik, Knut; McLachlan, Michael S

    2012-11-01

    Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner.
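The variance-propagation idea in this record can be sketched with a toy two-factor model: if log(intake) = log(emission) + log(intake fraction) and the inputs are independent, the output variance is the sum of the input variances, so the most uncertain input (here, emission rate) dominates the exposure uncertainty, matching the abstract's conclusion. The lognormal spreads below are invented for illustration; the paper uses a full multimedia mass-balance model, not this two-factor toy.

```python
import random

def propagated_log_variance(n=100_000, seed=1):
    """Monte Carlo propagation of input uncertainty through
    log(intake) = log(emission) + log(intake fraction)."""
    rng = random.Random(seed)
    logs = []
    for _ in range(n):
        log_emission = rng.gauss(0.0, 1.0)   # ~1 order-of-magnitude spread (assumed)
        log_if = rng.gauss(-6.0, 0.5)        # intake fraction, tighter spread (assumed)
        logs.append(log_emission + log_if)   # log intake = sum of the logs
    mean = sum(logs) / n
    # Sample variance of log intake; for independent inputs this
    # approaches 1.0**2 + 0.5**2 = 1.25, dominated by the emission term.
    return sum((x - mean) ** 2 for x in logs) / n
```

Ranking inputs by their contribution to this variance is the screening logic the abstract describes: reducing the emission-rate uncertainty pays off four times more than reducing the intake-fraction uncertainty in this toy setup.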

  18. Degraded visual environment image/video quality metrics

    Science.gov (United States)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.

  19. Intercomparison of personnel dosimetry for thermal neutron dose equivalent in neutron and gamma-ray mixed fields

    International Nuclear Information System (INIS)

    Ogawa, Yoshihiro

    1985-01-01

    In order to consider the problems concerned with personnel dosimetry using film badges and TLDs, an intercomparison of personnel dosimetry, especially dose equivalent responses of personnel dosimeters to thermal neutron, was carried out in five different neutron and gamma-ray mixed fields at KUR and UTR-KINKI from the practical point of view. For the estimation of thermal neutron dose equivalent, it may be concluded that each personnel dosimeter has good performances in the precision, that is, the standard deviations in the measured values by individual dosimeter were within 24 %, and the dose equivalent responses to thermal neutron were almost independent on cadmium ratio and gamma-ray contamination. However, the relative thermal neutron dose equivalent of individual dosimeter normalized to the ICRP recommended value varied considerably and a difference of about 4 times was observed among the dosimeters. From the results obtained, it is suggested that the standardization of calibration factors and procedures is required from the practical point of radiation protection and safety. (author)

  20. Hybrid Air Quality Modeling Approach for use in the Near-road Exposures to Urban air pollutant Study (NEXUS)

    Science.gov (United States)

    The paper presents a hybrid air quality modeling approach and its application in NEXUS in order to provide spatially and temporally varying exposure estimates and identification of the mobile-source contribution to total pollutant exposure. Model-based exposure metrics, associa...

  1. The Jacobi metric for timelike geodesics in static spacetimes

    Science.gov (United States)

    Gibbons, G. W.

    2016-01-01

    It is shown that the free motion of massive particles moving in static spacetimes is given by the geodesics of an energy-dependent Riemannian metric on the spatial sections, analogous to Jacobi's metric in classical dynamics. In the massless limit Jacobi's metric coincides with the energy-independent Fermat or optical metric. For stationary metrics, it is known that the motion of massless particles is given by the geodesics of an energy-independent Finslerian metric of Randers type. The motion of massive particles is governed by neither a Riemannian nor a Finslerian metric. The properties of the Jacobi metric for massive particles moving outside the horizon of a Schwarzschild black hole are described. By contrast with the massless case, the Gaussian curvature of the equatorial sections is not always negative.
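The construction described in this abstract can be written out explicitly. For a static spacetime whose metric is split into lapse and spatial parts, a particle of mass m and conserved energy E follows the geodesics of the energy-dependent Jacobi metric (transcribed here in standard notation; sign and factor conventions should be checked against the paper):

```latex
ds^2 = -V^2\,dt^2 + g_{ij}\,dx^i\,dx^j ,
\qquad
J_{ij} = \frac{E^2 - m^2 V^2}{V^2}\, g_{ij}
```

For m → 0 this reduces to E² g_ij / V², which is the optical (Fermat) metric up to a constant conformal factor, as the abstract states. For the Schwarzschild case discussed, V² = 1 − 2M/r outside the horizon, so the Jacobi metric degenerates where E² = m²V², i.e. at the turning points of the motion.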

  2. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

    The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics - with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues, with 87% coverage.

  3. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Vol. 2008, No. 37 (2008), p. 58. ISSN 0301-0066. [European Conference on Visual Perception, 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords: visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  4. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

    Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of Earth observation systems able to provide military and government users with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  5. Radiation practices and radiation measurements

    International Nuclear Information System (INIS)

    2008-03-01

    The guide presents the principal requirements on the accuracy of radiation measurements and on the approval, calibration and operating condition inspections of radiation meters, together with requirements for dosimetric services measuring the individual radiation doses of workers engaged in radiation work (approved dosimetric services). The Guide also sets out the definitions of quantities and units used in radiation measurements. The radiation protection quantities used for assessing the harmful effects of radiation and for expressing the maximum values for radiation exposure (equivalent dose and effective dose) are set out in Guide ST 7.2. This Guide concerns measurements of ionizing radiation involved in radiation practices, the results of which are used for determining the radiation exposure of workers engaged in radiation work and members of the public, and of patients subject to the use of radiation in health services, or upon the basis of which compliance with the safety requirements of appliances currently in use and of their premises of use or of the workplaces of workers is ensured. The Guide also concerns measurements of the radon concentration of inhaled air in both workplaces and dwellings. The Guide does not apply to determining the radiation exposure of aircrews, determination of exposure caused by internal radiation, or measurements made to protect the public in the event of, or in preparation for, abnormal radiation conditions.

  6. Using plutonium excretion data to predict dose from chronic and acute exposures

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Wilde, J.L.; Slaughter, D.M.

    2000-01-01

    Using fission track analysis (FTA) in conjunction with a composite theoretical model of the transport of plutonium (Pu) in the human body creates a new opportunity to estimate the exposure and dose to the general population due to plutonium in the environment. For the purposes of this study, data derived from FTA performed at the University of Utah's Center for Excellence in Nuclear Technology, Engineering and Research (CENTER) has been used to predict doses for two populations. Both population groups have no known history of plutonium exposures. Therefore, two exposure scenarios (acute and chronic) were assumed to provide boundaries for dose estimates. Dose predictions focus on equivalent dose to lung, liver, and skeletal systems and range from 0.01 mSv to 560 mSv as a function of organ, sample collection interval and exposure type. Additionally, these reconstructions demonstrate the sensitivity of dose calculations to time of sample collection and duration of exposure. As anticipated for a class Y particle, the predicted average equivalent tissue dose to the lungs represents the highest dose to the evaluated compartments. Furthermore, the data imply that the general population receives a dose one order of magnitude lower than a radiation worker with no history of exposure for the equivalent exposure scenario. (author)

  7. The equivalence principle and the gravitational constant in experimental relativity

    International Nuclear Information System (INIS)

    Spallicci, A.D.A.M.

    1988-01-01

    Fischbach's analysis of the Eötvös experiment, showing an embedded fifth force, has stressed the importance of further tests of the Equivalence Principle (EP). From Galilei and Newton, the EP played the role of a postulate for all gravitational physics and mechanics (weak EP), until Einstein, who extended the validity of the EP to all physics (strong EP). After Fischbach's publication on the fifth force, several experiments have been performed or simply proposed to test the WEP. They are concerned with possible gravitational potential anomalies, depending upon distances or matter composition. While the low level of accuracy with which the gravitational constant G is known has been recognized, experiments have been proposed to test G in the range from a few cm up to 200 m. This paper highlights the different features of the proposed space experiments. Possible implications for the metric formalism for objects in low potential and slow motion are briefly indicated

  8. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  9. Exposure mode study to xenon-133 in a reactor building

    International Nuclear Information System (INIS)

    Perier, Aurelien

    2014-01-01

    The work described in this thesis focuses on external and internal dose assessment for xenon-133. During nuclear reactor operation, fission products and radioactive inert gases, such as 133Xe, are generated and may be responsible for the exposure of workers in case of a clad defect. Particle Monte Carlo transport codes are used in radiation protection to quantify dosimetric quantities. The study of exposure to xenon-133 is conducted using Monte Carlo simulations based on GEANT4, an anthropomorphic phantom, a realistic geometry of the reactor building, and compartmental models. The external exposure inside a reactor building is studied with a realistic and conservative exposure scenario. The effective dose rate and the eye lens equivalent dose rate are determined by Monte Carlo simulations. Due to the particular emission spectrum of xenon-133, the equivalent dose rate to the lens of the eye is discussed in the light of the expected new eye dose limits. Internal exposure occurs when xenon-133 is inhaled. The lungs are the first organ exposed by inhalation, and their equivalent dose rate is obtained by Monte Carlo simulations. A biokinetic model is used to evaluate the internal exposure to xenon-133. This thesis gives a better understanding of the dosimetric quantities related to external and internal exposure to xenon-133. Moreover, the impact of the dosimetric changes is studied against the current and future dose limits. The dosimetric quantities obtained are lower than the current and future dose limits. (author)

  10. Effective dose efficiency: an application-specific metric of quality and dose for digital radiography

    Energy Technology Data Exchange (ETDEWEB)

    Samei, Ehsan; Ranger, Nicole T; Dobbins, James T III; Ravin, Carl E, E-mail: samei@duke.edu [Carl E Ravin Advanced Imaging Laboratories, Department of Radiology (United States)

    2011-08-21

    The detective quantum efficiency (DQE) and the effective DQE (eDQE) are relevant metrics of image quality for digital radiography detectors and systems, respectively. The current study further extends the eDQE methodology to technique optimization using a new metric of the effective dose efficiency (eDE), reflecting both the image quality as well as the effective dose (ED) attributes of the imaging system. Using phantoms representing pediatric, adult and large adult body habitus, image quality measurements were made at 80, 100, 120 and 140 kVp using the standard eDQE protocol and exposures. ED was computed using Monte Carlo methods. The eDE was then computed as a ratio of image quality to ED for each of the phantom/spectral conditions. The eDQE and eDE results showed the same trends across tube potential, with 80 kVp yielding the highest values and 120 kVp yielding the lowest. The eDE results for the pediatric phantom were markedly lower than the results for the adult phantom at spatial frequencies lower than 1.2-1.7 mm⁻¹, primarily due to a correspondingly higher value of ED per entrance exposure. The relative performance for the adult and large adult phantoms was generally comparable but affected by kVp. The eDE results for the large adult configuration were lower than the eDE results for the adult phantom across all spatial frequencies (120 and 140 kVp) and at spatial frequencies greater than 1.0 mm⁻¹ (80 and 100 kVp). Demonstrated for chest radiography, the eDE shows promise as an application-specific metric of imaging performance, reflective of body habitus and radiographic technique, with utility for radiography protocol assessment and optimization.
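The eDE defined in this record is simply a ratio of measured image quality to effective dose, evaluated per spatial frequency, which makes technique comparison mechanical once the inputs are measured. The sketch below uses invented numbers and hypothetical function names; in the study the eDQE curves are measured and the ED values come from Monte Carlo.

```python
def effective_dose_efficiency(edqe_curve, effective_dose):
    """eDE(f) = eDQE(f) / ED: image quality per unit effective dose,
    keyed by spatial frequency f (in mm^-1)."""
    return {f: q / effective_dose for f, q in edqe_curve.items()}

def best_technique(edqe_by_kvp, ed_by_kvp, freq):
    """Pick the tube potential with the highest eDE at one spatial
    frequency (a toy selection rule; the study compares full curves)."""
    return max(edqe_by_kvp,
               key=lambda kvp: edqe_by_kvp[kvp][freq] / ed_by_kvp[kvp])
```

For example, with an assumed eDQE of 0.30 at 1.0 mm⁻¹ for 80 kVp (ED 0.10 mSv) versus 0.25 for 120 kVp (ED 0.12 mSv), the 80 kVp technique wins, consistent with the trend the abstract reports.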

  11. Non-metric close range photogrammetric system for mapping geologic structures in mines

    Energy Technology Data Exchange (ETDEWEB)

    Brandow, V D

    1976-01-01

    A stereographic close-range photogrammetric method of obtaining structural data for mine roof stability analyses is described. Stereo pairs were taken with 70 mm and 35 mm non-metric cameras. Photo co-ordinates were measured with a stereo-comparator and reduced by the direct linear transformation method. Field trials demonstrate that the technique is sufficiently accurate for geological work and is a practical method of mapping.

  12. Implications of Metric Choice for Common Applications of Readmission Metrics

    OpenAIRE

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  13. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  14. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate...... regression by minimising a cross-validation estimate of the generalisation error. This allows to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...

  15. Human population exposure to cosmic radiation

    International Nuclear Information System (INIS)

    Bouville, A.; Lowder, W.M.

    1988-01-01

Critical evaluations of existing data on cosmic radiation in the atmosphere and in interplanetary space have been carried out in order to estimate the exposure of the world's population to this important component of natural background radiation. Data on population distribution and mean terrain heights on a 1 x 1 degree grid have been folded in to estimate regional and global dose distributions. The per caput annual dose equivalent at ground altitudes is estimated to be 270 μSv from charged particles and 50 μSv from neutrons. More than 100 million people receive more than 1 mSv in a year, and two million in excess of 5 mSv. Aircraft flight crews and frequent flyers receive an additional annual dose equivalent in the order of 1 mSv, though the global per caput annual dose equivalent from airplane flights is only about 1 μSv. Future space travellers on extended missions are likely to receive dose equivalents in the range 0.1-1 Sv, with the possibility of higher doses at relatively high dose rates from unusually large solar flares. These results indicate a critical need for a better understanding of the biological significance of chronic neutron and heavy charged particle exposure. (author)

  16. The meaning and the principle of determination of the effective dose equivalent in radiation protection

    International Nuclear Information System (INIS)

    Drexler, G.; Williams, G.; Zankl, M.

    1985-01-01

    Since the introduction of the quantity ''effective dose equivalent'' within the framework of new radiation concepts, the meaning and interpretation of the quantity is often discussed and debated. Because of its adoption as a limiting quantity in many international and national laws, it is necessary to be able to interpret this main radiation protection quantity. Examples of organ doses and the related Hsub(E) values in occupational and medical exposures are presented and the meaning of the quantity is considered for whole body exposures to external and internal photon sources, as well as for partial body external exposures to photons. (author)

  17. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards.

    Science.gov (United States)

    Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Haken, Randall Ten

    2017-01-01

Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.

  18. Trends in sunburns, sun protection practices, and attitudes toward sun exposure protection and tanning among US adolescents, 1998-2004.

    Science.gov (United States)

    Cokkinides, Vilma; Weinstock, Martin; Glanz, Karen; Albano, Jessica; Ward, Elizabeth; Thun, Michael

    2006-09-01

    Sun exposure in childhood is an important risk factor for developing skin cancer as an adult. Despite extensive efforts to reduce sun exposure among the young, there are no population-based data on trends in sunburns and sun protection practices in the young. The aim of this study was to describe nationally representative trend data on sunburns, sun protection, and attitudes related to sun exposure among US youth. Cross-sectional telephone surveys of youth aged 11 to 18 years in 1998 (N = 1196) and in 2004 (N = 1613) were conducted using a 2-stage sampling process to draw population-based samples. The surveys asked identical questions about sun protection, number of sunburns experienced, and attitudes toward sun exposure. Time trends were evaluated using pooled logistic regression analysis. In 2004, 69% of subjects reported having been sunburned during the summer, not significantly less than in 1998 (72%). There was a significant decrease in the percentage of those aged 11 to 15 years who reported sunburns and a nonsignificant increase among the 16- to 18-year-olds. The proportion of youth who reported regular sunscreen use increased significantly from 31% to 39%. Little change occurred in other recommended sun protection practices. A small reduction in sunburn frequency and modest increases in sun protection practices were observed among youth between 1998 and 2004, despite widespread sun protection campaigns. Nevertheless, the decrease in sunburns among younger teens may be cause for optimism regarding future trends. Overall, there was rather limited progress in improving sun protection practices and reducing sunburns among US youth between 1998 and 2004.

  19. Radiation exposure to skin following radioactive contamination

    International Nuclear Information System (INIS)

    Baumann, H.; Beyermann, M.; Kraus, W.

    1989-01-01

    In the case of skin contamination intensive decontamination measures should not be carried out until the potential radiation exposure to the basal cell layer of the epidermis was assessed. Dose equivalent rates from alpha-, beta- or photon-emitting contaminants were calculated with reference to the surface activity for different skin regions as a function of radiation energy on the condition that the skin was healthy and uninjured and the penetration of contaminants through the epidermis negligible. The results have been presented in the form of figures and tables. In the assessment of potential skin doses, both radioactive decay and practical experience as to the decrease in the level of surface contamination by natural desquamation of the stratum corneum were taken into account. 9 figs., 5 tabs., 46 refs. (author)

  20. The implications of carbon dioxide and methane exchange for the heavy mitigation RCP2.6 scenario under two metrics

    International Nuclear Information System (INIS)

    Huntingford, Chris; Lowe, Jason A.; Howarth, Nicholas; Bowerman, Niel H.A.; Gohar, Laila K.; Otto, Alexander; Lee, David S.; Smith, Stephen M.; Elzen, Michel G.J. den; Vuuren, Detlef P. van; Millar, Richard J.; Allen, Myles R.

    2015-01-01

Highlights: • Exchanging methane for carbon dioxide emissions affects peak global warming. • Economic constraints severely affect exchange possibilities. • The chosen metric determines whether it is economic to eliminate all removable methane emissions. • If all methane emissions could be removed, this could aid meeting the two-degree warming target. - Abstract: Greenhouse gas emissions associated with Representative Concentration Pathway RCP2.6 could limit global warming to around or below a 2 °C increase since pre-industrial times. However, this scenario implies very large and rapid reductions in both carbon dioxide (CO 2 ) and non-CO 2 emissions, and suggests a need to understand the available flexibility between how different greenhouse gases might be abated. There is a growing interest in developing a greater understanding of the particular role of shorter-lived non-CO 2 gases as abatement options. We address this here through a sensitivity study of different methane (CH 4 ) emissions pathways to year 2100 and beyond, by including exchanges with CO 2 emissions, and with a focus on related climate and economic advantages and disadvantages. Metrics exist that characterise gas equivalence in terms of climate change effect per tonne emitted. We analyse the implications of CO 2 and CH 4 emission exchanges under two commonly considered metrics, the 100-yr Global Warming Potential (GWP-100) and Global Temperature Potential (GTP-100), whilst keeping CO 2 -equivalent emissions pathways fixed, based on the standard set of emissions usually associated with RCP2.6. An idealised situation of anthropogenic CH 4 emissions being reduced to zero across a period of two decades, with the implementation of such cuts starting almost immediately, gives lower warming than for standard RCP2.6 emissions during the 21st and 22nd Century. This is despite exchanging for higher CO 2 emissions. Introducing Marginal Abatement Cost (MAC) curves provides an economic assessment of alternative gas
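The exchange the abstract describes can be sketched in a few lines: a methane cut is converted to a CO2 "headroom" via the chosen metric, so the metric value directly scales how much extra CO2 the exchange permits. The metric values below are illustrative, roughly AR5-era figures; the paper uses its own metric definitions:

```python
# Illustrative sketch of a CO2/CH4 exchange: converting a methane emission
# cut into the extra CO2 allowed while keeping total CO2-equivalent
# emissions fixed under a chosen metric. Metric values are illustrative
# (roughly AR5-era), not taken from the paper.

METRICS = {"GWP-100": 28.0, "GTP-100": 4.0}  # kg CO2-eq per kg CH4 (illustrative)

def co2_headroom(ch4_cut_mt, metric):
    """Extra CO2 (Mt) permitted when CH4 emissions fall by ch4_cut_mt Mt,
    holding CO2-equivalent emissions constant under the chosen metric."""
    return ch4_cut_mt * METRICS[metric]

cut = 100.0  # Mt CH4 removed
headroom_gwp = co2_headroom(cut, "GWP-100")  # 2800.0 Mt CO2
headroom_gtp = co2_headroom(cut, "GTP-100")  # 400.0 Mt CO2
```

The factor-of-seven gap between the two headrooms is why the abstract stresses that the chosen metric determines whether eliminating methane emissions is economic.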

  1. Measurement Of Lead Equivalent Thickness For Irradiation Room: An Analysis

    International Nuclear Information System (INIS)

    Mohd Khalid Matori; Azuhar Ripin; Husaini Salleh; Mohd Khairusalih Mohd Zin; Muhammad Jamal Muhd Isa; Mohd Faizal Abdul Rahman

    2014-01-01

    The Malaysian Ministry of Health (MOH) has established that the irradiation room must have a sufficient thickness of shielding to ensure that requirements for the purpose of radiation protection of patients, employees and the public are met. This paper presents a technique using americium-241 source to test and verify the integrity of the shielding thickness in term of lead equivalent for irradiation room at health clinics own by MOH. Results of measurement of 8 irradiation rooms conducted in 2014 were analyzed for this presentation. Technical comparison of the attenuation of gamma rays from Am-241 source through the walls of the irradiation room and pieces of lead were used to assess the lead equivalent thickness of the walls. Results showed that almost all the irradiation rooms tested meet the requirements of the Ministry of Health and is suitable for the installation of the intended diagnostic X-ray apparatus. Some specific positions such as door knobs and locks, electrical plug sockets were identified with potential to not met the required lead equivalent thickness hence may contribute to higher radiation exposure to workers and the public. (author)

  2. Equivalent Dynamic Models.

    Science.gov (United States)

    Molenaar, Peter C M

    2017-01-01

    Equivalences of two classes of dynamic models for weakly stationary multivariate time series are discussed: dynamic factor models and autoregressive models. It is shown that exploratory dynamic factor models can be rotated, yielding an infinite set of equivalent solutions for any observed series. It also is shown that dynamic factor models with lagged factor loadings are not equivalent to the currently popular state-space models, and that restriction of attention to the latter type of models may yield invalid results. The known equivalent vector autoregressive model types, standard and structural, are given a new interpretation in which they are conceived of as the extremes of an innovating type of hybrid vector autoregressive models. It is shown that consideration of hybrid models solves many problems, in particular with Granger causality testing.

  3. Energy-Based Metrics for Arthroscopic Skills Assessment.

    Science.gov (United States)

    Poursartip, Behnaz; LeBel, Marie-Eve; McCracken, Laura C; Escoto, Abelardo; Patel, Rajni V; Naish, Michael D; Trejos, Ana Luisa

    2017-08-05

    Minimally invasive skills assessment methods are essential in developing efficient surgical simulators and implementing consistent skills evaluation. Although numerous methods have been investigated in the literature, there is still a need to further improve the accuracy of surgical skills assessment. Energy expenditure can be an indication of motor skills proficiency. The goals of this study are to develop objective metrics based on energy expenditure, normalize these metrics, and investigate classifying trainees using these metrics. To this end, different forms of energy consisting of mechanical energy and work were considered and their values were divided by the related value of an ideal performance to develop normalized metrics. These metrics were used as inputs for various machine learning algorithms including support vector machines (SVM) and neural networks (NNs) for classification. The accuracy of the combination of the normalized energy-based metrics with these classifiers was evaluated through a leave-one-subject-out cross-validation. The proposed method was validated using 26 subjects at two experience levels (novices and experts) in three arthroscopic tasks. The results showed that there are statistically significant differences between novices and experts for almost all of the normalized energy-based metrics. The accuracy of classification using SVM and NN methods was between 70% and 95% for the various tasks. The results show that the normalized energy-based metrics and their combination with SVM and NN classifiers are capable of providing accurate classification of trainees. The assessment method proposed in this study can enhance surgical training by providing appropriate feedback to trainees about their level of expertise and can be used in the evaluation of proficiency.
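The normalization step described above (dividing each energy metric by the value for an ideal performance) can be sketched as follows; names and sample numbers are illustrative, not the study's data:

```python
# Minimal sketch of the normalization described above: each energy-based
# metric (e.g. mechanical energy, work) is divided by the corresponding
# value from an ideal reference performance, so a score near 1 indicates
# near-ideal motion. All names and numbers are illustrative.

def normalize_metrics(measured, ideal):
    """Divide each measured energy metric by its ideal-performance value."""
    return {name: measured[name] / ideal[name] for name in measured}

trainee = {"kinetic_energy_J": 4.2, "work_J": 9.6}
ideal   = {"kinetic_energy_J": 2.0, "work_J": 8.0}
scores = normalize_metrics(trainee, ideal)
```

In the study, vectors of such normalized scores are then fed to SVM or neural-network classifiers to label a performance as novice or expert.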

  4. Principle of space existence and De Sitter metric

    International Nuclear Information System (INIS)

    Mal'tsev, V.K.

    1990-01-01

    The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g ik ≠ 0) only in the presence of matter (T ik ≠0). This selection principle (principle of space existence, in the Markov terminology) implies, in the general case, the absence of the cosmological solution with the De Sitter metric. On the other hand, the De Sitter metric is necessary for describing both inflation and deflation periods of the Universe. It is shown that the De Sitter metric is also allowed by the selection principle under discussion if the metric experiences the evolution into the Friedmann metric

  5. What can article-level metrics do for you?

    Science.gov (United States)

    Fenner, Martin

    2013-10-01

    Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of example from ALM data collected for PLOS Biology.

  6. About the possibility of a generalized metric

    International Nuclear Information System (INIS)

    Lukacs, B.; Ladik, J.

    1991-10-01

The metric (the structure of the space-time) may be dependent on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and the usual one between them; the change be sudden in the neighbourhood of these scales; the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs

  7. Global spatially explicit CO2 emission metrics at 0.25° horizontal resolution for forest bioenergy

    Science.gov (United States)

    Cherubini, F.

    2015-12-01

Bioenergy is the most important renewable energy option in studies designed to align with future RCP projections, reaching approximately 250 EJ/yr in RCP2.6, 145 EJ/yr in RCP4.5 and 180 EJ/yr in RCP8.5 by the end of the 21st century. However, many questions enveloping the direct carbon cycle and climate response to bioenergy remain partially unexplored. Bioenergy systems are largely assessed under the default climate neutrality assumption and the time lag between CO2 emissions from biomass combustion and CO2 uptake by vegetation is usually ignored. Emission metrics of CO2 from forest bioenergy are only available on a case-specific basis and their quantification requires processing of a wide spectrum of modelled or observed local climate and forest conditions. On the other hand, emission metrics are widely used to aggregate climate impacts of greenhouse gases to common units such as CO2-equivalents (CO2-eq.), but a spatially explicit analysis of emission metrics with global forest coverage is today lacking. Examples of emission metrics include the global warming potential (GWP), the global temperature change potential (GTP) and the absolute sustained emission temperature (aSET). Here, we couple a global forest model, a heterotrophic respiration model, and a global climate model to produce global spatially explicit emission metrics for CO2 emissions from forest bioenergy. We show their applications to global emissions in 2015 and until 2100 under the different RCP scenarios. We obtain global average values of 0.49 ± 0.03 kgCO2-eq. kgCO2-1 (mean ± standard deviation), 0.05 ± 0.05 kgCO2-eq. kgCO2-1, and 2.14·10-14 ± 0.11·10-14 °C (kg yr-1)-1 for GWP, GTP and aSET, respectively. We also present results aggregated at a grid, national and continental level. The metrics are found to correlate with the site-specific turnover times and local climate variables like annual mean temperature and precipitation. Simplified

  8. Investigation of in-vehicle speech intelligibility metrics for normal hearing and hearing impaired listeners

    Science.gov (United States)

    Samardzic, Nikolina

The effectiveness of in-vehicle speech communication can be a good indicator of the perception of the overall vehicle quality and customer satisfaction. Currently available speech intelligibility metrics do not account in their procedures for essential parameters needed for a complete and accurate evaluation of in-vehicle speech intelligibility. These include the directivity and the distance of the talker with respect to the listener, binaural listening, hearing profile of the listener, vocal effort, and multisensory hearing. In the first part of this research the effectiveness of in-vehicle application of these metrics is investigated in a series of studies to reveal their shortcomings, including a wide range of scores resulting from each of the metrics for a given measurement configuration and vehicle operating condition. In addition, the nature of a possible correlation between the scores obtained from each metric is unknown. The metrics and the subjective perception of speech intelligibility using, for example, the same speech material have not been compared in the literature. As a result, in the second part of this research, an alternative method for speech intelligibility evaluation is proposed for use in the automotive industry by utilizing a virtual reality driving environment for ultimately setting targets, including the associated statistical variability, for future in-vehicle speech intelligibility evaluation. The Speech Intelligibility Index (SII) was evaluated at the sentence Speech Reception Threshold (sSRT) for various listening situations and hearing profiles using acoustic perception jury testing and a variety of talker and listener configurations and background noise. In addition, the effect of individual sources and transfer paths of sound in an operating vehicle to the vehicle interior sound, specifically their effect on speech intelligibility, was quantified in the framework of the newly developed speech intelligibility evaluation method.

  9. System equivalent model mixing

    Science.gov (United States)

    Klaassen, Steven W. B.; van der Seijs, Maarten V.; de Klerk, Dennis

    2018-05-01

    This paper introduces SEMM: a method based on Frequency Based Substructuring (FBS) techniques that enables the construction of hybrid dynamic models. With System Equivalent Model Mixing (SEMM) frequency based models, either of numerical or experimental nature, can be mixed to form a hybrid model. This model follows the dynamic behaviour of a predefined weighted master model. A large variety of applications can be thought of, such as the DoF-space expansion of relatively small experimental models using numerical models, or the blending of different models in the frequency spectrum. SEMM is outlined, both mathematically and conceptually, based on a notation commonly used in FBS. A critical physical interpretation of the theory is provided next, along with a comparison to similar techniques; namely DoF expansion techniques. SEMM's concept is further illustrated by means of a numerical example. It will become apparent that the basic method of SEMM has some shortcomings which warrant a few extensions to the method. One of the main applications is tested in a practical case, performed on a validated benchmark structure; it will emphasize the practicality of the method.

  10. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  11. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics are proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  12. THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-04-01

Emerging metrics based on the article level do not exclude traditional metrics based on citations to the journal, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article-level metrics in publishing scientific papers is described. ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, as these counts are reflected only through Thomson Reuters' Web of Science® database. The JIF provides an indicator related to the journal, but not to an individual published paper. Thus, altmetrics are becoming an alternative means of performance assessment for individual scientists and their scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is a way to increase their visibility and impact in the world of science.

  13. Metric to quantify white matter damage on brain magnetic resonance images

    Energy Technology Data Exchange (ETDEWEB)

    Valdes Hernandez, Maria del C.; Munoz Maniega, Susana; Anblagan, Devasuda; Bastin, Mark E.; Wardlaw, Joanna M. [University of Edinburgh, Department of Neuroimaging Sciences, Centre for Clinical Brain Sciences, Edinburgh (United Kingdom); University of Edinburgh, Centre for Cognitive Ageing and Cognitive Epidemiology, Edinburgh (United Kingdom); UK Dementia Research Institute, Edinburgh Dementia Research Centre, London (United Kingdom); Chappell, Francesca M.; Morris, Zoe; Sakka, Eleni [University of Edinburgh, Department of Neuroimaging Sciences, Centre for Clinical Brain Sciences, Edinburgh (United Kingdom); UK Dementia Research Institute, Edinburgh Dementia Research Centre, London (United Kingdom); Dickie, David Alexander; Royle, Natalie A. [University of Edinburgh, Department of Neuroimaging Sciences, Centre for Clinical Brain Sciences, Edinburgh (United Kingdom); University of Edinburgh, Centre for Cognitive Ageing and Cognitive Epidemiology, Edinburgh (United Kingdom); Armitage, Paul A. [University of Sheffield, Department of Cardiovascular Sciences, Sheffield (United Kingdom); Deary, Ian J. [University of Edinburgh, Centre for Cognitive Ageing and Cognitive Epidemiology, Edinburgh (United Kingdom); University of Edinburgh, Department of Psychology, Edinburgh (United Kingdom)

    2017-10-15

Quantitative assessment of white matter hyperintensities (WMH) on structural Magnetic Resonance Imaging (MRI) is challenging. It is important to harmonise results from different software tools considering not only the volume but also the signal intensity. Here we propose and evaluate a metric of white matter (WM) damage that addresses this need. We obtained WMH and normal-appearing white matter (NAWM) volumes from brain structural MRI from community dwelling older individuals and stroke patients enrolled in three different studies, using two automatic methods followed by manual editing by two to four observers blind to each other. We calculated the average intensity values on brain structural fluid-attenuated inversion recovery (FLAIR) MRI for the NAWM and WMH. The white matter damage metric is calculated as the proportion of WMH in brain tissue weighted by the relative image contrast of the WMH-to-NAWM. The new metric was evaluated using tissue microstructure parameters and visual ratings of small vessel disease burden and WMH: Fazekas score for WMH burden and Prins scale for WMH change. The correlation between the WM damage metric and the visual rating scores (Spearman ρ ≥ 0.74, p < 0.0001) was slightly stronger than between the latter and WMH volumes (Spearman ρ ≥ 0.72, p < 0.0001). The repeatability of the WM damage metric was better than WM volume (average median difference between measurements 3.26% (IQR 2.76%) and 5.88% (IQR 5.32%), respectively). The follow-up WM damage was highly related to total Prins score even when adjusted for baseline WM damage (ANCOVA, p < 0.0001), which was not always the case for WMH volume, as total Prins was highly associated with the change in the intense WMH volume (p = 0.0079, increase of 4.42 ml per unit change in total Prins, 95% CI [1.17, 7.67]), but not with the change in less-intense, subtle WMH, which determined the volumetric change. The new metric is practical and simple to calculate. It is robust to variations in
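One plausible reading of the metric's definition (WMH volume fraction of brain tissue, weighted by the WMH-to-NAWM FLAIR contrast) can be sketched as follows; the paper's exact formula may differ, and all numbers are illustrative:

```python
# Sketch of one plausible reading of the white matter damage metric
# described above: the WMH volume fraction of brain tissue, weighted by
# the relative FLAIR intensity of WMH versus normal-appearing white
# matter (NAWM). Illustrative only; not the paper's exact formula.

def wm_damage(wmh_vol_ml, brain_vol_ml, wmh_intensity, nawm_intensity):
    volume_fraction = wmh_vol_ml / brain_vol_ml
    relative_contrast = wmh_intensity / nawm_intensity
    return volume_fraction * relative_contrast

# Hypothetical case: 15 ml WMH in 1200 ml brain tissue, WMH 1.5x brighter
d = wm_damage(wmh_vol_ml=15.0, brain_vol_ml=1200.0,
              wmh_intensity=180.0, nawm_intensity=120.0)
```

Weighting by contrast is what lets the metric track intense WMH change, which the volume alone missed in the follow-up analysis.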

  14. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

Supplier selection is an important component of supply chain management in today's global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision-making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of a spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
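The point that the choice of metric function can change TOPSIS results is easy to demonstrate with the plain (non-fuzzy) closeness computation and a pluggable distance; data below are illustrative, and the paper additionally works with intuitionistic fuzzy sets and a spherical metric:

```python
# Sketch of the TOPSIS closeness computation with a pluggable distance
# metric, illustrating that the metric choice (Euclidean vs. Manhattan)
# can change supplier rankings. Illustrative data; the paper additionally
# uses intuitionistic fuzzy sets and a spherical metric function.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def topsis_closeness(alternatives, ideal, anti_ideal, dist):
    """Closeness = d(anti-ideal) / (d(ideal) + d(anti-ideal)); higher is better."""
    result = {}
    for name, vec in alternatives.items():
        d_pos = dist(vec, ideal)
        d_neg = dist(vec, anti_ideal)
        result[name] = d_neg / (d_pos + d_neg)
    return result

suppliers = {"A": [0.8, 0.6], "B": [0.5, 0.9]}   # normalized criteria values
ideal, anti_ideal = [1.0, 1.0], [0.0, 0.0]
by_euclidean = topsis_closeness(suppliers, ideal, anti_ideal, euclidean)
by_manhattan = topsis_closeness(suppliers, ideal, anti_ideal, manhattan)
```

For these data the Manhattan metric scores A and B identically (both 0.7), while the Euclidean metric ranks A above B, which is exactly the sensitivity to metric choice the abstract warns about.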

  15. 77 FR 12832 - Non-RTO/ISO Performance Metrics; Commission Staff Request Comments on Performance Metrics for...

    Science.gov (United States)

    2012-03-02

    ... Performance Metrics; Commission Staff Request Comments on Performance Metrics for Regions Outside of RTOs and... performance communicate about the benefits of RTOs and, where appropriate, (2) changes that need to be made to... common set of performance measures for markets both within and outside of ISOs/RTOs. As recommended by...

  16. Regional Sustainability: The San Luis Basin Metrics Project

    Science.gov (United States)

    There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...

  17. Metrics to assess injury prevention programs for young workers in high-risk occupations: a scoping review of the literature

    Directory of Open Access Journals (Sweden)

    Jennifer Smith

    2018-05-01

    Introduction: Despite legal protections for young workers in Canada, youth aged 15–24 are at high risk of traumatic occupational injury. While many injury prevention initiatives targeting young workers exist, the challenge faced by youth advocates and employers is deciding which aspect(s) of prevention will be the most effective focus for their efforts. A review of the academic and grey literatures was undertaken to compile the metrics (both the indicators being evaluated and the methods of measurement) commonly used to assess injury prevention programs for young workers. Metrics are standards of measurement through which the efficiency, performance, progress, or quality of a plan, process, or product can be assessed. Methods: A PICO framework was used to develop search terms. Medline, PubMed, OVID, EMBASE, CCOHS, PsychINFO, CINAHL, NIOSHTIC, Google Scholar and the grey literature were searched for articles in English published between 1975 and 2015. Two independent reviewers screened the resulting list and categorized the metrics into three domains of injury prevention: Education, Environment and Enforcement. Results: Of 174 acquired articles meeting the inclusion criteria, 21 both described and assessed an intervention. Half were educational in nature (N=11). Commonly assessed metrics included knowledge, perceptions, self-reported behaviours or intentions, hazardous exposures, injury claims, and injury counts. One study outlined a method for developing metrics to predict injury rates. Conclusion: Metrics specific to the evaluation of young worker injury prevention programs are needed, as current metrics are insufficient to predict reduced injuries following program implementation. One study brought to light by the review could serve as a model for future research to develop valid leading metrics specific to young workers and then apply them to injury prevention programs for youth.

  18. Measurement equivalence of the CES-D 8 depression-scale among the ageing population in eleven European countries.

    Science.gov (United States)

    Missinne, Sarah; Vandeviver, Christophe; Van de Velde, Sarah; Bracke, Piet

    2014-07-01

    Depression is one of the most prevalent mental disorders in later life. However, despite considerable research attention, great confusion remains regarding the association between ageing and depression, and there is doubt as to whether a depression scale performs identically across age groups and countries. Although measurement equivalence is a crucial prerequisite for valid comparisons across age groups and countries, it has not been established for the eight-item version of the Centre for Epidemiological Studies Depression Scale (CES-D8). Using multi-group confirmatory factor analysis, we assess configural, metric, and scalar measurement equivalence across two age groups (50-64 years of age and 65 or older) in eleven European countries, employing data from the Survey of Health, Ageing and Retirement in Europe (SHARE). Results indicate that the construct of depression is comparable across age and country groups, allowing the substantive interpretation of correlates and mean levels of depressive symptoms.

  19. On the relativity and equivalence principles in the gauge theory of gravitation

    International Nuclear Information System (INIS)

    Ivanenko, D.; Sardanashvily, G.

    1981-01-01

    The basic ideas of gauge gravitation theory are still not generally accepted, despite more than twenty years of history. The chief reason is that the gauge character of gravity is connected with the whole complex of problems of Einstein's General Relativity: the definition of reference systems, the (3+1)-splitting, the presence (or absence) of symmetries in GR, the necessity (or triviality) of general covariance, and the meaning of the equivalence principle, which led Einstein from Special to General Relativity |1|. The continuing relevance of this complex of interconnected problems is demonstrated by the well-known work of V. Fock, who saw no symmetries in General Relativity, declared the equivalence principle unnecessary, and even proposed substituting the designation ''chronogeometry'' for ''general relativity'' (see also P. Havas). Developing this line, H. Bondi quite recently also expressed doubts about the ''relativity'' in Einstein's theory of gravitation. Any proposed version of gauge gravitation theory must clarify the discrepancy between the Einstein gravitational field, which is a pseudo-Riemannian metric field, and gauge potentials, which represent connections on fiber bundles: there exists no group whose gauging would lead to the purely gravitational part of the connection (the Christoffel symbols or the Fock-Ivanenko-Weyl spinorial coefficients). (author)

  20. Eating habits and internal radiation exposures in Japanese

    International Nuclear Information System (INIS)

    Shiraishi, Kunio

    1995-01-01

    Recently, the annual dose equivalent for the Japanese population was estimated to be 3.75 mSv. Medical radiation exposures (2.25 mSv/y) and exposures from natural sources of radiation (1.48 mSv/y) were the major contributors to this dose. Dietary intakes of both natural and man-made radionuclides are directly related to internal exposures. In this paper, internal doses received only through ingestion of radionuclides in food are described; internal doses through inhalation have been excluded. First, representative radionuclide intakes for the Japanese population were estimated from the literature. Second, annual dose equivalents were calculated from the intakes of individual radionuclides and the weighted committed dose equivalents (Sv/Bq) of International Commission on Radiological Protection Publication 30. Total annual doses from natural and man-made sources were estimated as 0.35 mSv and 0.001 mSv, respectively. Furthermore, the effects of imported foods on internal dose in the Japanese population were calculated on a preliminary basis, because the contribution of imported foods to Japanese eating habits is increasing annually and will not be negligible when assessing internal dose in the near future. (author)
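The dose-summation step described above (annual intake in Bq multiplied by a committed dose coefficient in Sv/Bq, summed over nuclides) can be sketched as follows. The nuclides and numeric values are illustrative placeholders, not the paper's data or actual ICRP Publication 30 coefficients.

```python
# Hypothetical annual ingestion intakes (Bq/y) for a few nuclides.
intakes = {"K-40": 30000, "Po-210": 70, "Cs-137": 50}

# Hypothetical committed dose coefficients (Sv per Bq ingested);
# illustrative values only, not taken from ICRP Pub. 30.
dose_coeff = {"K-40": 5.0e-9, "Po-210": 6.0e-7, "Cs-137": 1.3e-8}

# Annual committed dose equivalent: sum of intake x coefficient, converted to mSv.
annual_dose_mSv = sum(intakes[n] * dose_coeff[n] for n in intakes) * 1e3
print(f"{annual_dose_mSv:.3f} mSv/y")
```

Assessing the effect of imported foods then amounts to splitting each intake into domestic and imported fractions and running the same sum over each fraction separately.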