WorldWideScience

Sample records for benchmark dose approach

  1. EPA's Benchmark Dose Modeling Software

    Science.gov (United States)

    The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods to EPA’s human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...

  2. Nonparametric estimation of benchmark doses in environmental risk assessment

    Science.gov (United States)

    Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen

    2013-01-01

    Summary An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
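
    A minimal sketch of the kind of calculation this abstract describes: an isotonic (monotone) fit to quantal dose-response data, a BMD read off at a pre-specified extra risk, and a bootstrap lower confidence limit. The dose grid, response counts, and BMR below are invented for illustration and are not from the cited study; the isotonic fit uses scikit-learn rather than the authors' own estimator.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Illustrative quantal data: dose levels, number tested, number responding (not from the study)
doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
n     = np.array([50, 50, 50, 50, 50])
y     = np.array([2, 4, 9, 20, 35])

BMR = 0.10  # benchmark response, defined here as extra risk

def bmd_isotonic(doses, y, n, bmr):
    """Fit a monotone dose-response curve and return the dose reaching extra risk `bmr`."""
    iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
    p_hat = iso.fit_transform(doses, y / n, sample_weight=n)
    p0 = p_hat[0]                              # background risk at dose 0
    target = p0 + bmr * (1.0 - p0)             # extra-risk definition of the benchmark response
    if target > p_hat.max():
        return np.nan                          # BMD lies beyond the tested dose range
    return np.interp(target, p_hat, doses)     # interpolate the monotone fit

bmd = bmd_isotonic(doses, y, n, BMR)

# Nonparametric bootstrap for a one-sided 95% lower confidence limit (BMDL)
rng = np.random.default_rng(1)
boot = []
for _ in range(2000):
    y_b = rng.binomial(n, y / n)               # resample responders within each dose group
    b = bmd_isotonic(doses, y_b, n, BMR)
    if np.isfinite(b):
        boot.append(b)
bmdl = np.percentile(boot, 5)

print(f"BMD  ≈ {bmd:.2f}")
print(f"BMDL ≈ {bmdl:.2f}")
```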

  3. Effects of exposure imprecision on estimation of the benchmark dose

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe

    2004-01-01

    In regression analysis, failure to adjust for imprecision in the exposure variable is likely to lead to underestimation of the exposure effect. However, the consequences of exposure error for determination of safe doses of toxic substances have so far not received much attention. The benchmark approach is one of the most widely used methods for development of exposure limits. An important advantage of this approach is that it can be applied to observational data. However, in this type of data, exposure markers are seldom measured without error. It is shown that, if the exposure error is ignored, then the benchmark approach produces results that are biased toward higher and less protective levels. It is therefore important to take exposure measurement error into account when calculating benchmark doses. Methods that allow this adjustment are described and illustrated in data from an epidemiological study...

  4. Impact of Genomics Platform and Statistical Filtering on Transcriptional Benchmark Doses (BMD) and Multiple Approaches for Selection of Chemical Point of Departure (PoD).

    Directory of Open Access Journals (Sweden)

    A Francina Webster

    Many regulatory agencies are exploring ways to integrate toxicogenomic data into their chemical risk assessments. The major challenge lies in determining how to distill the complex data produced by high-content, multi-dose gene expression studies into quantitative information. It has been proposed that benchmark dose (BMD) values derived from toxicogenomics data be used as point of departure (PoD) values in chemical risk assessments. However, there is limited information regarding which genomics platforms are most suitable and how to select appropriate PoD values. In this study, we compared BMD values modeled from RNA sequencing-, microarray-, and qPCR-derived gene expression data from a single study, and explored multiple approaches for selecting a single PoD from these data. The strategies evaluated include several that do not require prior mechanistic knowledge of the compound for selection of the PoD, thus providing approaches for assessing data-poor chemicals. We used RNA extracted from the livers of female mice exposed to non-carcinogenic (0, 2 mg/kg/day; mkd) and carcinogenic (4, 8 mkd) doses of furan for 21 days. We show that transcriptional BMD values were consistent across technologies and highly predictive of the two-year cancer bioassay-based PoD. We also demonstrate that filtering data based on statistically significant changes in gene expression prior to BMD modeling creates more conservative BMD values. Taken together, this case study on mice exposed to furan demonstrates that high-content toxicogenomics studies produce robust data for BMD modelling that are minimally affected by inter-technology variability and highly predictive of cancer-based PoD doses.

  5. Introduction to benchmark dose methods and U.S. EPA's benchmark dose software (BMDS) version 2.1.1

    International Nuclear Information System (INIS)

    Davis, J. Allen; Gift, Jeffrey S.; Zhao, Q. Jay

    2011-01-01

    Traditionally, the No-Observed-Adverse-Effect-Level (NOAEL) approach has been used to determine the point of departure (POD) from animal toxicology data for use in human health risk assessments. However, this approach is subject to substantial limitations that have been well defined, such as strict dependence on the dose selection, dose spacing, and sample size of the study from which the critical effect has been identified. Also, the NOAEL approach fails to take into consideration the shape of the dose-response curve and other related information. The benchmark dose (BMD) method, originally proposed as an alternative to the NOAEL methodology in the 1980s, addresses many of the limitations of the NOAEL method. It is less dependent on dose selection and spacing, and it takes into account the shape of the dose-response curve. In addition, the estimation of a BMD 95% lower bound confidence limit (BMDL) results in a POD that appropriately accounts for study quality (i.e., sample size). With the recent advent of user-friendly BMD software programs, including the U.S. Environmental Protection Agency's (U.S. EPA) Benchmark Dose Software (BMDS), BMD has become the method of choice for many health organizations world-wide. This paper discusses the BMD methods and corresponding software (i.e., BMDS version 2.1.1) that have been developed by the U.S. EPA, and includes a comparison with recently released European Food Safety Authority (EFSA) BMD guidance.
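
    For dichotomous data, the BMD described here is tied to the benchmark response (BMR) through the standard extra-risk definition; the equations below restate that convention in generic notation (they paraphrase common usage, not the paper's exact wording).

```latex
% Extra risk at dose d, for a fitted dose-response model P(d):
\mathrm{ER}(d) = \frac{P(d) - P(0)}{1 - P(0)}

% The BMD is the dose at which the extra risk equals the chosen BMR (commonly 0.05 or 0.10):
\mathrm{ER}(\mathrm{BMD}) = \mathrm{BMR}

% The BMDL, the one-sided 95% lower confidence limit on the BMD, is then used as the POD.
```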

  6. A Web-Based System for Bayesian Benchmark Dose Estimation.

    Science.gov (United States)

    Shao, Kan; Shapiro, Andrew J

    2018-01-11

    Benchmark dose (BMD) modeling is an important step in human health risk assessment and is used as the default approach to identify the point of departure for risk assessment. A probabilistic framework for dose-response assessment has been proposed and advocated by various institutions and organizations; therefore, a reliable tool is needed to provide distributional estimates for BMD and other important quantities in dose-response assessment. We developed an online system for Bayesian BMD (BBMD) estimation and compared results from this software with the U.S. Environmental Protection Agency's (EPA's) Benchmark Dose Software (BMDS). The system is built on a Bayesian framework featuring the application of Markov chain Monte Carlo (MCMC) sampling for model parameter estimation and BMD calculation, which makes the BBMD system fundamentally different from the currently prevailing BMD software packages. In addition to estimating the traditional BMDs for dichotomous and continuous data, the developed system is also capable of computing model-averaged BMD estimates. A total of 518 dichotomous and 108 continuous data sets extracted from the U.S. EPA's Integrated Risk Information System (IRIS) database (and similar databases) were used as testing data to compare the estimates from the BBMD and BMDS programs. The results suggest that the BBMD system may outperform the BMDS program in a number of aspects, including fewer failed BMD and BMDL calculations. The BBMD system is a useful alternative tool for estimating BMD, with additional functionalities for BMD analysis based on the most recent research. Most importantly, the BBMD has the potential to incorporate prior information to make dose-response modeling more reliable and can provide distributional estimates for important quantities in dose-response assessment, which greatly facilitates the current trend toward probabilistic risk assessment. https://doi.org/10.1289/EHP1289.
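
    The abstract describes MCMC-based estimation of a posterior BMD distribution. The sketch below is not the BBMD system; it is a minimal random-walk Metropolis sampler for a two-parameter log-logistic model on hypothetical dichotomous data, assumed here purely to show how a posterior BMD distribution (and a BMDL as its lower percentile) can be obtained.

```python
import numpy as np
from scipy.stats import binom

# Hypothetical dichotomous data: doses, group sizes, responders (illustrative only)
doses = np.array([0.0, 1.0, 3.0, 10.0])
n     = np.array([50, 50, 50, 50])
y     = np.array([1, 5, 15, 35])

BMR = 0.10  # extra risk

def prob(d, g, a, b):
    """Log-logistic dose-response with background g: P(d) = g + (1-g)/(1+exp(-(a + b*ln d)))."""
    p = np.full_like(d, g, dtype=float)
    pos = d > 0
    p[pos] = g + (1 - g) / (1 + np.exp(-(a + b * np.log(d[pos]))))
    return p

def log_post(theta):
    """Log-posterior with flat priors: g ~ Uniform(0,1), a flat, b > 0."""
    g, a, b = theta
    if not (0 < g < 1 and b > 0):
        return -np.inf
    p = np.clip(prob(doses, g, a, b), 1e-9, 1 - 1e-9)
    return binom.logpmf(y, n, p).sum()

# Random-walk Metropolis sampling
rng = np.random.default_rng(0)
theta = np.array([0.05, -2.0, 1.0])
lp = log_post(theta)
samples = []
for i in range(20000):
    prop = theta + rng.normal(scale=[0.02, 0.3, 0.2])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 5000:                      # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)

# Closed-form BMD for the log-logistic model at extra risk BMR, evaluated per posterior draw
g, a, b = samples.T
bmd_post = np.exp((np.log(BMR / (1 - BMR)) - a) / b)
print("posterior median BMD :", np.median(bmd_post))
print("BMDL (5th percentile):", np.percentile(bmd_post, 5))
```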

  7. Categorical Regression and Benchmark Dose Software 3.0

    Science.gov (United States)

    The objective of this full-day course is to provide participants with interactive training on the use of the U.S. Environmental Protection Agency’s (EPA) Benchmark Dose software (BMDS, version 3.0, released fall 2018) and Categorical Regression software (CatReg, version 3.1...

  8. Benchmarking

    OpenAIRE

    Meylianti S., Brigita

    1999-01-01

    Benchmarking has different meanings to different people. There are five types of benchmarking, namely internal benchmarking, competitive benchmarking, industry/functional benchmarking, process/generic benchmarking and collaborative benchmarking. Each type of benchmarking has its own advantages as well as disadvantages. Therefore it is important to know what kind of benchmarking is suitable to a specific application. This paper will discuss those five types of benchmarking in detail, includ...

  9. Application of the hybrid approach to the benchmark dose of urinary cadmium as the reference level for renal effects in cadmium polluted and non-polluted areas in Japan

    International Nuclear Information System (INIS)

    Suwazono, Yasushi; Nogawa, Kazuhiro; Uetani, Mirei; Nakada, Satoru; Kido, Teruhiko; Nakagawa, Hideaki

    2011-01-01

    Objectives: The aim of this study was to evaluate the reference level of urinary cadmium (Cd) that caused renal effects. An updated hybrid approach was used to estimate the benchmark doses (BMDs) and their 95% lower confidence limits (BMDL) in subjects with a wide range of exposure to Cd. Methods: The total number of subjects was 1509 (650 men and 859 women) in non-polluted areas and 3103 (1397 men and 1706 women) in the environmentally exposed Kakehashi river basin. We measured urinary cadmium (U-Cd) as a marker of long-term exposure, and β2-microglobulin (β2-MG) as a marker of renal effects. The BMD and BMDL that corresponded to an additional risk (BMR) of 5% were calculated with background risk at zero exposure set at 5%. Results: The U-Cd BMDL for β2-MG was 3.5 μg/g creatinine in men and 3.7 μg/g creatinine in women. Conclusions: The BMDL values for a wide range of U-Cd were generally within the range of values measured in non-polluted areas in Japan. This indicated that the hybrid approach is a robust method for different ranges of cadmium exposure. The present results may contribute further to recent discussions on health risk assessment of Cd exposure.
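
    The "hybrid" approach referred to here treats the continuous marker (urinary β2-MG) as abnormal above a cutoff fixed by the assumed background rate. The equations below restate the usual formulation with a 5% background rate (P0) and a 5% additional risk (BMR), in generic notation rather than the study's exact parameterization.

```latex
% Cutoff c chosen so that the probability of an abnormal response at zero exposure equals P_0:
P(0) = \Pr\{\,Y > c \mid d = 0\,\} = P_0 = 0.05

% With the marker Y assumed normal (possibly after log-transformation) around a dose-dependent mean:
P(d) = 1 - \Phi\!\left(\frac{c - \mu(d)}{\sigma}\right)

% BMD: the exposure at which the additional risk reaches the BMR; the BMDL is its 95% lower confidence limit
P(\mathrm{BMD}) - P(0) = \mathrm{BMR} = 0.05
```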

  10. Application of the hybrid approach to the benchmark dose of urinary cadmium as the reference level for renal effects in cadmium polluted and non-polluted areas in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Suwazono, Yasushi, E-mail: suwa@faculty.chiba-u.jp [Department of Occupational and Environmental Medicine, Graduate School of Medicine, Chiba University, 1-8-1 Inohana, Chuoku, Chiba 260-8670 (Japan); Nogawa, Kazuhiro; Uetani, Mirei [Department of Occupational and Environmental Medicine, Graduate School of Medicine, Chiba University, 1-8-1 Inohana, Chuoku, Chiba 260-8670 (Japan); Nakada, Satoru [Safety and Health Organization, Chiba University, 1-33 Yayoicho, Inageku, Chiba 263-8522 (Japan); Kido, Teruhiko [Department of Community Health Nursing, Kanazawa University School of Health Sciences, 5-11-80 Kodatsuno, Kanazawa, Ishikawa 920-0942 (Japan); Nakagawa, Hideaki [Department of Epidemiology and Public Health, Kanazawa Medical University, 1-1 Daigaku, Uchnada, Ishikawa 920-0293 (Japan)

    2011-02-15

    Objectives: The aim of this study was to evaluate the reference level of urinary cadmium (Cd) that caused renal effects. An updated hybrid approach was used to estimate the benchmark doses (BMDs) and their 95% lower confidence limits (BMDL) in subjects with a wide range of exposure to Cd. Methods: The total number of subjects was 1509 (650 men and 859 women) in non-polluted areas and 3103 (1397 men and 1706 women) in the environmentally exposed Kakehashi river basin. We measured urinary cadmium (U-Cd) as a marker of long-term exposure, and β2-microglobulin (β2-MG) as a marker of renal effects. The BMD and BMDL that corresponded to an additional risk (BMR) of 5% were calculated with background risk at zero exposure set at 5%. Results: The U-Cd BMDL for β2-MG was 3.5 μg/g creatinine in men and 3.7 μg/g creatinine in women. Conclusions: The BMDL values for a wide range of U-Cd were generally within the range of values measured in non-polluted areas in Japan. This indicated that the hybrid approach is a robust method for different ranges of cadmium exposure. The present results may contribute further to recent discussions on health risk assessment of Cd exposure.

  11. BENCHMARK DOSES FOR CHEMICAL MIXTURES: EVALUATION OF A MIXTURE OF 18 PHAHS.

    Science.gov (United States)

    Benchmark doses (BMDs), defined as doses of a substance that are expected to result in a pre-specified level of "benchmark" response (BMR), have been used for quantifying the risk associated with exposure to environmental hazards. The lower confidence limit of the BMD is used as...

  12. Dose Rate Experiment at JET for Benchmarking the Calculation Direct One Step Method

    International Nuclear Information System (INIS)

    Angelone, M.; Petrizzi, L.; Pillon, M.; Villari, R.; Popovichev, S.

    2006-01-01

    Neutrons produced by D-D and D-T plasmas induce the activation of tokamak materials and components. The development of reliable methods to assess dose rates is a key issue for maintaining and operating nuclear machines, in normal and off-normal conditions. In the frame of the EFDA Fusion Technology work programme, a computational tool based upon the MCNP Monte Carlo code has been developed to predict the dose rate after shutdown: the Direct One Step Method (D1S). The D1S is an innovative approach in which the decay gammas are coupled to the neutrons as in the prompt case and are transported in one single step in the same run. Benchmarking of this new tool against experimental data taken in a complex geometry like that of a tokamak is a fundamental step to test the reliability of the D1S method. A dedicated benchmark experiment was proposed for the 2005-2006 experimental campaign of JET. Two irradiation positions were selected for the benchmark: one inner position inside the vessel, not far from the plasma, called the 2 Upper irradiation end (IE2), where the neutron fluence is relatively high; the second position is just outside a vertical port, in an external position (EX), where the neutron flux is lower and the dose rate to be measured is not very far from the residual background. Passive detectors are used for in-vessel measurements: high-sensitivity Thermo Luminescent Dosimeters (TLDs), GR-200A (natural LiF), which ensure measurements down to environmental dose levels. An active detector of Geiger-Muller (GM) type is used for out-of-vessel dose rate measurement. Before their use the detectors were calibrated in a secondary gamma-ray standard (Cs-137 and Co-60) facility in terms of air-kerma. The background measurement was carried out in the period July-September 2005 in the outside position EX using the GM tube, and in September 2005 inside the vacuum vessel using TLD detectors located in the 2 Upper irradiation end IE2. In the present work

  13. Current modeling practice may lead to falsely high benchmark dose estimates.

    Science.gov (United States)

    Ringblom, Joakim; Johanson, Gunnar; Öberg, Mattias

    2014-07-01

    Benchmark dose (BMD) modeling is increasingly used as the preferred approach to define the point-of-departure for health risk assessment of chemicals. As data are inherently variable, there is always a risk of selecting a model that defines a lower confidence bound of the BMD (BMDL) that, contrary to expectation, exceeds the true BMD. The aim of this study was to investigate how often and under what circumstances such anomalies occur under current modeling practice. Continuous data were generated from a realistic dose-effect curve by Monte Carlo simulations using four dose groups and a set of five different dose placement scenarios, group sizes between 5 and 50 animals and coefficients of variation of 5-15%. The BMD calculations were conducted using nested exponential models, as most BMD software use nested approaches. "Non-protective" BMDLs (higher than the true BMD) were frequently observed, in some scenarios reaching 80%. The phenomenon was mainly related to the selection of the non-sigmoidal exponential model (Effect = a·e^(b·dose)). In conclusion, non-sigmoid models should be used with caution as they may underestimate the risk, illustrating that awareness of the model selection process and sound identification of the point-of-departure is vital for health risk assessment. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
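
    A compact way to see the reported phenomenon is to simulate continuous data from a saturating "true" curve, force-fit the non-sigmoidal exponential model Effect = a·e^(b·dose), and count how often a crude BMDL exceeds the true BMD. Everything below (the true curve, dose placement, group size, CV, the 5% relative-change BMR, and the delta-method lower bound) is an illustrative assumption and does not reproduce the authors' nested-model selection protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(42)

doses = np.array([0.0, 10.0, 50.0, 100.0])   # one illustrative dose-placement scenario
n_per_group = 10
cv = 0.10                                    # residual coefficient of variation
q = 0.05                                     # BMR: 5% decrease in the mean relative to control

def true_curve(d):
    """Saturating (Hill-type) 'true' dose-effect curve used to generate the data."""
    return 100.0 * (1.0 - 0.4 * d / (d + 10.0))

# True BMD: dose at which the mean has dropped by q relative to control (solves 0.4*d/(d+10) = q)
true_bmd = q * 10.0 / (0.4 - q)

def expo(d, a, b):
    """Non-sigmoidal exponential model: Effect = a * exp(b * dose)."""
    return a * np.exp(b * d)

n_nonprotective, n_fitted, n_sim = 0, 0, 2000
for _ in range(n_sim):
    d_obs = np.repeat(doses, n_per_group)
    mu = true_curve(d_obs)
    y = rng.normal(mu, cv * mu)
    try:
        (a_hat, b_hat), cov = curve_fit(expo, d_obs, y, p0=[100.0, -0.005], maxfev=5000)
    except RuntimeError:
        continue
    se_b = np.sqrt(cov[1, 1])
    b_low = b_hat - 1.645 * se_b             # crude one-sided 95% bound on the slope (delta method)
    if b_low >= 0:
        continue                             # no estimable downward slope
    n_fitted += 1
    bmdl = np.log(1.0 - q) / b_low           # exponential-model BMD = ln(1 - q) / b
    if bmdl > true_bmd:
        n_nonprotective += 1

print(f"true BMD ≈ {true_bmd:.2f}")
print(f"'non-protective' BMDLs (BMDL > true BMD): {100 * n_nonprotective / n_fitted:.1f}% of fits")
```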

  14. Benchmarking

    OpenAIRE

    Beretta Sergio; Dossi Andrea; Grove Hugh

    2000-01-01

    Due to their particular nature, benchmarking methodologies tend to exceed the boundaries of management techniques and to enter the territory of managerial culture. A culture that is also destined to break into the accounting area, not only strongly supporting the possibility of fixing targets and of measuring and comparing performance (an aspect that is already innovative and worthy of attention), but also questioning one of the principles (or taboos) of the accounting or...

  15. Residual Generation for the Ship Benchmark Using Structural Approach

    DEFF Research Database (Denmark)

    Cocquempot, V.; Izadi-Zamanabadi, Roozbeh; Staroswiecki, M

    1998-01-01

    The prime objective of Fault-tolerant Control (FTC) systems is to handle faults and discrepancies using appropriate accommodation policies. The issue of obtaining information about various parameters and signals, which have to be monitored for fault detection purposes, becomes a rigorous task with the growing number of subsystems. The structural approach, presented in this paper, constitutes a general framework for providing information when the system becomes complex. The methodology of this approach is illustrated on the ship propulsion benchmark.

  16. A simplified approach to WWER-440 fuel assembly head benchmark

    International Nuclear Information System (INIS)

    Muehlbauer, P.

    2010-01-01

    The WWER-440 fuel assembly head benchmark was simulated with the FLUENT 12 code as a first step in validating the code for nuclear reactor safety analyses. Results of the benchmark, together with a comparison of results provided by other participants and results of measurement, will be presented in another paper by the benchmark organisers. This presentation is therefore focused on our approach to this simulation, as illustrated on case 323-34, which represents a peripheral assembly with five neighbours. All steps of the simulation and some lessons learned are described. The geometry of the computational region, supplied as a STEP file by the organizers of the benchmark, was first separated into two parts (the inlet part with spacer grid, and the rest of the assembly head) in order to keep the size of the computational mesh manageable with regard to the hardware available (HP Z800 workstation with Intel Xeon four-core CPU, 3.2 GHz, 32 GB of RAM) and then further modified at places where the shape of the geometry would probably lead to highly distorted cells. Both parts of the geometry were connected via a boundary profile file generated at a cross section where the effect of the spacer grid is still felt but the effect of the outflow boundary condition used in the computations of the inlet part of the geometry is negligible. Computation proceeded in several steps: start with the basic mesh, the standard k-ε model of turbulence with standard wall functions, and first-order upwind numerical schemes; after convergence (scaled residuals lower than 10⁻³) and local adaptation of near-wall meshes when needed, the realizable k-ε model of turbulence was used with second-order upwind numerical schemes for the momentum and energy equations. During iterations, the area-averaged temperature of thermocouples and the area-averaged outlet temperature, which are the main figures of merit of the benchmark, were also monitored. In this 'blind' phase of the benchmark, the effect of spacers was neglected. After results of measurements are available, standard validation

  17. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

    textabstractBenchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns.

  18. Benchmarks: The Development of a New Approach to Student Evaluation.

    Science.gov (United States)

    Larter, Sylvia

    The Toronto Board of Education Benchmarks are libraries of reference materials that demonstrate student achievement at various levels. Each library contains video benchmarks, print benchmarks, a staff handbook, and summary and introductory documents. This book is about the development and the history of the benchmark program. It has taken over 3…

  19. Benchmarking pediatric cranial CT protocols using a dose tracking software system: a multicenter study.

    Science.gov (United States)

    De Bondt, Timo; Mulkens, Tom; Zanca, Federica; Pyfferoen, Lotte; Casselman, Jan W; Parizel, Paul M

    2017-02-01

    To benchmark regional standard practice for paediatric cranial CT-procedures in terms of radiation dose and acquisition parameters. Paediatric cranial CT-data were retrospectively collected during a 1-year period, in 3 different hospitals of the same country. A dose tracking system was used to automatically gather information. Dose (CTDI and DLP), scan length, amount of retakes and demographic data were stratified by age and clinical indication; appropriate use of child-specific protocols was assessed. In total, 296 paediatric cranial CT-procedures were collected. Although the median dose of each hospital was below national and international diagnostic reference level (DRL) for all age categories, statistically significant (p-value < 0.001) dose differences among hospitals were observed. The hospital with lowest dose levels showed smallest dose variability and used age-stratified protocols for standardizing paediatric head exams. Erroneous selection of adult protocols for children still occurred, mostly in the oldest age-group. Even though all hospitals complied with national and international DRLs, dose tracking and benchmarking showed that further dose optimization and standardization is possible by using age-stratified protocols for paediatric cranial CT. Moreover, having a dose tracking system revealed that adult protocols are still applied for paediatric CT, a practice that must be avoided. • Significant differences were observed in the delivered dose between age-groups and hospitals. • Using age-adapted scanning protocols gives a nearly linear dose increase. • Sharing dose-data can be a trigger for hospitals to reduce dose levels.

  20. Benchmarking pediatric cranial CT protocols using a dose tracking software system: a multicenter study

    Energy Technology Data Exchange (ETDEWEB)

    Bondt, Timo de; Parizel, Paul M. [Antwerp University Hospital and University of Antwerp, Department of Radiology, Antwerp (Belgium); Mulkens, Tom [H. Hart Hospital, Department of Radiology, Lier (Belgium); Zanca, Federica [GE Healthcare, DoseWatch, Buc (France); KU Leuven, Imaging and Pathology Department, Leuven (Belgium); Pyfferoen, Lotte; Casselman, Jan W. [AZ St. Jan Brugge-Oostende AV Hospital, Department of Radiology, Brugge (Belgium)

    2017-02-15

    To benchmark regional standard practice for paediatric cranial CT-procedures in terms of radiation dose and acquisition parameters. Paediatric cranial CT-data were retrospectively collected during a 1-year period, in 3 different hospitals of the same country. A dose tracking system was used to automatically gather information. Dose (CTDI and DLP), scan length, amount of retakes and demographic data were stratified by age and clinical indication; appropriate use of child-specific protocols was assessed. In total, 296 paediatric cranial CT-procedures were collected. Although the median dose of each hospital was below national and international diagnostic reference level (DRL) for all age categories, statistically significant (p-value < 0.001) dose differences among hospitals were observed. The hospital with lowest dose levels showed smallest dose variability and used age-stratified protocols for standardizing paediatric head exams. Erroneous selection of adult protocols for children still occurred, mostly in the oldest age-group. Even though all hospitals complied with national and international DRLs, dose tracking and benchmarking showed that further dose optimization and standardization is possible by using age-stratified protocols for paediatric cranial CT. Moreover, having a dose tracking system revealed that adult protocols are still applied for paediatric CT, a practice that must be avoided. (orig.)

  1. Evaluation of the applicability of the Benchmark approach to existing toxicological data. Framework: Chemical compounds in the working place

    NARCIS (Netherlands)

    Appel MJ; Bouman HGM; Pieters MN; Slob W; Adviescentrum voor chemische; CSR

    2001-01-01

    Five chemicals used in the workplace, for which a risk assessment had already been carried out, were selected and the relevant critical studies re-analyzed by the Benchmark approach. The endpoints involved included continuous and ordinal data. Dose-response modeling could be reasonably applied to the

  2. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    Bulej, Lubomír

    2005-01-01

    The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking on a real software project and conclude with a glimpse at the challenges for the fu...

  3. A unified framework for benchmark dose estimation applied to mixed models and model averaging

    DEFF Research Database (Denmark)

    Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

    2013-01-01

    This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both ... for hierarchical data structures, reflecting increasingly common types of assay data. We illustrate the usefulness of the methodology by means of a cytotoxicology example where the sensitivity of two types of assays are evaluated and compared. By means of a simulation study, we show that the proposed framework...

  4. Benchmarking novel approaches for modelling species range dynamics.

    Science.gov (United States)

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results reassure the clear merit in using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches

  5. Benchmark studies of induced radioactivity produced in LHC materials, Part II: Remanent dose rates.

    Science.gov (United States)

    Brugger, M; Khater, H; Mayer, S; Prinz, A; Roesler, S; Ulrici, L; Vincke, H

    2005-01-01

    A new method to estimate remanent dose rates, to be used with the Monte Carlo code FLUKA, was benchmarked against measurements from an experiment that was performed at the CERN-EU high-energy reference field facility. An extensive collection of samples of different materials was placed downstream of, and laterally to, a copper target intercepting a positively charged mixed hadron beam with a momentum of 120 GeV c⁻¹. Emphasis was put on the reduction of uncertainties by taking measures such as careful monitoring of the irradiation parameters, using different instruments to measure dose rates, adopting detailed elemental analyses of the irradiated materials and making detailed simulations of the irradiation experiment. The measured and calculated dose rates are in good agreement.

  6. An international pooled analysis for obtaining a benchmark dose for environmental lead exposure in children

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Bellinger, David; Lanphear, Bruce

    2013-01-01

    Lead is a recognized neurotoxicant, but estimating effects at the lowest measurable levels is difficult. An international pooled analysis of data from seven cohort studies reported an inverse and supra-linear relationship between blood lead concentrations and IQ scores in children. The lack of a clear threshold presents a challenge to the identification of an acceptable level of exposure. The benchmark dose (BMD) is defined as the dose that leads to a specific known loss. As an alternative to elusive thresholds, the BMD is being used increasingly by regulatory authorities. Using the pooled data ... yielding lower confidence limits (BMDLs) of about 0.1-1.0 μg/dL for the dose leading to a loss of one IQ point. We conclude that current allowable blood lead concentrations need to be lowered and further prevention efforts are needed to protect children from lead toxicity...

  7. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.; Cook, J.R.

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided
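
    The "most fundamental level" comparison mentioned here reduces to multiplying a unit radionuclide concentration by intake rates and dose coefficients for each pathway. The sketch below shows that spreadsheet-style calculation with placeholder parameter values; none of the numbers are the PATT subteam's inputs.

```python
# Minimal ingestion/inhalation/external dose screen for unit concentrations.
# All parameter values are illustrative placeholders, not the PATT inputs.

C_water = 1.0      # radionuclide concentration in water [Bq/L] (unit concentration)
C_soil  = 1.0      # radionuclide concentration in soil [Bq/kg] (unit concentration)

ingestion_rate  = 730.0     # drinking-water intake [L/yr]
inhalation_rate = 8400.0    # air intake [m^3/yr]
dust_loading    = 1.0e-7    # resuspended soil in air [kg/m^3]
occupancy       = 0.5       # fraction of the year spent on contaminated soil

DCF_ing = 2.8e-8   # ingestion dose coefficient [Sv/Bq]           (placeholder)
DCF_inh = 5.0e-8   # inhalation dose coefficient [Sv/Bq]          (placeholder)
DCF_ext = 1.0e-9   # external dose rate factor [Sv/yr per Bq/kg]  (placeholder)

dose_ing = C_water * ingestion_rate * DCF_ing
dose_inh = C_soil * dust_loading * inhalation_rate * DCF_inh
dose_ext = C_soil * occupancy * DCF_ext

print(f"ingestion : {dose_ing:.2e} Sv/yr")
print(f"inhalation: {dose_inh:.2e} Sv/yr")
print(f"external  : {dose_ext:.2e} Sv/yr")
print(f"total     : {dose_ing + dose_inh + dose_ext:.2e} Sv/yr")
```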

  8. Immunotoxicity of perfluorinated alkylates: calculation of benchmark doses based on serum concentrations in children

    DEFF Research Database (Denmark)

    Grandjean, Philippe; Budtz-Joergensen, Esben

    2013-01-01

    BACKGROUND: Immune suppression may be a critical effect associated with exposure to perfluorinated compounds (PFCs), as indicated by recent data on vaccine antibody responses in children. Therefore, this information may be crucial when deciding on exposure limits. METHODS: Results obtained from follow-up of a Faroese birth cohort were used. Serum-PFC concentrations were measured at age 5 years, and serum antibody concentrations against tetanus and diphtheria toxoids were obtained at age 7 years. Benchmark dose results were calculated in terms of serum concentrations for 431 children...

  9. Estimating the Need for Palliative Radiation Therapy: A Benchmarking Approach

    Energy Technology Data Exchange (ETDEWEB)

    Mackillop, William J., E-mail: william.mackillop@krcc.on.ca [Cancer Care and Epidemiology, Queen's Cancer Research Institute, Queen's University, Kingston, Ontario (Canada); Department of Public Health Sciences, Queen's University, Kingston, Ontario (Canada); Department of Oncology, Queen's University, Kingston, Ontario (Canada); Kong, Weidong [Cancer Care and Epidemiology, Queen's Cancer Research Institute, Queen's University, Kingston, Ontario (Canada)

    2016-01-01

    Purpose: Palliative radiation therapy (PRT) benefits many patients with incurable cancer, but the overall need for PRT is unknown. Our primary objective was to estimate the appropriate rate of use of PRT in Ontario. Methods and Materials: The Ontario Cancer Registry identified patients who died of cancer in Ontario between 2006 and 2010. Comprehensive RT records were linked to the registry. Multivariate analysis identified social and health system-related factors affecting the use of PRT, enabling us to define a benchmark population of patients with unimpeded access to PRT. The proportion of cases treated at any time (PRT_lifetime), the proportion of cases treated in the last 2 years of life (PRT_2y), and number of courses of PRT per thousand cancer deaths were measured in the benchmark population. These benchmarks were standardized to the characteristics of the overall population, and province-wide PRT rates were then compared to benchmarks. Results: Cases diagnosed at hospitals with no RT on-site and residents of poorer communities and those who lived farther from an RT center, were significantly less likely than others to receive PRT. However, availability of RT at the diagnosing hospital was the dominant factor. Neither socioeconomic status nor distance from home to nearest RT center had a significant effect on the use of PRT in patients diagnosed at a hospital with RT facilities. The benchmark population therefore consisted of patients diagnosed at a hospital with RT facilities. The standardized benchmark for PRT_lifetime was 33.9%, and the corresponding province-wide rate was 28.5%. The standardized benchmark for PRT_2y was 32.4%, and the corresponding province-wide rate was 27.0%. The standardized benchmark for the number of courses of PRT per thousand cancer deaths was 652, and the corresponding province-wide rate was 542. Conclusions: Approximately one-third of patients who die of cancer in Ontario need PRT, but many of them are never

  10. Quality Assurance Testing of Version 1.3 of U.S. EPA Benchmark Dose Software (Presentation)

    Science.gov (United States)

    EPA's benchmark dose software (BMDS) is used to evaluate chemical dose-response data in support of Agency risk assessments, and must therefore be dependable. Quality assurance testing methods developed for BMDS were designed to assess model dependability with respect to curve-fitt...

  11. Benchmarking the Degree of Implementation of Learner-Centered Approaches

    Science.gov (United States)

    Blumberg, Phyllis; Pontiggia, Laura

    2011-01-01

    We describe an objective way to measure whether curricula, educational programs, and institutions are learner-centered. This technique for benchmarking learner-centeredness uses rubrics to measure courses on 29 components within Weimer's five dimensions. We converted the scores on the rubrics to four-point indices and constructed histograms that…

  12. Benchmarking the minimum Electron Beam (eBeam) dose required for the sterilization of space foods

    Science.gov (United States)

    Bhatia, Sohini S.; Wall, Kayley R.; Kerth, Chris R.; Pillai, Suresh D.

    2018-02-01

    As manned space missions extend in length, the safety, nutrition, acceptability, and shelf life of space foods are of paramount importance to NASA. Since food and mealtimes play a key role in reducing the stress and boredom of prolonged missions, the quality of food in terms of appearance, flavor, texture, and aroma can have significant psychological ramifications on astronaut performance. The FDA, which oversees space foods, currently requires a minimum dose of 44 kGy for irradiated space foods. The underlying hypothesis was that commercial sterility of space foods could be achieved at a significantly lower dose, and this lowered dose would positively affect the shelf life of the product. Electron beam processed beef fajitas were used as an example NASA space food to benchmark the minimum eBeam dose required for sterility. A 15 kGy dose was able to achieve an approximately 10 log reduction in Shiga-toxin-producing Escherichia coli bacteria, and a 5 log reduction in Clostridium sporogenes spores. Furthermore, accelerated shelf life testing (ASLT) was conducted to determine sensory and quality characteristics under various conditions. Using multidimensional gas chromatography-olfactometry-mass spectrometry (MDGC-O-MS), numerous volatiles were shown to be dependent on the dose applied to the product. Furthermore, concentrations of off-flavor aroma compounds such as dimethyl sulfide were decreased at the reduced 15 kGy dose. The results suggest that the combination of conventional cooking combined with eBeam processing (15 kGy) can achieve the safety and shelf-life objectives needed for long-duration space foods.
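
    Assuming first-order (log-linear) inactivation kinetics, which the abstract does not state explicitly, the reported reductions at 15 kGy imply the following decimal-reduction (D10) values; the dose needed for any target log reduction then scales linearly.

```latex
% D_{10}: dose giving a 1-log (90%) reduction, implied by the reported reductions at 15 kGy
D_{10}^{\text{STEC}} \approx \frac{15\ \mathrm{kGy}}{10\ \text{logs}} = 1.5\ \mathrm{kGy}
\qquad
D_{10}^{\text{C. sporogenes}} \approx \frac{15\ \mathrm{kGy}}{5\ \text{logs}} = 3.0\ \mathrm{kGy}

% Dose for an n-log reduction under log-linear kinetics:
D(n) = n \times D_{10}
```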

  13. An integrative approach of the marketing research and benchmarking

    Directory of Open Access Journals (Sweden)

    Moraru Gina-Maria

    2017-01-01

    The accuracy of a manager's actions in a firm depends, among other things, on the accuracy of his or her information about all processes. In this respect, developing marketing research is essential, because it provides information that reflects the current situation in the organization and on the market. Although specialists devote marketing research exclusively to the organizational marketing function, practice has shown that it can be used in any other function of the company: production, finance, human resources, research and development. Firstly, the paper presents the opportunities to use marketing research as a management tool in various stages of creative thinking. Secondly, based on a study made from secondary sources of economic literature, the paper draws a parallel between marketing research and benchmarking. Finally, the paper shows that creative benchmarking closes the management - marketing - creativity circle for the benefit of the organization and community.

  14. BENCHMARKING - MANAGEMENT APPROACH FOR PUBLIC SERVICES BASED ON PERFORMANCE CRITERIA

    OpenAIRE

    Elena TUDOSE (IORGA)

    2013-01-01

    Objective assessment of the efficiency/effectiveness of an organization’s activity, based on realistic, measurable indicators, is a fundamental premise for successful strategic planning. Although traditionally associated with the business sector and the concept of economic productivity, benchmarking has become today an important tool used by public agencies in planning their activities and evaluating processes and outcomes. The public utility services sector is one of the fields that require ...

  15. Netherlands contribution to the EC project: Benchmark exercise on dose estimation in a regulatory context

    International Nuclear Information System (INIS)

    Stolk, D.J.

    1987-04-01

    On request of the Netherlands government, FEL-TNO is developing a decision support system with the acronym RAMBOS for the assessment of the off-site consequences of an accident with hazardous materials. This is a user-friendly interactive computer program, which uses very sophisticated graphical means. RAMBOS supports the emergency planning organization in two ways. Firstly, the risk to the residents in the surroundings of the accident is quantified in terms of severity and magnitude (number of casualties, etc.). Secondly, the consequences of countermeasures, such as sheltering and evacuation, are predicted. By evaluating several countermeasures the user can determine an optimum policy to reduce the impact of the accident. Within the framework of the EC project 'Benchmark exercise on dose estimation in a regulatory context', calculations were carried out with the RAMBOS system on request of the Ministry of Housing, Physical Planning and Environment. This report contains the results of these calculations. 3 refs.; 2 figs.; 10 tabs

  16. Benchmarking of MCNP for calculating dose rates at an interim storage facility for nuclear waste.

    Science.gov (United States)

    Heuel-Fabianek, Burkhard; Hille, Ralf

    2005-01-01

    During the operation of research facilities at Research Centre Jülich, Germany, nuclear waste is stored in drums and other vessels in an interim storage building on-site, which has a concrete shielding at the side walls. Owing to the lack of a well-defined source, measured gamma spectra were unfolded to determine the photon flux on the surface of the containers. The dose rate simulation, including the effects of skyshine, using the Monte Carlo transport code MCNP is compared with the measured dosimetric data at some locations in the vicinity of the interim storage building. The MCNP data for direct radiation confirm the data calculated using a point-kernel method. However, a comparison of the modelled dose rates for direct radiation and skyshine with the measured data demonstrates the need for a more precise definition of the source. Both the measured and the modelled dose rates verified the fact that the legal limits (<1 mSv a⁻¹) are met in the area outside the perimeter fence of the storage building to which members of the public have access. Using container surface data (gamma spectra) to define the source may be a useful tool for practical calculations and additionally for benchmarking of computer codes if the discussed critical aspects with respect to the source can be addressed adequately.

  17. The current state of knowledge on the use of the benchmark dose concept in risk assessment.

    Science.gov (United States)

    Sand, Salomon; Victorin, Katarina; Filipsson, Agneta Falk

    2008-05-01

    This review deals with the current state of knowledge on the use of the benchmark dose (BMD) concept in health risk assessment of chemicals. The BMD method is an alternative to the traditional no-observed-adverse-effect level (NOAEL) and has been presented as a methodological improvement in the field of risk assessment. The BMD method has mostly been employed in the USA but is presently given higher attention also in Europe. The review presents a number of arguments in favor of the BMD, relative to the NOAEL. In addition, it gives a detailed overview of the several procedures that have been suggested and applied for BMD analysis, for quantal as well as continuous data. For quantal data the BMD is generally defined as corresponding to an additional or extra risk of 5% or 10%. For continuous endpoints it is suggested that the BMD is defined as corresponding to a percentage change in response relative to background or relative to the dynamic range of response. Under such definitions, a 5% or 10% change can be considered as default. Besides how to define the BMD and its lower bound, the BMDL, the question of how to select the dose-response model to be used in the BMD and BMDL determination is highlighted. Issues of study design and comparison of dose-response curves and BMDs are also covered. Copyright (c) 2007 John Wiley & Sons, Ltd.
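
    For continuous endpoints, the review's "percentage change" definitions can be written compactly as follows (generic notation, with x% = 5% or 10% as the suggested default):

```latex
% Relative to the background (control) mean:
\frac{\lvert \mu(\mathrm{BMD}) - \mu(0) \rvert}{\mu(0)} = x\%

% Relative to the dynamic range of the response (maximum attainable change):
\frac{\lvert \mu(\mathrm{BMD}) - \mu(0) \rvert}{\lvert \mu_{\max} - \mu(0) \rvert} = x\%
```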

  18. Mechanism-based risk assessment strategy for drug-induced cholestasis using the transcriptional benchmark dose derived by toxicogenomics.

    Science.gov (United States)

    Kawamoto, Taisuke; Ito, Yuichi; Morita, Osamu; Honda, Hiroshi

    2017-01-01

    Cholestasis is one of the major causes of drug-induced liver injury (DILI), which can result in withdrawal of approved drugs from the market. Early identification of cholestatic drugs is difficult due to the complex mechanisms involved. In order to develop a strategy for mechanism-based risk assessment of cholestatic drugs, we analyzed gene expression data obtained from the livers of rats that had been orally administered 12 known cholestatic compounds repeatedly for 28 days at three dose levels. Qualitative analyses were performed using two statistical approaches (hierarchical clustering and principal component analysis), in addition to pathway analysis. The transcriptional benchmark dose (tBMD) and tBMD 95% lower limit (tBMDL) were used for quantitative analyses, which revealed three compound sub-groups that produced different types of differential gene expression; these groups of genes were mainly involved in inflammation, cholesterol biosynthesis, and oxidative stress. Furthermore, the tBMDL values for each test compound were in good agreement with the relevant no observed adverse effect level. These results indicate that our novel strategy for drug safety evaluation using mechanism-based classification and tBMDL would facilitate the application of toxicogenomics for risk assessment of cholestatic DILI.

  19. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    International Nuclear Information System (INIS)

    Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow

    2013-01-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
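
    One step the abstract alludes to — approximating a log-normal fit when only summarized data (mean ± standard deviation per dose group) are available — uses the standard moment conversion to log-scale parameters. The sketch below shows that conversion and a crude interpolated BMD for a 5% decrease in the geometric mean; the numbers are invented, and this is not the hybrid or BMDS procedure itself.

```python
import numpy as np

# Summarized response data (arithmetic mean ± SD per dose group); illustrative values only
doses = np.array([0.0, 10.0, 50.0, 100.0])
mean  = np.array([100.0, 96.0, 88.0, 72.0])
sd    = np.array([10.0, 10.5, 9.5, 8.0])

# Log-scale parameters of a log-normal distribution from the reported mean and SD:
#   mu_log = ln( m / sqrt(1 + cv^2) ),  sigma_log = sqrt( ln(1 + cv^2) )
cv = sd / mean
mu_log    = np.log(mean / np.sqrt(1.0 + cv**2))
sigma_log = np.sqrt(np.log(1.0 + cv**2))

# Crude BMD for a 5% decrease in the geometric mean, by interpolation over the dose grid
q = 0.05
target = mu_log[0] + np.log(1.0 - q)
bmd = np.interp(-target, -mu_log, doses)   # negate so the interpolation grid is increasing
print(f"log-scale SDs per group: {np.round(sigma_log, 3)}")
print(f"approximate BMD ≈ {bmd:.1f}")
```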

  20. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  1. Correlation of In Vivo Versus In Vitro Benchmark Doses (BMDs) Derived From Micronucleus Test Data: A Proof of Concept Study.

    Science.gov (United States)

    Soeteman-Hernández, Lya G; Fellows, Mick D; Johnson, George E; Slob, Wout

    2015-12-01

    In this study, we explored the applicability of using in vitro micronucleus (MN) data from human lymphoblastoid TK6 cells to derive in vivo genotoxicity potency information. Nineteen chemicals covering a broad spectrum of genotoxic modes of action were tested in an in vitro MN test in TK6 cells using the same study protocol. Several of these chemicals were considered to need metabolic activation, and these were administered in the presence of S9. The benchmark dose (BMD) approach was applied using the dose-response modeling program PROAST to estimate the genotoxic potency from the in vitro data. The resulting in vitro BMDs were compared with previously derived BMDs from in vivo MN and carcinogenicity studies. A proportional correlation was observed between the BMDs from the in vitro MN and the BMDs from the in vivo MN assays. Further, a clear correlation was found between the BMDs from in vitro MN and the associated BMDs for malignant tumors. Although these results are based on only 19 compounds, they show that genotoxicity potencies estimated from in vitro tests may result in useful information regarding in vivo genotoxic potency, as well as expected cancer potency. Extension of the number of compounds and further investigation of metabolic activation (S9) and of other toxicokinetic factors would be needed to validate our initial conclusions. However, this initial work suggests that this approach could be used for in vitro to in vivo extrapolations, which would support the reduction of animals used in research (3Rs: replacement, reduction, and refinement). © The Author 2015. Published by Oxford University Press on behalf of the Society of Toxicology.

  2. Benchmarking in the National Intellectual Capital Measurement: Is It the Best Available Approach?

    Science.gov (United States)

    Januškaite, Virginija; Užiene, Lina

    2016-01-01

    Sustainable economic development is an aspiration of every nation in today's knowledge economy. For a few decades, scientists have claimed that intellectual capital management is the answer to how to reach this goal. Currently, benchmarking methodology is the most common approach in national intellectual capital measurement, intended to provide…

  3. Determining Optimal Crude Oil Price Benchmark in Nigeria: An Empirical Approach

    Directory of Open Access Journals (Sweden)

    Saibu Olufemi Muibi

    2015-12-01

    This paper contributes to the on-going empirical search for an appropriate crude oil price benchmark that ensures greater financial stability and efficient fiscal management in Nigeria. It adopted seasonally adjusted ARIMA forecasting models using monthly data series from 2000m01 to 2012m12 to predict future movements in Nigerian crude oil prices. The paper derived a more robust and dynamic framework that accommodates fluctuations in crude oil price and also in government spending. The result shows that if the incessant withdrawals from the ECA fund and the increasing debt profile of government in recent times are factored into the benchmark, the real crude oil numerical fiscal rule is US$82.3 for 2013, which is higher than the official benchmark of US$75 used for the 2013 and 2014 budget proposals. The paper argues that the current long-run price rule based on a 5-10 year moving-average approach adopted by government is rigid and inflexible as a rule for managing Nigerian oil funds. The unrealistic assumption of the extant benchmark accounted for excessive depletion and lack of accountability of the excess crude oil account. The paper concludes that unless the federal government can curtail its spending profligacy and adopt more stringent fiscal discipline rules, the current benchmark is unrealistic and unsuitable for fiscal management of oil revenue in the context of the Nigerian economic spending profile.
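
    A minimal sketch of the seasonally adjusted ARIMA forecasting step the paper describes, using statsmodels on a synthetic monthly price series; the series, the (p,d,q)(P,D,Q,s) orders, and the idea of reading a prudent benchmark off the lower forecast interval are all illustrative assumptions, not the paper's specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly crude-oil price series standing in for the 2000m01-2012m12 data
idx = pd.date_range("2000-01-01", "2012-12-01", freq="MS")
rng = np.random.default_rng(7)
price = pd.Series(60 + np.cumsum(rng.normal(0.2, 4.0, len(idx))), index=idx).clip(lower=20)

# Seasonal ARIMA fit; the orders below are placeholders, not the paper's model selection
model = SARIMAX(price, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12))
res = model.fit(disp=False)

# 12-month-ahead forecast; a conservative benchmark could be read off the lower interval bound
fc = res.get_forecast(steps=12)
print(fc.predicted_mean.round(1))
print(fc.conf_int(alpha=0.10).round(1))   # 90% interval; lower bound as a prudent benchmark
```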

  4. Comprehensive benchmarking and ensemble approaches for metagenomic classifiers.

    Science.gov (United States)

    McIntyre, Alexa B R; Ounit, Rachid; Afshinnekoo, Ebrahim; Prill, Robert J; Hénaff, Elizabeth; Alexander, Noah; Minot, Samuel S; Danko, David; Foox, Jonathan; Ahsanuddin, Sofia; Tighe, Scott; Hasan, Nur A; Subramanian, Poorani; Moffat, Kelly; Levy, Shawn; Lonardi, Stefano; Greenfield, Nick; Colwell, Rita R; Rosen, Gail L; Mason, Christopher E

    2017-09-21

    One of the main challenges in metagenomics is the identification of microorganisms in clinical and environmental samples. While an extensive and heterogeneous set of computational tools is available to classify microorganisms using whole-genome shotgun sequencing data, comprehensive comparisons of these methods are limited. In this study, we use the largest-to-date set of laboratory-generated and simulated controls across 846 species to evaluate the performance of 11 metagenomic classifiers. Tools were characterized on the basis of their ability to identify taxa at the genus, species, and strain levels, quantify relative abundances of taxa, and classify individual reads to the species level. Strikingly, the number of species identified by the 11 tools can differ by over three orders of magnitude on the same datasets. Various strategies can ameliorate taxonomic misclassification, including abundance filtering, ensemble approaches, and tool intersection. Nevertheless, these strategies were often insufficient to completely eliminate false positives from environmental samples, which are especially important where they concern medically relevant species. Overall, pairing tools with different classification strategies (k-mer, alignment, marker) can combine their respective advantages. This study provides positive and negative controls, titrated standards, and a guide for selecting tools for metagenomic analyses by comparing ranges of precision, accuracy, and recall. We show that proper experimental design and analysis parameters can reduce false positives, provide greater resolution of species in complex metagenomic samples, and improve the interpretation of results.

  5. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added ... in order to obtain a unique selection...

  6. Depletion benchmarks calculation of random media using explicit modeling approach of RMC

    International Nuclear Information System (INIS)

    Liu, Shichang; She, Ding; Liang, Jin-gang; Wang, Kan

    2016-01-01

    Highlights: • Explicit modeling of RMC is applied to a depletion benchmark for an HTGR fuel element. • Explicit modeling can provide detailed burnup distributions and burnup heterogeneity. • The results would serve as a supplement for the HTGR fuel depletion benchmark. • The method of combining adjacent burnup regions is proposed for full-core problems. • The combination method can reduce the memory footprint while keeping the computing accuracy. - Abstract: The Monte Carlo method plays an important role in the accurate simulation of random media, owing to its advantages of flexible geometry modeling and the use of continuous-energy nuclear cross sections. Three stochastic geometry modeling methods, including the Random Lattice Method, Chord Length Sampling and an explicit modeling approach with a mesh acceleration technique, have been implemented in RMC to simulate particle transport in dispersed fuels, among which the explicit modeling method is regarded as the best choice. In this paper, the explicit modeling method is applied to the depletion benchmark for an HTGR fuel element, and the method of combining adjacent burnup regions is proposed and investigated. The results show that explicit modeling can provide detailed burnup distributions of individual TRISO particles, and this work would serve as a supplement for HTGR fuel depletion benchmark calculations. The combination of adjacent burnup regions can effectively reduce the memory footprint while keeping the computational accuracy.

  7. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W.

    1993-01-01

    Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed in which concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, the toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red
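
    The tier-1 screening rule described above is a simple comparison of measured media concentrations against the NOAEL-based benchmarks. The sketch below illustrates only that decision step; the benchmark values and water concentrations are invented, whereas the actual benchmarks are species- and chemical-specific and tabulated in the report.

        # Contaminants whose concentrations exceed the NOAEL-based benchmark are
        # retained as contaminants of potential concern (COPCs); all values are
        # hypothetical and in mg/L.
        noael_benchmark = {"cadmium": 0.005, "zinc": 0.6, "selenium": 0.02}
        measured_water = {"cadmium": 0.012, "zinc": 0.3, "selenium": 0.001}

        copcs = [chem for chem, conc in measured_water.items()
                 if conc > noael_benchmark[chem]]
        print("retained for the baseline ecological risk assessment:", copcs)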

  8. Gamma irradiator dose mapping simulation using the MCNP code and benchmarking with dosimetry

    International Nuclear Information System (INIS)

    Sohrabpour, M.; Hassanzadeh, M.; Shahriari, M.; Sharifzadeh, M.

    2002-01-01

    The Monte Carlo transport code MCNP has been applied to simulate the dose rate distribution in the IR-136 gamma irradiator system. Isodose curves, cumulative dose values, and system design data such as throughputs, over-dose ratios, and efficiencies have been simulated as functions of product density. Simulated isodose curves and cumulative dose values were compared with dosimetry values obtained using polymethyl methacrylate, Fricke, ethanol-chlorobenzene, and potassium dichromate dosimeters. The produced system design data were also found to agree quite favorably with the system manufacturer's data. MCNP has thus been found to be an effective transport code for handling various dose mapping exercises for gamma irradiators.

  9. Using the benchmark dose (BMD) methodology to determine an appropriate reduction of certain ingredients in food products.

    Science.gov (United States)

    Bi, Jian

    2010-01-01

    As the desire to promote health increases, reductions of certain ingredients, for example, sodium, sugar, and fat in food products, are widely requested. However, such reductions are not risk free in sensory and marketing terms. Over-reduction may change the taste and flavor of a product and lead to a decrease in consumers' overall liking or purchase intent for the product. This article uses the benchmark dose (BMD) methodology to determine an appropriate reduction. Calculations of the BMD and the one-sided lower confidence limit of the BMD (BMDL) are illustrated. The article also discusses how to calculate the BMD and BMDL for overdispersed binary data in replicated testing based on a corrected beta-binomial model. The USEPA Benchmark Dose Software (BMDS) was used and S-Plus programs were developed. The method discussed in the article can be used to determine an appropriate reduction of certain ingredients, for example, sodium, sugar, and fat in food products, considering both the health rationale and the sensory or marketing risk.
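
    The BMD/BMDL calculation described above can be illustrated for simple (non-replicated) quantal data with a logistic dose-response model: fit the model, find the reduction level giving a chosen extra risk over background, and attach a one-sided lower confidence limit. The sketch below assumes statsmodels and uses a crude parametric bootstrap for the BMDL; the consumer-rejection counts are invented, and the corrected beta-binomial treatment of replicated data discussed in the article is not reproduced here.

        import numpy as np
        import statsmodels.api as sm
        from scipy.special import expit, logit

        # Hypothetical sodium reduction levels (%) and numbers of consumers, out of
        # n, whose overall liking fell below an action standard at each level.
        dose = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
        affected = np.array([5, 8, 14, 25, 38])
        n = np.full_like(affected, 60)

        X = sm.add_constant(dose)
        fit = sm.GLM(np.column_stack([affected, n - affected]), X,
                     family=sm.families.Binomial()).fit()

        def bmd(a, b, bmr=0.10):
            # Dose giving `bmr` extra risk over background for a logistic model.
            p0 = expit(a)
            return (logit(p0 + bmr * (1.0 - p0)) - a) / b

        print("BMD10:", round(bmd(*fit.params), 2))

        # Parametric bootstrap lower bound (BMDS uses profile likelihood instead).
        rng = np.random.default_rng(1)
        boots = []
        for _ in range(500):
            y = rng.binomial(n, fit.predict(X))
            bf = sm.GLM(np.column_stack([y, n - y]), X,
                        family=sm.families.Binomial()).fit()
            boots.append(bmd(*bf.params))
        print("BMDL10 (5th percentile):", round(np.percentile(boots, 5), 2))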

  10. What is a food and what is a medicinal product in the European Union? Use of the benchmark dose (BMD) methodology to define a threshold for "pharmacological action".

    Science.gov (United States)

    Lachenmeier, Dirk W; Steffen, Christian; el-Atma, Oliver; Maixner, Sibylle; Löbell-Behrends, Sigrid; Kohl-Himmelseher, Matthias

    2012-11-01

    The decision criterion for the demarcation between foods and medicinal products in the EU is significant "pharmacological action". Based on six examples of substances with ambivalent status, the benchmark dose (BMD) method is evaluated as a way to provide a threshold for pharmacological action. Using significant dose-response models from literature clinical trial data or epidemiology, the BMD values were 63 mg/day for caffeine, 5 g/day for alcohol, 6 mg/day for lovastatin, 769 mg/day for glucosamine sulfate, 151 mg/day for Ginkgo biloba extract, and 0.4 mg/day for melatonin. The examples of caffeine and alcohol validate the approach because intake above the BMD clearly exhibits pharmacological action. Nevertheless, due to uncertainties in dose-response modelling as well as the need for additional uncertainty factors to account for differences in sensitivity within the human population, a "borderline range" on the dose-response curve remains. "Pharmacological action" has proven to be not very well suited as a binary decision criterion between foods and medicinal products. The European legislator should rethink the definition of medicinal products, as the current situation based on complicated case-by-case decisions on pharmacological action leads to an unregulated market flooded with potentially illegal food supplements. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Microdosimetric approach for lung dose assessments

    International Nuclear Information System (INIS)

    Hofmann, W.; Steinhausler, F.; Pohl, E.; Bernroider, G.

    1980-01-01

    In the macroscopic region the term "organ dose" refers to a uniform energy deposition within a homogeneous biological target. In the lung, however, inhaled radioactive nuclides show a significantly non-uniform distribution pattern throughout the respiratory tract. For the calculation of deposition and clearance of inhaled alpha-emitting radionuclides within different regions of this organ, a detailed compartment model based on the Weibel model A was developed. Since biological effects (e.g. lung cancer initiation) are primarily caused at the cellular level, the interaction of alpha particles with different types of cells of the lung tissue was studied. The basic approach is to superimpose alpha particle tracks on magnified images of randomly selected tissue slices, simulating alpha-emitting sources. Particle tracks are generated by means of a specially developed computer program and used as input data for an on-line electronic image analyzer (Quantimet-720). Using adaptive pattern recognition methods, the different cells in the lung tissue can be identified and their distribution within the whole organ determined. This microdosimetric method is applied to soluble radon decay products as well as to insoluble, highly localized plutonium particles. For a defined microdistribution of alpha emitters, the resulting dose, integrated over all cellular dose values, is compared to the compartmental doses of the ICRP lung model. Furthermore, this methodology is also applicable to other organs and tissues of the human body for dose calculations in practical health physics. (author)

  12. The role of efficiency estimates in regulatory price reviews: Ofgem's approach to benchmarking electricity networks

    International Nuclear Information System (INIS)

    Pollitt, Michael

    2005-01-01

    Electricity regulators around the world make use of efficiency analysis (or benchmarking) to produce estimates of the likely amount of cost reduction which regulated electric utilities can achieve. This short paper examines the use of such efficiency estimates by the UK electricity regulator (Ofgem) within electricity distribution and transmission price reviews. It highlights the place of efficiency analysis within the calculation of X factors. We suggest a number of problems with the current approach and make suggestions for the future development of X factor setting. (author)

  13. Benchmarking and validation of a Geant4-SHADOW Monte Carlo simulation for dose calculations in microbeam radiation therapy.

    Science.gov (United States)

    Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael

    2014-05-01

    Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.

  14. Benchmark measurements and simulations of dose perturbations due to metallic spheres in proton beams

    International Nuclear Information System (INIS)

    Newhauser, Wayne D.; Rechner, Laura; Mirkovic, Dragan; Yepes, Pablo; Koch, Nicholas C.; Titt, Uwe; Fontenot, Jonas D.; Zhang, Rui

    2013-01-01

    Monte Carlo simulations are increasingly used for dose calculations in proton therapy due to their inherent accuracy. However, dosimetric deviations have been found with Monte Carlo codes when high-density materials are present in the proton beamline. The purpose of this work was to quantify the magnitude of dose perturbation caused by metal objects. We did this by comparing measurements and Monte Carlo predictions of dose perturbations caused by the presence of small metal spheres in several clinical proton therapy beams as functions of proton beam range and drift space. The Monte Carlo codes MCNPX, GEANT4 and Fast Dose Calculator (FDC) were used. Generally good agreement was found between measurements and Monte Carlo predictions, with the average difference within 5% and the maximum difference within 17%. The modification of the multiple Coulomb scattering model in the MCNPX code yielded an improvement in accuracy and provided the best overall agreement with measurements. Our results confirmed that Monte Carlo codes are well suited for predicting multiple Coulomb scattering in proton therapy beams when short drift spaces are involved. - Highlights: • We compared measurements and Monte Carlo predictions of dose perturbations caused by metal objects in proton beams. • Different Monte Carlo codes were used, including MCNPX, GEANT4 and Fast Dose Calculator. • Good agreement was found between measurements and Monte Carlo simulations. • The modification of the multiple Coulomb scattering model in the MCNPX code yielded improved accuracy. • Our results confirmed that Monte Carlo codes are well suited for predicting multiple Coulomb scattering in proton therapy

  15. Selecting concepts for a concept-based curriculum: application of a benchmark approach.

    Science.gov (United States)

    Giddens, Jean Foret; Wright, Mary; Gray, Irene

    2012-09-01

    In response to a transformational movement in nursing education, faculty across the country are considering changes to curricula and approaches to teaching. As a result, an emerging trend in many nursing programs is the adoption of a concept-based curriculum. As part of the curriculum development process, the selection of concepts, competencies, and exemplars on which to build courses and base content is needed. This article presents a benchmark approach used to validate and finalize concept selection among educators developing a concept-based curriculum for a statewide nursing consortium. These findings are intended to inform other nurse educators who are currently involved with or are considering this curriculum approach. Copyright 2012, SLACK Incorporated.

  16. Application of benchmark dose modeling to protein expression data in the development and analysis of mode of action/adverse outcome pathways for testicular toxicity.

    Science.gov (United States)

    Chepelev, Nikolai L; Meek, M E Bette; Yauk, Carole Lyn

    2014-11-01

    Reliable quantification of gene and protein expression has potential to contribute significantly to the characterization of hypothesized modes of action (MOA) or adverse outcome pathways for critical effects of toxicants. Quantitative analysis of gene expression by benchmark dose (BMD) modeling has been facilitated by the development of effective software tools. In contrast, protein expression is still generally quantified by a less robust effect level (no or lowest [adverse] effect levels) approach, which minimizes its potential utility in the consideration of dose-response and temporal concordance for key events in hypothesized MOAs. BMD modeling is applied here to toxicological data on testicular toxicity to investigate its potential utility in analyzing protein expression relevant to the proposed MOA to inform human health risk assessment. The results illustrate how the BMD analysis of protein expression in animal tissues in response to toxicant exposure: (1) complements other toxicity data, and (2) contributes to consideration of the empirical concordance of dose-response relationships, as part of the weight of evidence for hypothesized MOAs to facilitate consideration and application in regulatory risk assessment. Lack of BMD analysis in proteomics has likely limited its use for these purposes. This paper illustrates the added value of BMD modeling to support and strengthen hypothetical MOAs as a basis to facilitate the translation and uptake of the results of proteomic research into risk assessment. Copyright © 2014 Her Majesty the Queen in Right of Canada. Journal of Applied Toxicology © 2014 John Wiley & Sons, Ltd.

  17. 77 FR 36533 - Notice of Availability of the Benchmark Dose Technical Guidance

    Science.gov (United States)

    2012-06-19

    ... environment, the EPA routinely conducts risk assessments on chemical agents that may be toxic to humans. A key component of the risk assessment process involves evaluating the dose-response relationship between exposure... BMD methodology for human health risk assessments. The document discusses computation of BMD values...

  18. Dose mapping simulation using the MCNP code for the Syrian gamma irradiation facility and benchmarking

    International Nuclear Information System (INIS)

    Khattab, K.; Boush, M.; Alkassiri, H.

    2013-01-01

    Highlights: • MCNP-4C was used to calculate the gamma ray dose rate spatial distribution for the SGIF. • Measurement of the gamma ray dose rate spatial distribution using the chlorobenzene dosimeter was conducted as well. • Good agreement was found between the calculated and measured results. • The maximum relative differences were less than 7%, 4% and 4% in the x, y and z directions, respectively. - Abstract: A three-dimensional model of the Syrian gamma irradiation facility (SGIF) is developed in this paper to calculate the gamma ray dose rate spatial distribution in the irradiation room at the 60Co source board using the MCNP-4C code. Measurement of the gamma ray dose rate spatial distribution using the chlorobenzene dosimeter is conducted as well to compare the calculated and measured results. Good agreement is found between the calculated and measured results, with maximum relative differences of less than 7%, 4% and 4% in the x, y and z directions, respectively. This agreement indicates that the established model is an accurate representation of the SGIF and can be used in the future to make the calculation design for a new irradiation facility.

  19. Fitting and benchmarking of Monte Carlo output parameters for iridium-192 high dose rate brachytherapy source

    International Nuclear Information System (INIS)

    Acquah, F.G.

    2011-01-01

    Brachytherapy, the use of radioactive sources for the treatment of tumours, is an important tool in radiation oncology. Accurate calculation of the dose delivered to malignant and normal tissues is a main responsibility of the medical physics staff. With the use of treatment planning system (TPS) computers now standard practice in radiation oncology departments, independent calculations to verify the results of these commercial TPSs are an important part of a good quality management system for brachytherapy implants. There are inherent errors in the dose distributions produced by these TPSs due to their failure to account for heterogeneity in the calculation algorithms, and the Monte Carlo (MC) method seems to be the panacea for these corrections. In this study, a functional fit to MC output parameters was performed to reduce dose calculation uncertainty, using the Matlab curve fitting applications. This includes the modification of the AAPM TG-43 parameters to accommodate new developments for a rapid brachytherapy dose rate calculation. Analytical computations were performed to hybridize the anisotropy function F(r,θ) and the radial dose function g(r) into a single new function f(r,θ) for the Nucletron microSelectron High Dose Rate 'new or v2' (mHDRv2) 192 Ir brachytherapy source. In order to minimize computation time and to improve the accuracy of manual calculations, the dosimetry function f(r,θ) uses fewer parameters and formulas in the fit. Using the MC outputs as the standard, the percentage errors for the fits were calculated and used to evaluate the average and maximum uncertainties. Dose rate deviations between the MC data and the fit were also quantified as errors (E), which were minimal. These results show that the dosimetry parameters from this study are in good agreement with the MC output parameters and better than the results obtained from the literature. The work confirms a lot of promise in building robust
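
    The hybridized function described above slots into the usual TG-43 dose-rate equation, with the product g(r)·F(r,θ) replaced by a single combined lookup f(r,θ). The sketch below shows that structure for a line source; the table values, air-kerma strength and dose-rate constant are placeholders rather than the fitted mHDRv2 parameters from this work.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        L = 0.36                      # active length (cm), typical of an HDR Ir-192 source
        r0, theta0 = 1.0, np.pi / 2   # TG-43 reference point

        def geometry_factor(r, theta):
            # Line-source geometry function G_L(r, theta) of the TG-43 formalism.
            if np.isclose(np.sin(theta), 0.0):
                return 1.0 / (r**2 - L**2 / 4.0)
            beta = (np.arctan2(L / 2 + r * np.cos(theta), r * np.sin(theta))
                    - np.arctan2(-L / 2 + r * np.cos(theta), r * np.sin(theta)))
            return beta / (L * r * np.sin(theta))

        # Placeholder combined table f(r, theta) = g(r) * F(r, theta).
        r_grid = np.array([0.5, 1.0, 2.0, 5.0])            # cm
        theta_grid = np.radians([10.0, 45.0, 90.0])
        f_table = np.array([[0.90, 0.98, 1.00],
                            [0.92, 0.99, 1.00],
                            [0.93, 0.99, 1.00],
                            [0.90, 0.97, 0.99]])
        f = RegularGridInterpolator((r_grid, theta_grid), f_table)

        def dose_rate(r, theta, sk=40000.0, dose_rate_constant=1.11):
            # Dose rate (cGy/h) for air-kerma strength sk (U); both values are assumed.
            g_ratio = geometry_factor(r, theta) / geometry_factor(r0, theta0)
            return sk * dose_rate_constant * g_ratio * f([[r, theta]])[0]

        print(round(dose_rate(2.0, np.radians(60.0)), 1))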

  20. Benchmarking the implementation of E-Commerce A Case Study Approach

    OpenAIRE

    von Ettingshausen, C. R. D. Freiherr

    2009-01-01

    The purpose of this thesis was to develop a guideline to support the implementation of E-Commerce with E-Commerce benchmarking. Because of its importance as an interface with the customer, web-site benchmarking has been a widely researched topic. However, limited research has been conducted on benchmarking E-Commerce across other areas of the value chain. Consequently this thesis aims to extend benchmarking into E-Commerce related subjects. The literature review examined ...

  1. Benchmark studies of induced radioactivity and remanent dose rates produced in LHC materials

    International Nuclear Information System (INIS)

    Brugger, M.; Mayer, S.; Roesler, S.; Ulrici, L.; Khater, H.; Prinz, A.; Vincke, H.

    2005-01-01

    Samples of materials that will be used for elements of the LHC machine as well as for shielding and construction components were irradiated in the stray radiation field of the CERN-EU high-energy Reference Field facility. The materials included various types of steel, copper, titanium, concrete and marble as well as light materials such as carbon composites and boron nitride. Emphasis was put on an accurate recording of the irradiation conditions, such as irradiation profile and intensity, and on a detailed determination of the elemental composition of the samples. After the irradiation, the specific activity induced in the samples as well as the remanent dose rate were measured at different cooling times ranging from about 20 minutes to two months. Furthermore, the irradiation experiment was simulated using the FLUKA Monte Carlo code, and specific activities and dose rates were calculated. The latter was based on a new method simulating the production of the various isotopes and the electromagnetic cascade induced by radioactive decay at a given cooling time. In general, good agreement was found, which engenders confidence in the predictive power of the applied codes and tools for the estimation of the radioactive nuclide inventory of the LHC machine as well as the calculation of remanent doses to personnel during interventions. (authors)

  2. Vision-Based Parking-Slot Detection: A Benchmark and A Learning-Based Approach

    Directory of Open Access Journals (Sweden)

    Lin Zhang

    2018-03-01

    Full Text Available Recent years have witnessed a growing interest in developing automatic parking systems in the field of intelligent vehicles. However, how to effectively and efficiently locate parking-slots using a vision-based system is still an unresolved issue. Even more seriously, there is no publicly available labeled benchmark dataset for tuning and testing parking-slot detection algorithms. In this paper, we attempt to fill the above-mentioned research gaps to some extent and our contributions are twofold. Firstly, to facilitate the study of vision-based parking-slot detection, a large-scale parking-slot image database is established. This database comprises 8600 surround-view images collected from typical indoor and outdoor parking sites. For each image in this database, the marking-points and parking-slots are carefully labeled. Such a database can serve as a benchmark to design and validate parking-slot detection algorithms. Secondly, a learning-based parking-slot detection approach, namely PSDL, is proposed. Using PSDL, given a surround-view image, the marking-points are detected first and then the valid parking-slots can be inferred. The efficacy and efficiency of PSDL have been corroborated on our database. It is expected that PSDL can serve as a baseline when other researchers develop more sophisticated methods.

  3. Application of the random vibration approach in the seismic analysis of LMFBR structures - Benchmark calculations

    International Nuclear Information System (INIS)

    Preumont, A.; Shilab, S.; Cornaggia, L.; Reale, M.; Labbe, P.; Noe, H.

    1992-01-01

    This benchmark exercise is the continuation of the state-of-the-art review (EUR 11369 EN) which concluded that the random vibration approach could be an effective tool in the seismic analysis of nuclear power plants, with potential advantages over time history and response spectrum techniques. As compared to the latter, the random vibration method provides an accurate treatment of multisupport excitations, non-classical damping as well as the combination of high-frequency modal components. With respect to the former, the random vibration method offers direct information on statistical variability (probability distribution) and cheaper computations. The disadvantages of the random vibration method are that it is based on stationary results and requires a power spectral density input instead of a response spectrum. A benchmark exercise has been carried out to compare the three methods on one or several simple structures with respect to the various aspects mentioned above. The following aspects have been covered with the simplest possible models: (i) statistical variability, (ii) multisupport excitation, (iii) non-classical damping. The random vibration method is therefore concluded to be a reliable method of analysis. Its use is recommended, particularly for preliminary design, owing to its computational advantage over multiple time history analysis

  4. Benchmarking residual dose rates in a NuMI-like environment

    Energy Technology Data Exchange (ETDEWEB)

    Igor L. Rakhno et al.

    2001-11-02

    Activation of various structural and shielding materials is an important issue for many applications. A model developed recently to calculate residual activity of arbitrary composite materials for arbitrary irradiation and cooling times is presented in the paper. Measurements have been performed at the Fermi National Accelerator Laboratory using a 120 GeV proton beam to study induced radioactivation of materials used for beam line components and shielding. The calculated residual dose rates for the samples studied behind the target and outside of the thick shielding are presented and compared with the measured ones. Effects of energy spectra, sample material and dimensions, their distance from the shielding, and gaps between the shielding modules and walls as well as between the modules themselves were studied in detail.

  5. Benchmark of PENELOPE code for low-energy photon transport: dose comparisons with MCNP4 and EGS4

    International Nuclear Information System (INIS)

    Ye, Sung-Joon; Brezovich, Ivan A; Pareek, Prem; Naqvi, Shahid A

    2004-01-01

    The expanding clinical use of low-energy photon emitting 125 I and 103 Pd seeds in recent years has led to renewed interest in their dosimetric properties. Numerous papers pointed out that higher accuracy could be obtained in Monte Carlo simulations by utilizing newer libraries for the low-energy photon cross-sections, such as XCOM and EPDL97. The recently developed PENELOPE 2001 Monte Carlo code is user friendly and incorporates photon cross-section data from the EPDL97. The code has been verified for clinical dosimetry of high-energy electron and photon beams, but has not yet been tested at low energies. In the present work, we have benchmarked the PENELOPE code for 10-150 keV photons. We computed radial dose distributions from 0 to 10 cm in water at photon energies of 10-150 keV using both PENELOPE and MCNP4C with either DLC-146 or DLC-200 cross-section libraries, assuming a point source located at the centre of a 30 cm diameter and 20 cm length cylinder. Throughout the energy range of simulated photons (except for 10 keV), PENELOPE agreed within statistical uncertainties (at worst ±5%) with MCNP/DLC-146 in the entire region of 1-10 cm and with published EGS4 data up to 5 cm. The dose at 1 cm (or dose rate constant) of PENELOPE agreed with MCNP/DLC-146 and EGS4 data within approximately ±2% in the range of 20-150 keV, while MCNP/DLC-200 produced values up to 9% lower in the range of 20-100 keV than PENELOPE or the other codes. However, the differences among the four datasets became negligible above 100 keV

  6. Benchmark of PENELOPE code for low-energy photon transport: dose comparisons with MCNP4 and EGS4.

    Science.gov (United States)

    Ye, Sung-Joon; Brezovich, Ivan A; Pareek, Prem; Naqvi, Shahid A

    2004-02-07

    The expanding clinical use of low-energy photon emitting 125I and 103Pd seeds in recent years has led to renewed interest in their dosimetric properties. Numerous papers pointed out that higher accuracy could be obtained in Monte Carlo simulations by utilizing newer libraries for the low-energy photon cross-sections, such as XCOM and EPDL97. The recently developed PENELOPE 2001 Monte Carlo code is user friendly and incorporates photon cross-section data from the EPDL97. The code has been verified for clinical dosimetry of high-energy electron and photon beams, but has not yet been tested at low energies. In the present work, we have benchmarked the PENELOPE code for 10-150 keV photons. We computed radial dose distributions from 0 to 10 cm in water at photon energies of 10-150 keV using both PENELOPE and MCNP4C with either DLC-146 or DLC-200 cross-section libraries, assuming a point source located at the centre of a 30 cm diameter and 20 cm length cylinder. Throughout the energy range of simulated photons (except for 10 keV), PENELOPE agreed within statistical uncertainties (at worst +/- 5%) with MCNP/DLC-146 in the entire region of 1-10 cm and with published EGS4 data up to 5 cm. The dose at 1 cm (or dose rate constant) of PENELOPE agreed with MCNP/DLC-146 and EGS4 data within approximately +/- 2% in the range of 20-150 keV, while MCNP/DLC-200 produced values up to 9% lower in the range of 20-100 keV than PENELOPE or the other codes. However, the differences among the four datasets became negligible above 100 keV.

  7. An Economical Approach to Estimate a Benchmark Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2003-01-01

    There are alternative methods of estimating capital stock for a benchmark year. However, these methods are costly and time-consuming, requiring the gathering of much basic information as well as the use of some convenient assumptions and guesses. In addition, a way is needed of checking whether the estimated benchmark is at the correct level. This paper proposes an optimal consistency method (OCM), which enables a capital stock to be estimated for a benchmark year, and which can also be used ...

  8. Estimate of safe human exposure levels for lunar dust based on comparative benchmark dose modeling.

    Science.gov (United States)

    James, John T; Lam, Chiu-Wing; Santana, Patricia A; Scully, Robert R

    2013-04-01

    Brief exposures of Apollo astronauts to lunar dust occasionally elicited upper respiratory irritation; however, no limits were ever set for prolonged exposure to lunar dust. The United States and other spacefaring nations intend to return to the moon for extensive exploration within a few decades. In the meantime, habitats for that exploration, whether mobile or fixed, must be designed to limit human exposure to lunar dust to safe levels. Herein we estimate safe exposure limits for lunar dust collected during the Apollo 14 mission. We instilled three respirable-sized (∼2 μm mass median diameter) lunar dusts (two ground and one unground) and two standard dusts of widely different toxicities (quartz and TiO₂) into the respiratory system of rats. Rats in groups of six were given 0, 1, 2.5 or 7.5 mg of the test dust in a saline-Survanta® vehicle, and biochemical and cellular biomarkers of toxicity in lung lavage fluid were assayed 1 week and 1 month after instillation. By comparing the dose-response curves of sensitive biomarkers, we estimated safe exposure levels for astronauts and concluded that unground lunar dust and dust ground by two different methods were not toxicologically distinguishable. The safe exposure estimates were 1.3 ± 0.4 mg/m³ (jet-milled dust), 1.0 ± 0.5 mg/m³ (ball-milled dust) and 0.9 ± 0.3 mg/m³ (unground, natural dust). We estimate that 0.5-1 mg/m³ of lunar dust is safe for periodic human exposures during long stays in habitats on the lunar surface.

  9. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach

  10. A practical approach to determine dose metrics for nanomaterials.

    Science.gov (United States)

    Delmaar, Christiaan J E; Peijnenburg, Willie J G M; Oomen, Agnes G; Chen, Jingwen; de Jong, Wim H; Sips, Adriënne J A M; Wang, Zhuang; Park, Margriet V D Z

    2015-05-01

    Traditionally, administered mass is used to describe doses of conventional chemical substances in toxicity studies. For deriving toxic doses of nanomaterials, mass and chemical composition alone may not adequately describe the dose, because particles with the same chemical composition can have completely different toxic mass doses depending on properties such as particle size. Other dose metrics such as particle number, volume, or surface area have been suggested, but consensus is lacking. The discussion regarding the most adequate dose metric for nanomaterials clearly needs a systematic, unbiased approach to determine the most appropriate dose metric for nanomaterials. In the present study, the authors propose such an approach and apply it to results from in vitro and in vivo experiments with silver and silica nanomaterials. The proposed approach is shown to provide a convenient tool to systematically investigate and interpret dose metrics of nanomaterials. Recommendations for study designs aimed at investigating dose metrics are provided. © 2015 SETAC.
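
    The point made above, that equal mass doses can correspond to very different particle-number or surface-area doses, is easy to make concrete for idealized monodisperse spheres. In the sketch below, the diameters, density and administered mass are hypothetical and serve only to show the conversion; real nanomaterials are polydisperse and often agglomerated, which a real analysis would need to address.

        import math

        def dose_metrics(mass_mg, diameter_nm, density_g_cm3):
            # Convert a mass dose into particle-number and surface-area doses,
            # assuming monodisperse, non-agglomerated spheres.
            d_cm = diameter_nm * 1e-7
            particle_mass_mg = density_g_cm3 * (math.pi / 6.0) * d_cm**3 * 1e3
            n_particles = mass_mg / particle_mass_mg
            surface_cm2 = n_particles * math.pi * d_cm**2
            return n_particles, surface_cm2

        # Two hypothetical silver nanomaterials with identical composition and mass dose.
        for d in (20, 80):   # nm
            n, s = dose_metrics(mass_mg=0.1, diameter_nm=d, density_g_cm3=10.5)
            print(f"{d} nm: {n:.2e} particles, {s:.1f} cm^2 of surface")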

  11. An Eigenstructure Assignment Approach to FDI for the Industrial Actuator Benchmark Test

    DEFF Research Database (Denmark)

    Jørgensen, R.B.; Patton, R.J.; Chen, J.

    1995-01-01

    This paper examines the robustness in modelling uncertainties of an observer-based fault detection and isolation scheme applied to the industrial actuator benchmark problem.

  12. Multi-objective approach in thermoenvironomic optimization of a benchmark cogeneration system

    International Nuclear Information System (INIS)

    Sayyaadi, Hoseyn

    2009-01-01

    Multi-objective optimization for the design of a benchmark cogeneration system known as the CGAM cogeneration system has been performed. In the optimization approach, the exergetic, economic and environmental aspects have been considered simultaneously. The thermodynamic modeling has been implemented comprehensively, while the economic analysis was conducted in accordance with the total revenue requirement (TRR) method. The results for the single-objective thermoeconomic optimization have been compared with previous studies on optimization of the CGAM problem. In the multi-objective optimization of the CGAM problem, three objective functions, including the exergetic efficiency, the total levelized cost rate of the system product and the cost rate of environmental impact, have been considered. The environmental impact objective function has been defined and expressed in cost terms. This objective has been integrated with the thermoeconomic objective to form a new unique objective function known as the thermoenvironomic objective function. The thermoenvironomic objective has been minimized while the exergetic objective has been maximized. One of the most suitable optimization techniques, developed using a particular class of search algorithms known as multi-objective evolutionary algorithms (MOEAs), has been considered here. This approach, which is based on the genetic algorithm, has been applied to find the set of Pareto optimal solutions with respect to the aforementioned objective functions. An example of decision-making has been presented and a final optimal solution has been introduced. The sensitivity of the solutions to the interest rate and the fuel cost has been studied
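
    At the core of the evolutionary approach described above is the notion of Pareto dominance between candidate designs evaluated on the competing objectives. The short sketch below shows only that non-dominated filtering step for two of the objectives (exergetic efficiency to be maximized, thermoenvironomic cost rate to be minimized); the candidate designs and their values are invented, not CGAM results.

        # Invented candidate designs: (exergetic efficiency, cost rate in $/h).
        candidates = {
            "design_1": (0.50, 1350.0),
            "design_2": (0.52, 1420.0),
            "design_3": (0.48, 1300.0),
            "design_4": (0.51, 1500.0),
        }

        def dominates(a, b):
            # a dominates b if it is no worse in both objectives and strictly
            # better in at least one (efficiency up, cost down).
            return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

        pareto_front = [name for name, obj in candidates.items()
                        if not any(dominates(other, obj)
                                   for key, other in candidates.items() if key != name)]
        print("non-dominated designs:", pareto_front)   # design_4 is filtered out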

  13. Benchmarking Density Functional Theory Approaches for the Description of Symmetry-Breaking in Long Polymethine Dyes

    KAUST Repository

    Gieseking, Rebecca L.

    2016-04-25

    Long polymethines are well-known experimentally to symmetry-break, which dramatically modifies their linear and nonlinear optical properties. Computational modeling could be very useful to provide insight into the symmetry-breaking process, which is not readily available experimentally; however, accurately predicting the crossover point from symmetric to symmetry-broken structures has proven challenging. Here, we benchmark the accuracy of several DFT approaches relative to CCSD(T) geometries. In particular, we compare analogous hybrid and long-range corrected (LRC) functionals to clearly show the influence of the functional exchange term. Although both hybrid and LRC functionals can be tuned to reproduce the CCSD(T) geometries, the LRC functionals are better performing at reproducing the geometry evolution with chain length and provide a finite upper limit for the gas-phase crossover point; these methods also provide good agreement with the experimental crossover points for more complex polymethines in polar solvents. Using an approach based on LRC functionals, a reduction in the crossover length is found with increasing medium dielectric constant, which is related to localization of the excess charge on the end groups. Symmetry-breaking is associated with the appearance of an imaginary frequency of b2 symmetry involving a large change in the degree of bond-length alternation. Examination of the IR spectra show that short, isolated streptocyanines have a mode at ~1200 cm-1 involving a large change in bond-length alternation; as the polymethine length or the medium dielectric increases, the frequency of this mode decreases before becoming imaginary at the crossover point.

  14. Test One to Test Many: A Unified Approach to Quantum Benchmarks

    Science.gov (United States)

    Bai, Ge; Chiribella, Giulio

    2018-04-01

    Quantum benchmarks are routinely used to validate the experimental demonstration of quantum information protocols. Many relevant protocols, however, involve an infinite set of input states, of which only a finite subset can be used to test the quality of the implementation. This is a problem, because the benchmark for the finitely many states used in the test can be higher than the original benchmark calculated for infinitely many states. This situation arises in the teleportation and storage of coherent states, for which the benchmark of 50% fidelity is commonly used in experiments, although finite sets of coherent states normally lead to higher benchmarks. Here, we show that the average fidelity over all coherent states can be indirectly probed with a single setup, requiring only two-mode squeezing, a 50-50 beam splitter, and homodyne detection. Our setup enables a rigorous experimental validation of quantum teleportation, storage, amplification, attenuation, and purification of noisy coherent states. More generally, we prove that every quantum benchmark can be tested by preparing a single entangled state and measuring a single observable.

  15. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    Full Text Available In the face of global competition and the rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess an institution’s competitive position and to learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating the premises of using benchmarking in HEIs. It also contains a detailed examination of the types, approaches and scope of benchmarking initiatives. This thorough insight into benchmarking applications enabled the development of a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects and relates them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The presented examples were chosen in order to exemplify different approaches to benchmarking in the higher education setting. The study was performed on the basis of published reports from benchmarking projects, the scientific literature and the experience of the author from active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.

  16. Cellular vs. organ approaches to dose estimates

    International Nuclear Information System (INIS)

    Adelstein, S.J.; Kassis, A.I.; Sastry, K.S.R.

    1986-01-01

    The cellular distribution of tissue-incorporated radionuclides has generally been neglected in the dosimetry of internal emitters. Traditional dosimetry assumes homogeneous distribution of radionuclides in organs of interest, while presuming that the ranges of particulate radiations are large relative to typical cell diameters. The macroscopic distribution of dose thus calculated has generally served as a sufficient approximation for the energy deposited within radiosensitive sites. However, with the increasing utilization of intracellular agents, such as thallium-201, it has become necessary to examine the microscopic distribution of energy at the cellular level. This is particularly important in the instance of radionuclides that decay by electron capture or by internal conversion with the release of Auger and Coster-Kronig electrons. In many instances, these electrons are released as a dense shower of low-energy particles with ranges of subcellular dimensions. The high electron density in the immediate vicinity of the decaying atom produces a focal deposition of energy that far exceeds the average dose taken over several cell diameters. These studies point out the increasing need to take into account the microscopic distribution of dose on the cellular level as radionuclides distributed in cells become more commonplace, especially if the decay involves electron capture or internal conversion. As radiotracers are developed for the measurement of intracellular functions these factors should be given greater consideration. 16 references, 5 figures, 5 tables

  17. Appraisement and benchmarking of third-party logistic service provider by exploration of risk-based approach

    Directory of Open Access Journals (Sweden)

    Nitin Kumar Sahu

    2015-12-01

    Full Text Available In the present era, reverse logistics support has been recognized as a momentous realm in which goods are transferred from the point of consumption back to the point of origin. The companies that provide logistics equipment, i.e. trucks, JCB excavators, shipment capacity, etc., to partner firms are called third-party logistics (3PL) service providers. Today, the problem of evaluating and selecting a feasible 3PL service provider is still a challenging one. The appraisement and benchmarking of logistics service providers in terms of an index of allied risk-based indices and their interrelated metrics is viewed as a valuable tool for any international firm seeking to attain its core goals. The novelty of the manuscript is that a fuzzy-based approach has been integrated and then implemented upon a newly developed multi-hierarchical third-party logistics (3PL) service provider appraisement index in order to evaluate 3PL providers on their strong and weak core indices. Moreover, an overall score (Si) system has also been carried out for benchmarking the 3PL provider companies, in which s1 was found to be the best 3PL service provider. The developed approach enables firm managers to reach a verdict on the best inclusive evaluation process for 3PL performance appraisement and benchmarking. A numerical illustration has also been provided to validate the decision support system.

  18. Benchmarking the efficiency of the Chilean water and sewerage companies: a double-bootstrap approach.

    Science.gov (United States)

    Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés

    2018-03-01

    Benchmarking the efficiency of water companies is essential for setting water tariffs and promoting their sustainability. In doing so, most previous studies have applied conventional data envelopment analysis (DEA) models. However, DEA is a deterministic method that does not allow the identification of environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies by applying a double-bootstrap DEA model. The results show that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, it was found that the percentage of non-revenue water and customer density are factors influencing the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
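
    The sketch below illustrates the two ingredients behind such an analysis: an input-oriented, constant-returns DEA efficiency score computed by linear programming, and a bootstrap that attaches sampling variability to the scores. It uses scipy's linprog; the company data are invented, and the naive resampling shown here is a stand-in for, not a reproduction of, the Simar-Wilson double-bootstrap (smoothed bootstrap plus truncated regression on environmental factors) applied in the paper.

        import numpy as np
        from scipy.optimize import linprog

        # Invented data: rows are water companies; inputs (staff, network km), one output (customers).
        X = np.array([[120.0, 300.0], [90.0, 260.0], [150.0, 400.0], [60.0, 150.0]])
        Y = np.array([[50000.0], [42000.0], [61000.0], [20000.0]])

        def input_efficiency(x0, y0, Xref, Yref):
            # Input-oriented CCR efficiency of the point (x0, y0) against a reference set:
            # minimise theta s.t. sum(lam*x) <= theta*x0 and sum(lam*y) >= y0, lam >= 0.
            n, m = Xref.shape
            s = Yref.shape[1]
            c = np.zeros(n + 1)
            c[0] = 1.0
            A_in = np.hstack([-x0.reshape(-1, 1), Xref.T])
            A_out = np.hstack([np.zeros((s, 1)), -Yref.T])
            b = np.concatenate([np.zeros(m), -y0])
            bounds = [(None, None)] + [(0, None)] * n
            res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b, bounds=bounds, method="highs")
            return res.x[0]

        scores = [input_efficiency(X[o], Y[o], X, Y) for o in range(len(X))]
        print("DEA efficiency scores:", np.round(scores, 3))

        # Naive bootstrap of the reference set (illustration only).
        rng = np.random.default_rng(0)
        boot = []
        for _ in range(200):
            idx = rng.integers(0, len(X), len(X))
            boot.append([input_efficiency(X[o], Y[o], X[idx], Y[idx]) for o in range(len(X))])
        print("bootstrap 95% intervals:")
        print(np.percentile(boot, [2.5, 97.5], axis=0).round(3))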

  19. Hand rub dose needed for a single disinfection varies according to product: A bias in benchmarking using indirect hand hygiene indicator

    Directory of Open Access Journals (Sweden)

    Raphaële Girard

    2012-12-01

    Results: Data from 27 products and 1706 tests were analyzed. Depending on the product, the dose needed to ensure a 30-s contact duration in 75% of tests ranged from 2 ml to more than 3 ml, and the dose needed to ensure a contact duration exceeding the EN 1500 times in 75% of tests ranged from 1.5 ml to more than 3 ml. The interpretation is as follows: if different products are used, the volume used does not give an unbiased estimate of hand hygiene (HH) compliance. Other compliance evaluation methods remain necessary for efficient benchmarking.

  20. Preparation of dry ports for a competitive environment in the container seaport system: A process benchmarking approach

    Directory of Open Access Journals (Sweden)

    J. Jeevan

    2017-06-01

    Full Text Available The significant exodus of containers inland due to the container revolution has increased the salience of inland terminals for efficient freight distribution. Further, the gradual migration of containers inland has forced seaports to depend on these inland terminals to determine their competitiveness and to offer a mechanism for competitive freight prices to the consumer. The performance of dry ports needs to improve in step with the dynamic nature of the maritime business, so as to efficiently fulfil the demand of all the key players in the container seaport system, provide economies of scale and scope to their respective clients, and enhance the role of inland networks in sustaining the competitiveness of container seaports. In view of this importance, this paper aims to enhance dry port performance by adopting a process benchmarking strategy among the Malaysian dry ports. Prior to the adoption of the process benchmarking approach, a grounded theory analysis was conducted among the key players of the Malaysian container seaport system in order to provide essential inputs for the benchmarking. The outcome shows that all four Malaysian dry ports need to improve their transportation infrastructure and operational facilities, container planning strategy, competition, location and externalities in order to serve all the key players in the container seaport system efficiently and effectively.

  1. Derivation of the critical effect size/benchmark response for the dose-response analysis of the uptake of radioactive iodine in the human thyroid.

    Science.gov (United States)

    Weterings, Peter J J M; Loftus, Christine; Lewandowski, Thomas A

    2016-08-22

    Potential adverse effects of chemical substances on thyroid function are usually examined by measuring serum levels of thyroid-related hormones. Instead, recent risk assessments for thyroid-active chemicals have focussed on iodine uptake inhibition, an upstream event that by itself is not necessarily adverse. Establishing the extent of uptake inhibition that can be considered de minimis, the chosen benchmark response (BMR), is therefore critical. The BMR values selected by two international advisory bodies were 5% and 50%, a difference that had correspondingly large impacts on the estimated risks and health-based guidance values that were established. Potential treatment-related inhibition of thyroidal iodine uptake is usually determined by comparing thyroidal uptake of radioactive iodine (RAIU) during treatment with a single pre-treatment RAIU value. In the present study it is demonstrated that the physiological intra-individual variation in iodine uptake is much larger than 5%. Consequently, in-treatment RAIU values, expressed as a percentage of the pre-treatment value, have an inherent variation, that needs to be considered when conducting dose-response analyses. Based on statistical and biological considerations, a BMR of 20% is proposed for benchmark dose analysis of human thyroidal iodine uptake data, to take the inherent variation in relative RAIU data into account. Implications for the tolerated daily intakes for perchlorate and chlorate, recently established by the European Food Safety Authority (EFSA), are discussed. Copyright © 2016 The Author(s). Published by Elsevier Ireland Ltd.. All rights reserved.

  2. Benchmarking Tool Kit.

    Science.gov (United States)

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  3. Assessment of the municipal solid waste management system in Accra, Ghana: A 'Wasteaware' benchmark indicator approach.

    Science.gov (United States)

    Oduro-Appiah, Kwaku; Scheinberg, Anne; Mensah, Anthony; Afful, Abraham; Boadu, Henry Kofi; de Vries, Nanne

    2017-11-01

    This article assesses the performance of the city of Accra, Ghana, in municipal solid waste management as defined by the integrated sustainable waste management framework. The article reports on a participatory process to socialise the Wasteaware benchmark indicators and apply them to an upgraded set of data and information. The process has engaged 24 key stakeholders for 9 months, to diagram the flow of materials and benchmark three physical components and three governance aspects of the city's municipal solid waste management system. The results indicate that Accra is well below some other lower middle-income cities regarding sustainable modernisation of solid waste services. Collection coverage and capture of 75% and 53%, respectively, are a disappointing result, despite (or perhaps because of) 20 years of formal private sector involvement in service delivery. A total of 62% of municipal solid waste continues to be disposed of in controlled landfills and the reported recycling rate of 5% indicates both a lack of good measurement and a lack of interest in diverting waste from disposal. Drains, illegal dumps and beaches are choked with discarded bottles and plastic packaging. The quality of collection, disposal and recycling score between low and medium on the Wasteaware indicators, and the scores for user inclusivity, financial sustainability and local institutional coherence are low. The analysis suggests that waste and recycling would improve through greater provider inclusivity, especially the recognition and integration of the informal sector, and interventions that respond to user needs for more inclusive decision-making.

  4. A Statewide Collaboration: Ohio Level III Trauma Centers' Approach to the Development of a Benchmarking System.

    Science.gov (United States)

    Lang, Carrie L; Simon, Diane; Kilgore, Jane

    The American College of Surgeons Committee on Trauma revised the Resources for Optimal Care of the Injured Patient to include criteria requiring trauma centers to participate in a risk-adjusted benchmarking system. The Trauma Quality Improvement Program is currently the risk-adjusted benchmarking program sponsored by the American College of Surgeons, in which all trauma centers will be required to participate in early 2017. Prior to this, there were no risk-adjusted programs for Level III verified trauma centers. The Ohio Society of Trauma Nurse Leaders is a collaborative group made up of trauma program managers, coordinators, and other trauma leaders who meet 6 times a year. Within this group, a Level III Subcommittee was initially formed to give the Level III centers a forum for discussing issues specific to them. When the new risk-adjustment requirement became official, the subcommittee agreed to begin reporting simple data points, with the intention of adding risk adjustment in the future.

  5. Altered operant responding for motor reinforcement and the determination of benchmark doses following perinatal exposure to low-level 2,3,7,8-tetrachlorodibenzo-p-dioxin.

    Science.gov (United States)

    Markowski, V P; Zareba, G; Stern, S; Cox, C; Weiss, B

    2001-06-01

    Pregnant Holtzman rats were exposed to a single oral dose of 0, 20, 60, or 180 ng/kg 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) on the 18th day of gestation. Their adult female offspring were trained to respond on a lever for brief opportunities to run in specially designed running wheels. Once they had begun responding on a fixed-ratio 1 (FR1) schedule of reinforcement, the fixed-ratio requirement for lever pressing was increased at five-session intervals to values of FR2, FR5, FR10, FR20, and FR30. We examined vaginal cytology after each behavior session to track estrous cyclicity. Under each of the FR values, perinatal TCDD exposure produced a significant dose-related reduction in the number of earned opportunities to run, the lever response rate, and the total number of revolutions in the wheel. Estrous cyclicity was not affected. Because of the consistent dose-response relationship at all FR values, we used the behavioral data to calculate benchmark doses based on displacements from modeled zero-dose performance of 1% (ED(01)) and 10% (ED(10)), as determined by a quadratic fit to the dose-response function. The mean ED(10) benchmark dose for earned run opportunities was 10.13 ng/kg with a 95% lower bound of 5.77 ng/kg. The corresponding ED(01) was 0.98 ng/kg with a 95% lower bound of 0.83 ng/kg. The mean ED(10) for total wheel revolutions was calculated as 7.32 ng/kg with a 95% lower bound of 5.41 ng/kg. The corresponding ED(01) was 0.71 ng/kg with a 95% lower bound of 0.60. These values should be viewed from the perspective of current human body burdens, whose average value, based on TCDD toxic equivalents, has been calculated as 13 ng/kg.
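
    The benchmark doses above follow from a quadratic fit to the dose-response data and the dose at which the fitted curve falls 1% or 10% below the modeled zero-dose performance. A minimal sketch of that calculation is given below; the dose-response values are hypothetical, and the published 95% lower bounds would additionally require the fit covariance or a resampling step.

```python
import numpy as np

def bmd_from_quadratic(doses, responses, bmr=0.10):
    """Dose at which a fitted quadratic response falls `bmr` (e.g. 0.10
    for an ED10) below the modeled zero-dose level. Illustrative only:
    lower confidence bounds are not computed here."""
    # Least-squares quadratic fit of the dose-response curve
    b2, b1, b0 = np.polyfit(doses, responses, deg=2)
    target = (1.0 - bmr) * b0            # response displaced by bmr from dose 0
    # Solve b2*d^2 + b1*d + (b0 - target) = 0; keep the smallest positive root
    roots = np.roots([b2, b1, b0 - target])
    real = roots[np.isreal(roots)].real
    positive = real[real > 0]
    return positive.min() if positive.size else np.nan

# Hypothetical group means (dose in ng/kg TCDD vs. earned run opportunities)
doses = np.array([0.0, 20.0, 60.0, 180.0])
responses = np.array([100.0, 88.0, 70.0, 45.0])
print(bmd_from_quadratic(doses, responses, bmr=0.10))
```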

  6. Environmental dose reconstruction: Approaches to an inexact science

    International Nuclear Information System (INIS)

    Hoffman, F.O.

    1991-01-01

    The endpoints of environmental dose reconstruction are quantitative, yet the science is inexact. Four problems related to this issue are described: (1) defining the scope of the assessment and setting logical priorities for detailed investigations; (2) recognizing the influence of investigator judgment on the results; (3) selecting an endpoint other than dose for the assessment of multiple contaminants; and (4) resolving the conflict between credibility and expertise in selecting individuals responsible for dose reconstruction. Approaches are recommended for dealing with each of these problems.

  7. Estimation of benchmark dose as the threshold levels of urinary cadmium, based on excretion of total protein, β2-microglobulin, and N-acetyl-β-D-glucosaminidase in cadmium nonpolluted regions in Japan

    International Nuclear Information System (INIS)

    Kobayashi, Etsuko; Suwazono, Yasushi; Uetani, Mirei; Inaba, Takeya; Oishi, Mitsuhiro; Kido, Teruhiko; Nishijo, Muneko; Nakagawa, Hideaki; Nogawa, Koji

    2006-01-01

    Previously, we investigated the association between urinary cadmium (Cd) concentration and indicators of renal dysfunction, including total protein, β2-microglobulin (β2-MG), and N-acetyl-β-D-glucosaminidase (NAG). In 2778 inhabitants ≥50 years of age (1114 men, 1664 women) in three different Cd nonpolluted areas in Japan, we showed that a dose-response relationship existed between renal effects and Cd exposure in the general environment without any known Cd pollution. However, we could not estimate the threshold levels of urinary Cd at that time. In the present study, we estimated the threshold levels of urinary Cd as the benchmark dose low (BMDL) using the benchmark dose (BMD) approach. Urinary Cd excretion was divided into 10 categories, and an abnormality rate was calculated for each. Cut-off values for urinary substances were defined as corresponding to the 84% and 95% upper limit values of the target population who have not smoked. Then we calculated the BMD and BMDL using a log-logistic model. The values of BMD and BMDL for all urinary substances could be calculated. The BMDL for the 84% cut-off value of β2-MG, setting an abnormal value at 5%, was 2.4 μg/g creatinine (cr) in men and 3.3 μg/g cr in women. In conclusion, the present study demonstrated that the threshold level of urinary Cd could be estimated in people living in the general environment without any known Cd-pollution in Japan, and the value was inferred to be almost the same as that in Belgium, Sweden, and China.
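
    The abstract describes fitting a log-logistic model to per-category abnormality rates and reading off the dose at a 5% benchmark response. The sketch below illustrates that calculation with a least-squares fit on made-up category data; the study's actual quantal-data likelihood fit and the profile-likelihood (or bootstrap) step needed for the BMDL are not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, gamma, a, b):
    """Background rate plus a log-logistic increase with urinary Cd."""
    return gamma + (1.0 - gamma) / (1.0 + np.exp(-(a + b * np.log(dose))))

# Hypothetical category data (median urinary Cd in ug/g creatinine and
# abnormality rate per category); not the study's actual values.
dose = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0, 7.0, 10.0])
rate = np.array([0.05, 0.05, 0.06, 0.08, 0.09, 0.11, 0.14, 0.17, 0.24, 0.33])

(gamma, a, b), _ = curve_fit(log_logistic, dose, rate, p0=[0.04, -3.5, 1.2],
                             bounds=([0.0, -np.inf, 0.0], [0.5, np.inf, np.inf]))

# BMD for a 5% additional risk of abnormality over background; a BMDL
# would additionally require profile likelihood or bootstrap resampling.
q = (1.0 - gamma) / 0.05 - 1.0
bmd = np.exp((-a - np.log(q)) / b)
print(f"BMD (5% additional risk): {bmd:.2f} ug/g creatinine")
```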

  8. Posture Control—Human-Inspired Approaches for Humanoid Robot Benchmarking: Conceptualizing Tests, Protocols and Analyses

    Directory of Open Access Journals (Sweden)

    Thomas Mergner

    2018-05-01

    Full Text Available Posture control is indispensable for both humans and humanoid robots, which becomes especially evident when performing sensorimotor tasks such as moving on compliant terrain or interacting with the environment. Posture control is therefore targeted in recent proposals of robot benchmarking in order to advance their development. This Methods article suggests corresponding robot tests of standing balance, drawing inspiration from the human sensorimotor system and presenting examples from robot experiments. To account for a considerable technical and algorithmic diversity among robots, we focus in our tests on basic posture control mechanisms, which provide humans with an impressive postural versatility and robustness. Specifically, we focus on the mechanically challenging balancing of the whole body above the feet in the sagittal plane around the ankle joints in concert with the upper body balancing around the hip joints. The suggested tests target three key issues of human balancing, which appear equally relevant for humanoid bipeds: (1) four basic physical disturbances (support surface (SS) tilt and translation, field and contact forces) may affect the balancing in any given degree of freedom (DoF). Targeting these disturbances allows us to abstract from the manifold of possible behavioral tasks. (2) Posture control interacts in a conflict-free way with the control of voluntary movements for undisturbed movement execution, both with “reactive” balancing of external disturbances and “proactive” balancing of self-produced disturbances from the voluntary movements. Our proposals therefore target both types of disturbances and their superposition. (3) Relevant for both versatility and robustness of the control, linkages between the posture control mechanisms across DoFs provide their functional cooperation and coordination at will and on functional demands. The suggested tests therefore include ankle-hip coordination. Suggested benchmarking

  9. Sustainable operations management and benchmarking in brewing: A factor weighting approach

    Directory of Open Access Journals (Sweden)

    Daniel P. Bumblauskas

    2017-06-01

    Full Text Available The brewing industry has been moving towards more efficient use of energy, water reuse and stewardship, and the tracking of greenhouse gas (GHG) emissions to better manage environmental and social responsibility. Commercial breweries use a great deal of water and energy to convert one gallon (liter) of water into one gallon (liter) of beer. An analysis was conducted on sustainable operations and supply chain management at various United States and international breweries, specifically in Europe, to benchmark brewery performance and establish common metrics for sustainability in the beer supply chain. The primary research questions explored in this article are whether water reclamation and GHG emissions can be properly monitored and measured and whether processes can be created to help control waste (lean) and emissions. Additional questions include how we can use operations management strategies and techniques such as the Factor-Weighted Method (FWM) in industries such as brewing to develop sustainability scorecards.

  10. Fuzzy Similarity Measures Approach in Benchmarking Taxonomies of Threats against SMEs in Developing Economies

    DEFF Research Database (Denmark)

    Yeboah-Boateng, Ezer Osei

    2013-01-01

    There are various threats that militate against SMEs in developing economies. However, most SMEs fall for the conservative “TV News Effect” of most-publicized cyber-threats or incidents, with disproportionate mitigation measures. This paper endeavors to establish a taxonomy of threat agents to fill in the void. Various fuzzy similarity measures based on multi-attribute decision-making techniques have been employed in the evaluation. The taxonomy offers a panoramic view of cyber-threats in assessing mission-critical assets, and serves as a benchmark for initiating appropriate mitigation strategies. SMEs in developing economies were strategically interviewed for their expert opinions on various business and security metrics. The study established that natural disasters, which are perennial in most developing economies, are the most critical cyber-threat agent, whilst social engineering is the least critical...

  11. Study of blood flow in several benchmark micro-channels using a two-fluid approach.

    Science.gov (United States)

    Wu, Wei-Tao; Yang, Fang; Antaki, James F; Aubry, Nadine; Massoudi, Mehrdad

    2015-10-01

    It is known that in a vessel whose characteristic dimension (e.g., its diameter) is in the range of 20 to 500 microns, blood behaves as a non-Newtonian fluid, exhibiting complex phenomena, such as shear-thinning, stress relaxation, and also multi-component behaviors, such as the Fahraeus effect, plasma-skimming, etc. For describing these non-Newtonian and multi-component characteristics of blood, using the framework of mixture theory, a two-fluid model is applied, where the plasma is treated as a Newtonian fluid and the red blood cells (RBCs) are treated as shear-thinning fluid. A computational fluid dynamic (CFD) simulation incorporating the constitutive model was implemented using OpenFOAM® in which benchmark problems including a sudden expansion and various driven slots and crevices were studied numerically. The numerical results exhibited good agreement with the experimental observations with respect to both the velocity field and the volume fraction distribution of RBCs.
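
    The paper's two-fluid constitutive model is not reproduced here; as a rough illustration of the shear-thinning ingredient attributed to the RBC phase, the sketch below evaluates a generic Carreau viscosity law with commonly quoted whole-blood constants (illustrative values, not the model constants of the cited work).

```python
import numpy as np

def carreau_viscosity(shear_rate, mu0=0.056, mu_inf=0.0035, lam=3.313, n=0.3568):
    """Generic Carreau shear-thinning viscosity (Pa.s) as a function of
    shear rate (1/s). Parameters are common literature values for whole
    blood, not the constants of the cited two-fluid model."""
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

shear = np.logspace(-2, 3, 6)      # shear rates from 0.01 to 1000 1/s
print(carreau_viscosity(shear))    # viscosity decreases as shear rate grows
```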

  12. Hand rub dose needed for a single disinfection varies according to product: a bias in benchmarking using indirect hand hygiene indicator.

    Science.gov (United States)

    Girard, Raphaële; Aupee, Martine; Erb, Martine; Bettinger, Anne; Jouve, Alice

    2012-12-01

    The 3 ml volume currently used as the hand hygiene (HH) measure has been explored as the pertinent dose for an indirect indicator of HH compliance. A multicenter study was conducted in order to ascertain the required dose using different products. The average contact duration before drying was measured and compared with references. Effective hand coverage had to include the whole hand and the wrist. Two durations were chosen as points of reference: 30 s, as given by guidelines, and the duration validated by the European standard EN 1500. Each product was to be tested, using standardized procedures, by three nosocomial infection prevention teams, for three different doses (3, 2 and 1.5 ml). Data from 27 products and 1706 tests were analyzed. Depending on the product, the dose needed to ensure a 30-s contact duration in 75% of tests ranged from 2 ml to more than 3 ml, and the dose needed to ensure a contact duration exceeding the EN 1500 times in 75% of tests ranged from 1.5 ml to more than 3 ml. The interpretation is as follows: if different products are used, the volume applied does not give an unbiased estimate of HH compliance. Other compliance evaluation methods remain necessary for efficient benchmarking. Copyright © 2012 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  13. Nernst-Planck Based Description of Transport, Coulombic Interactions and Geochemical Reactions in Porous Media: Modeling Approach and Benchmark Experiments

    DEFF Research Database (Denmark)

    Rolle, Massimo; Sprocati, Riccardo; Masi, Matteo

    2018-01-01

    ‐ but also under advection‐dominated flow regimes. To accurately describe charge effects in flow‐through systems, we propose a multidimensional modeling approach based on the Nernst‐Planck formulation of diffusive/dispersive fluxes. The approach is implemented with a COMSOL‐PhreeqcRM coupling allowing us......, and high‐resolution experimental datasets. The latter include flow‐through experiments that have been carried out in this study to explore the effects of electrostatic interactions in fully three‐dimensional setups. The results of the simulations show excellent agreement for all the benchmark problems...... the quantification and visualization of the specific contributions to the diffusive/dispersive Nernst‐Planck fluxes, including the Fickian component, the term arising from the activity coefficient gradients, and the contribution due to electromigration.
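
    For reference, the standard Nernst-Planck flux decomposition that the abstract alludes to (Fickian term, activity-coefficient term and electromigration term) can be written as follows; the notation is generic and not necessarily that of the cited study.

```latex
% Standard Nernst-Planck flux decomposition for species i (generic form):
% Fickian term, activity-coefficient term, and electromigration term.
\[
\mathbf{J}_i \;=\;
\underbrace{-D_i \nabla c_i}_{\text{Fickian}}
\;\underbrace{-\,D_i c_i \nabla \ln \gamma_i}_{\text{activity gradients}}
\;\underbrace{-\,\frac{z_i F}{RT}\, D_i c_i \nabla \phi}_{\text{electromigration}}
\]
```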

  14. Benchmark experiments of dose distributions in phantom placed behind iron and concrete shields at the TIARA facility

    International Nuclear Information System (INIS)

    Nakane, Yoshihiro; Sakamoto, Yukio; Tsuda, Shuichi

    2004-01-01

    To verify the calculation methods used for the evaluation of neutron dose in the radiation shielding design of the high-intensity proton accelerator facility (J-PARC), dose distributions in a 30x30x30 cm3 slab plastic phantom placed behind iron and concrete test shields were measured using a tissue-equivalent proportional counter, for 65-MeV quasi-monoenergetic neutrons generated from the 7Li(p,n) reaction with 68-MeV protons at the TIARA facility. Dose distributions in the phantom were calculated by using the MCNPX and the NMTC/JAM-MCNP codes with the flux-to-dose conversion coefficients prepared for the shielding design of the facility. The comparison shows that the calculated results were in good agreement with the measured ones, within 20%. (author)
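
    The dose evaluation described above folds calculated neutron fluence spectra with flux-to-dose conversion coefficients. A minimal group-wise sketch of that folding step is shown below; both the fluence values and the coefficients are hypothetical placeholders, not TIARA benchmark data.

```python
import numpy as np

# Folding a group-wise neutron fluence spectrum with fluence-to-dose
# conversion coefficients (hypothetical values for illustration only).
fluence = np.array([2.1e4, 8.5e3, 3.2e3, 9.0e2])          # n/cm^2 per energy group
h_coeff = np.array([4.0e-10, 2.5e-10, 3.3e-10, 4.9e-10])  # Sv.cm^2 per neutron

dose = np.sum(fluence * h_coeff)   # dose in Sv, summed over energy groups
print(f"folded dose: {dose:.3e} Sv")
```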

  15. Benchmarking whole-building energy performance with multi-criteria technique for order preference by similarity to ideal solution using a selective objective-weighting approach

    International Nuclear Information System (INIS)

    Wang, Endong

    2015-01-01

    Highlights: • A TOPSIS based multi-criteria whole-building energy benchmarking is developed. • A selective objective-weighting procedure is used for a cost-accuracy tradeoff. • Results from a real case validated the benefits of the presented approach. - Abstract: This paper develops a robust multi-criteria Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) based building energy efficiency benchmarking approach. The approach is explicitly selective to address multicollinearity trap due to the subjectivity in selecting energy variables by considering cost-accuracy trade-off. It objectively weights the relative importance of individual pertinent efficiency measuring criteria using either multiple linear regression or principal component analysis contingent on meta data quality. Through this approach, building energy performance is comprehensively evaluated and optimized. Simultaneously, the significant challenges associated with conventional single-criterion benchmarking models can be avoided. Together with a clustering algorithm on a three-year panel dataset, the benchmarking case of 324 single-family dwellings demonstrated an improved robustness of the presented multi-criteria benchmarking approach over the conventional single-criterion ones
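
    The ranking step of a TOPSIS-based benchmark can be sketched as follows. The criterion weights here are simply given, whereas the cited approach derives them from multiple linear regression or principal component analysis; the building data are hypothetical.

```python
import numpy as np

def topsis(X, weights, benefit):
    """Generic TOPSIS ranking.

    X        : (n_buildings, n_criteria) decision matrix
    weights  : criterion weights summing to 1
    benefit  : boolean array, True if larger is better for that criterion
    Returns the closeness coefficient of each alternative (higher = better).
    """
    R = X / np.linalg.norm(X, axis=0)          # vector normalization per criterion
    V = R * weights                            # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal solution
    return d_neg / (d_pos + d_neg)

# Hypothetical example: 4 dwellings scored on energy use intensity (lower is
# better), annual energy cost (lower is better) and a comfort index (higher is better)
X = np.array([[120.0, 1500.0, 0.80],
              [ 95.0, 1700.0, 0.75],
              [140.0, 1300.0, 0.85],
              [110.0, 1450.0, 0.70]])
scores = topsis(X, weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([False, False, True]))
print(scores.argsort()[::-1])   # ranking of dwellings, best first
```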

  16. Radiation dose in mammography: an energy-balance approach

    International Nuclear Information System (INIS)

    Shrivastava, P.N.

    1981-01-01

    An energy-balance approach for calculation of mean, integral, and midpoint doses in mammography is introduced. Estimation of mean absorbed dose for individual applications is described. Calculations made for a range of xeromammographic techniques used at various breast cancer detection centers show that although increasing the beam h.v.l. dramatically decreases breast surface exposure, it is insignificant in lowering mean breast dose or radiation risk. Thus selection of a moderate h.v.l. to optimize image quality in xeromammography may be more beneficial than unduly increasing h.v.l. merely to reduce surface exposure. The mean breast dose per mammogram with low h.v.l. screen-film techniques was 3 to 9 times lower than for xeromammography, suggesting that general acceptance of screen-film techniques can significantly reduce the risk associated with mammography

  17. Radiation dose in mammography: an energy-balance approach

    International Nuclear Information System (INIS)

    Shrivastava, P.N.

    1981-01-01

    An energy-balance approach for calculation of mean, integral, and midpoint doses in mammography is introduced. Estimation of mean absorbed dose for individual applications is described. Differences in breast composition and thickness are accounted for by simple measurements of entrance and exit exposures. Calculations made for a range of xeromammographic techniques used at various breast cancer detection centers show that although increasing the beam h.v.l. dramatically decreases breast surface exposure, it is insignificant in lowering mean breast dose or radiation risk. Thus selection of a moderate h.v.l. to optimize image quality (soft-tissue contrast) in xeromammography may be more beneficial than unduly increasing h.v.l. merely to reduce surface exposure. The mean breast dose per mammogram with low-h.v.l. screen-film techniques was 3 to 9 times lower than for xeromammography, suggesting that general acceptance of screen-film techniques can significantly reduce the risk associated with mammography

  18. Library Benchmarking

    Directory of Open Access Journals (Sweden)

    Wiji Suwarno

    2017-02-01

    Full Text Available The term benchmarking is encountered in the implementation of total quality management (TQM), termed holistic quality management in Indonesian, because benchmarking is a tool for finding ideas and learning from other libraries. Benchmarking is a systematic and continuous process of measuring and comparing an organization's business processes in order to obtain information that can help the organization improve its performance.

  19. Two NEA sensitivity, 1-D benchmark calculations. Part I: Sensitivity of the dose rate at the outside of a PWR configuration and of the vessel damage

    International Nuclear Information System (INIS)

    Canali, U.; Gonano, G.; Nicks, R.

    1978-01-01

    Within the framework of the coordinated programme of sensitivity analysis studies, the reactor shielding benchmark calculation concerning the shield of a typical Pressurized Water Reactor, as proposed by I.K.E. (Stuttgart) and K.W.U. (Erlangen), has been performed. The direct and adjoint fluxes were calculated using ANISN, the cross-section sensitivity using SWANLAKE. The cross-section library used was EL4 (100 neutron + 19 gamma groups). The following quantities were of interest: neutron damage in the pressure vessel; dose rate outside the concrete shield. SWANLAKE was used to calculate the sensitivity of the above-mentioned results to variations in the density of each nuclide present. The contributions of the different cross-section Legendre components are also given. Sensitivity profiles indicate the energy ranges in which a cross-section variation has a greater influence on the results. (author)

  20. A survey-based benchmarking approach for health care using the Baldrige quality criteria.

    Science.gov (United States)

    Jennings, K; Westfall, F

    1994-09-01

    Since 1988, manufacturing and service industries have been using the Malcolm Baldrige National Quality Award to assess their management processes (for example, leadership, information, and analysis) against critical performance criteria. Recognizing that the typical Baldrige assessment is time intensive and dependent on intensive training, The Pacer Group, a consulting firm in Dayton, Ohio, developed a self-assessment tool based on the Baldrige criteria which provides a snapshot assessment of an organization's management practices. The survey was administered at 25 hospitals within a health care system. Hospitals were able to compare their scores with other hospitals in the system, as well as the scores of a Baldrige award winner. Results were also analyzed on a systemwide basis to identify strengths and weaknesses across the system. For all 25 hospitals, the following areas were identified as strengths: management of process quality, leadership, and customer focus and satisfaction. Weaknesses included lack of employee involvement in the quality planning process, poor design of quality systems, and lack of cross-departmental cooperation. One of the surveyed hospitals launched improvement initiatives in knowledge of improvement tools and methods and in a patient satisfaction focus. A team was formed to improve the human resource management system. Also, a new unit was designed using patient-centered care principles. A team re-evaluated every operation that affected patients on the unit. A survey modeled after the Baldrige Award criteria can be useful in benchmarking an organization's quality improvement practices.

  1. [An approach to care indicators benchmarking. Learning to improve patient safety].

    Science.gov (United States)

    de Andrés Gimeno, B; Salazar de la Guerra, R M; Ferrer Arnedo, C; Revuelta Zamorano, M; Ayuso Murillo, D; González Soria, J

    2014-01-01

    Improvements in clinical safety can be achieved by promoting a safety culture, professional training, and learning through benchmarking. The aim of this study was to identify areas for improvement after analysing the safety indicators in two public hospitals in the North-West Madrid Region. A descriptive study was performed during 2011 in Hospital Universitario Puerta de Hierro Majadahonda (HUPHM) and Hospital de Guadarrama (HG). The variables under study were 40 indicators of nursing care related to patient safety. Nineteen of them were defined in the SENECA project as care quality standards intended to improve patient safety in hospitals. The data collected were clinical histories, Madrid Health Service assessment reports, care procedures, and direct observation. Of the 40 indicators, 22 were structural indicators (procedures), of which HUPHM met 86% and HG 95%; 14 were process indicators (training and protocol compliance), with similar results in both hospitals apart from continuity-of-care reports and training in hand hygiene; and the 4 results indicators (pressure ulcers, falls and pain) showed different results. The analysis of the indicators allowed the following actions to be taken: identifying improvements to be made in each hospital, developing joint safety recommendations in nursing care protocols for the prevention and treatment of chronic wounds, establishing systematic pain assessments, and preparing continuity-of-care reports on all patients transferred from HUPHM to HG. Copyright © 2013 SECA. Published by Elsevier Espana. All rights reserved.

  2. Antibiotic reimbursement in a model delinked from sales: a benchmark-based worldwide approach.

    Science.gov (United States)

    Rex, John H; Outterson, Kevin

    2016-04-01

    Despite the life-saving ability of antibiotics and their importance as a key enabler of all of modern health care, their effectiveness is now threatened by a rising tide of resistance. Unfortunately, the antibiotic pipeline does not match health needs because of challenges in discovery and development, as well as the poor economics of antibiotics. Discovery and development are being addressed by a range of public-private partnerships; however, correcting the poor economics of antibiotics will need an overhaul of the present business model on a worldwide scale. Discussions are now converging on delinking reward from antibiotic sales through prizes, milestone payments, or insurance-like models in which innovation is rewarded with a fixed series of payments of a predictable size. Rewarding all drugs with the same payments could create perverse incentives to produce drugs that provide the least possible innovation. Thus, we propose a payment model using a graded array of benchmarked rewards designed to encourage the development of antibiotics with the greatest societal value, together with appropriate worldwide access to antibiotics to maximise human health. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Study of blood flow in several benchmark micro-channels using a two-fluid approach

    Science.gov (United States)

    Wu, Wei-Tao; Yang, Fang; Antaki, James F.; Aubry, Nadine; Massoudi, Mehrdad

    2015-01-01

    It is known that in a vessel whose characteristic dimension (e.g., its diameter) is in the range of 20 to 500 microns, blood behaves as a non-Newtonian fluid, exhibiting complex phenomena, such as shear-thinning, stress relaxation, and also multi-component behaviors, such as the Fahraeus effect, plasma-skimming, etc. For describing these non-Newtonian and multi-component characteristics of blood, using the framework of mixture theory, a two-fluid model is applied, where the plasma is treated as a Newtonian fluid and the red blood cells (RBCs) are treated as shear-thinning fluid. A computational fluid dynamic (CFD) simulation incorporating the constitutive model was implemented using OpenFOAM® in which benchmark problems including a sudden expansion and various driven slots and crevices were studied numerically. The numerical results exhibited good agreement with the experimental observations with respect to both the velocity field and the volume fraction distribution of RBCs. PMID:26240438

  4. Benchmark Dose Modeling Estimates of the Concentrations of Inorganic Arsenic That Induce Changes to the Neonatal Transcriptome, Proteome, and Epigenome in a Pregnancy Cohort.

    Science.gov (United States)

    Rager, Julia E; Auerbach, Scott S; Chappell, Grace A; Martin, Elizabeth; Thompson, Chad M; Fry, Rebecca C

    2017-10-16

    Prenatal inorganic arsenic (iAs) exposure influences the expression of critical genes and proteins associated with adverse outcomes in newborns, in part through epigenetic mediators. The doses at which these genomic and epigenomic changes occur have yet to be evaluated in the context of dose-response modeling. The goal of the present study was to estimate iAs doses that correspond to changes in transcriptomic, proteomic, epigenomic, and integrated multi-omic signatures in human cord blood through benchmark dose (BMD) modeling. Genome-wide DNA methylation, microRNA expression, mRNA expression, and protein expression levels in cord blood were modeled against total urinary arsenic (U-tAs) levels from pregnant women exposed to varying levels of iAs. Dose-response relationships were modeled in BMDExpress, and BMDs representing 10% response levels were estimated. Overall, DNA methylation changes were estimated to occur at lower exposure concentrations in comparison to other molecular endpoints. Multi-omic module eigengenes were derived through weighted gene co-expression network analysis, representing co-modulated signatures across transcriptomic, proteomic, and epigenomic profiles. One module eigengene was associated with decreased gestational age occurring alongside increased iAs exposure. Genes/proteins within this module eigengene showed enrichment for organismal development, including potassium voltage-gated channel subfamily Q member 1 (KCNQ1), an imprinted gene showing differential methylation and expression in response to iAs. Modeling of this prioritized multi-omic module eigengene resulted in a BMD(BMDL) of 58(45) μg/L U-tAs, which was estimated to correspond to drinking water arsenic concentrations of 51(40) μg/L. Results are in line with epidemiological evidence supporting effects of prenatal iAs at these exposure levels and demonstrate that iAs exposure influences neonatal outcome-relevant transcriptomic, proteomic, and epigenomic profiles.

  5. Local approach of cleavage fracture applied to a vessel with subclad flaw. A benchmark on computational simulation

    International Nuclear Information System (INIS)

    Moinereau, D.; Brochard, J.; Guichard, D.; Bhandari, S.; Sherry, A.; France, C.

    1996-10-01

    A benchmark on the computational simulation of a cladded vessel with a 6.2 mm sub-clad flaw submitted to a thermal transient has been conducted. Two-dimensional elastic and elastic-plastic finite element computations of the vessel have been performed by the different partners with respective finite element codes ASTER (EDF), CASTEM 2000 (CEA), SYSTUS (Framatome) and ABAQUS (AEA Technology). Main results have been compared: temperature field in the vessel, crack opening, opening stress at crack tips, stress intensity factor in cladding and base metal, Weibull stress σw and probability of failure in base metal, void growth rate R/R0 in cladding. This comparison shows an excellent agreement on main results, in particular on results obtained with local approach. (K.A.)
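
    For context, the Weibull stress and cleavage failure probability referred to above are usually written in the Beremin form below; this is the generic textbook form, and the partners' particular parameters (m, σu, V0) are not reproduced here.

```latex
% Beremin-type local approach quantities (generic form): Weibull stress
% over the plastic zone and the resulting cleavage failure probability.
\[
\sigma_w \;=\; \left( \sum_{j \in \text{plastic zone}}
    \sigma_{1,j}^{\,m} \, \frac{V_j}{V_0} \right)^{1/m},
\qquad
P_f \;=\; 1 - \exp\!\left[ -\left( \frac{\sigma_w}{\sigma_u} \right)^{m} \right]
\]
```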

  6. Interactive benchmarking

    DEFF Research Database (Denmark)

    Lawson, Lartey; Nielsen, Kurt

    2005-01-01

    We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional... in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence technical efficiency.

  7. RUNE benchmarks

    DEFF Research Database (Denmark)

    Peña, Alfredo

    This report contains the description of a number of benchmarks with the purpose of evaluating flow models for near-shore wind resource estimation. The benchmarks are designed based on the comprehensive database of observations that the RUNE coastal experiment established from onshore lidar...

  8. Benchmarking the stochastic time-dependent variational approach for excitation dynamics in molecular aggregates

    Energy Technology Data Exchange (ETDEWEB)

    Chorošajev, Vladimir [Department of Theoretical Physics, Faculty of Physics, Vilnius University, Sauletekio 9-III, 10222 Vilnius (Lithuania); Gelzinis, Andrius; Valkunas, Leonas [Department of Theoretical Physics, Faculty of Physics, Vilnius University, Sauletekio 9-III, 10222 Vilnius (Lithuania); Department of Molecular Compound Physics, Center for Physical Sciences and Technology, Sauletekio 3, 10222 Vilnius (Lithuania); Abramavicius, Darius, E-mail: darius.abramavicius@ff.vu.lt [Department of Theoretical Physics, Faculty of Physics, Vilnius University, Sauletekio 9-III, 10222 Vilnius (Lithuania)

    2016-12-20

    Highlights: • The Davydov ansatze can be used for finite temperature simulations with an extension. • The accuracy is high if the system is strongly coupled to the environmental phonons. • The approach can simulate time-resolved fluorescence spectra. - Abstract: The time-dependent variational approach is a convenient method to characterize the excitation dynamics in molecular aggregates for different strengths of system-bath interaction, and it does not require any additional perturbative schemes. Until recently, however, this method was only applicable in the zero-temperature case. It has become possible to extend this method to finite temperatures with the introduction of the stochastic time-dependent variational approach. Here we present a comparison between this approach and the exact hierarchical equations of motion approach for describing excitation dynamics in a broad range of temperatures. We calculate electronic population evolution, absorption and auxiliary time-resolved fluorescence spectra in different regimes and find that the stochastic approach shows excellent agreement with the exact approach when the system-bath coupling is sufficiently large and temperatures are high. The differences between the two methods are larger when temperatures are lower or the system-bath coupling is small.

  9. Benchmarking Density Functional Theory Approaches for the Description of Symmetry-Breaking in Long Polymethine Dyes

    KAUST Repository

    Gieseking, Rebecca L.; Ravva, Mahesh Kumar; Coropceanu, Veaceslav; Bredas, Jean-Luc

    2016-01-01

    in polar solvents. Using an approach based on LRC functionals, a reduction in the crossover length is found with increasing medium dielectric constant, which is related to localization of the excess charge on the end groups. Symmetry-breaking is associated

  10. A Bootstrap Approach of Benchmarking Organizational Maturity Model of Software Product With Educational Maturity Model

    OpenAIRE

    R.Manjula; J.Vaideeswaran

    2012-01-01

    Software product line engineering is an inter-disciplinary concept. It spans the dimensions of business, architecture, process, and organization. Similarly, education system engineering is also an inter-disciplinary concept, spanning the dimensions of academics, infrastructure, facilities, administration, etc. Some of the potential benefits of this approach include continuous improvement in system quality and adherence to global standards. The increasing competency in IT and Educati...

  11. WLUP benchmarks

    International Nuclear Information System (INIS)

    Leszczynski, Francisco

    2002-01-01

    The IAEA-WIMS Library Update Project (WLUP) is in its final stage. The final library will be released in 2002. It is the result of research and development carried out by more than ten investigators over 10 years. The organization of benchmarks for testing and choosing the best set of data has been coordinated by the author of this paper. The organization, naming conventions, contents and documentation of the WLUP benchmarks are presented, together with an updated list of the main parameters for all cases. First, the benchmark objectives and types are given. Then, comparisons of results from different WIMSD libraries are included. Finally, the program QVALUE for the analysis and plotting of results is described. Some examples are given. The set of benchmarks implemented in this work is a fundamental tool for testing new multigroup libraries. (author)

  12. Exploring trade-offs between VMAT dose quality and delivery efficiency using a network optimization approach

    International Nuclear Information System (INIS)

    Salari, Ehsan; Craft, David; Wala, Jeremiah

    2012-01-01

    To formulate and solve the fluence-map merging procedure of the recently-published VMAT treatment-plan optimization method, called vmerge, as a bi-criteria optimization problem. Using an exact merging method rather than the previously-used heuristic, we are able to better characterize the trade-off between the delivery efficiency and dose quality. vmerge begins with a solution of the fluence-map optimization problem with 180 equi-spaced beams that yields the ‘ideal’ dose distribution. Neighboring fluence maps are then successively merged, meaning that they are added together and delivered as a single map. The merging process improves the delivery efficiency at the expense of deviating from the initial high-quality dose distribution. We replace the original merging heuristic by considering the merging problem as a discrete bi-criteria optimization problem with the objectives of maximizing the treatment efficiency and minimizing the deviation from the ideal dose. We formulate this using a network-flow model that represents the merging problem. Since the problem is discrete and thus non-convex, we employ a customized box algorithm to characterize the Pareto frontier. The Pareto frontier is then used as a benchmark to evaluate the performance of the standard vmerge algorithm as well as two other similar heuristics. We test the exact and heuristic merging approaches on a pancreas and a prostate cancer case. For both cases, the shape of the Pareto frontier suggests that starting from a high-quality plan, we can obtain efficient VMAT plans through merging neighboring fluence maps without substantially deviating from the initial dose distribution. The trade-off curves obtained by the various heuristics are contrasted and shown to all be equally capable of initial plan simplifications, but to deviate in quality for more drastic efficiency improvements. This work presents a network optimization approach to the merging problem. Contrasting the trade-off curves of the

  13. Exploring trade-offs between VMAT dose quality and delivery efficiency using a network optimization approach.

    Science.gov (United States)

    Salari, Ehsan; Wala, Jeremiah; Craft, David

    2012-09-07

    To formulate and solve the fluence-map merging procedure of the recently-published VMAT treatment-plan optimization method, called VMERGE, as a bi-criteria optimization problem. Using an exact merging method rather than the previously-used heuristic, we are able to better characterize the trade-off between the delivery efficiency and dose quality. VMERGE begins with a solution of the fluence-map optimization problem with 180 equi-spaced beams that yields the 'ideal' dose distribution. Neighboring fluence maps are then successively merged, meaning that they are added together and delivered as a single map. The merging process improves the delivery efficiency at the expense of deviating from the initial high-quality dose distribution. We replace the original merging heuristic by considering the merging problem as a discrete bi-criteria optimization problem with the objectives of maximizing the treatment efficiency and minimizing the deviation from the ideal dose. We formulate this using a network-flow model that represents the merging problem. Since the problem is discrete and thus non-convex, we employ a customized box algorithm to characterize the Pareto frontier. The Pareto frontier is then used as a benchmark to evaluate the performance of the standard VMERGE algorithm as well as two other similar heuristics. We test the exact and heuristic merging approaches on a pancreas and a prostate cancer case. For both cases, the shape of the Pareto frontier suggests that starting from a high-quality plan, we can obtain efficient VMAT plans through merging neighboring fluence maps without substantially deviating from the initial dose distribution. The trade-off curves obtained by the various heuristics are contrasted and shown to all be equally capable of initial plan simplifications, but to deviate in quality for more drastic efficiency improvements. This work presents a network optimization approach to the merging problem. Contrasting the trade-off curves of the merging

  14. Analysis Approach and Data Package for Mayak Public Doses

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Napier, Bruce A.

    2013-09-18

    Historical activities at facilities producing nuclear materials for weapons released radioactivity into the air and water. Past studies in the United States have evaluated the release, atmospheric transport and environmental accumulation of 131I from the nuclear facilities at Hanford in Washington State and the resulting dose to members of the public (Farris et al. 1994). A multi-year dose reconstruction effort (Mokrov et al. 2004) is also being conducted to produce representative dose estimates for members of the public living near Mayak, Russia, from atmospheric releases of 131I at the facilities of the Mayak Production Association. The approach to calculating individual doses to members of the public from historical releases of airborne 131I has the following general steps: • Construct estimates of releases of 131I to the air from production facilities. • Model the transport of 131I in the air and subsequent deposition on the ground and vegetation. • Model the accumulation of 131I in soil, water and food products (environmental media). • Calculate the dose for an individual by matching the appropriate lifestyle and consumption data for the individual to the concentrations of 131I in environmental media at their residence location. A number of computer codes were developed to facilitate the study of airborne 131I emissions at Hanford. Of particular interest is the DESCARTES code, which modeled the accumulation of 131I in environmental media (Miley et al. 1994). In addition, the CIDER computer code estimated annual doses to individuals (Eslinger et al. 1994) using the equations and parameters specific to Hanford (Snyder et al. 1994). Several of the computer codes developed to model 131I releases from Hanford are general enough to be used for other facilities. Additional codes have been developed, including the new individual dose code CiderF (Eslinger and Napier 2013), and applied to historical releases of 131I from Mayak. This document provides a data package that

  15. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

    OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on

  16. Benchmarking electricity distribution

    Energy Technology Data Exchange (ETDEWEB)

    Watts, K. [Department of Justice and Attorney-General, QLD (Australia)

    1995-12-31

    Benchmarking has been described as a method of continuous improvement that involves an ongoing and systematic evaluation and incorporation of external products, services and processes recognised as representing best practice. It is a management tool similar to total quality management (TQM) and business process re-engineering (BPR), and is best used as part of a total package. This paper discusses benchmarking models and approaches and suggests a few key performance indicators that could be applied to benchmarking electricity distribution utilities. Some recent benchmarking studies are used as examples and briefly discussed. It is concluded that benchmarking is a strong tool to be added to the range of techniques that can be used by electricity distribution utilities and other organizations in search of continuous improvement, and that there is now a high level of interest in Australia. Benchmarking represents an opportunity for organizations to approach learning from others in a disciplined and highly productive way, which will complement the other micro-economic reforms being implemented in Australia. (author). 26 refs.

  17. A Semiempirical Approach to the Determination of Daily Erythemal Doses.

    Science.gov (United States)

    Silva, Abel A; Yamamoto, Ana L C; Corrêa, Marcelo P

    2018-02-15

    The maintenance of ground-based instruments to measure the incidence of ultraviolet radiation (UVR) from the Sun demands strict and well-developed procedures. A piece of equipment can be out of service for a couple of weeks or months for calibration, repair or even the improvement of the facilities where it has been set up. However, the replacement of an instrument in such circumstances can be logistically and financially prohibitive. On the other hand, the lack of data can jeopardize a long-term experiment. In this study, we introduce a semiempirical approach to the determination of the theoretical daily erythemal dose (DEDt) for periods of instrumental absence at a tropical site. The approach is based on 5 years of ground-based measurements of daily erythemal dose (DED) linearly correlated with parameters of total ozone column (TOC) and reflectivity (RPC) from the Ozone Monitoring Instrument (OMI) and the cosine of the solar zenith angle at noon (SZAn). Seventeen months of missing ground-based data were replaced with DEDt, leading to a complete 5-year series of data. The lowest and the highest values of typical DED were 2411 ± 322 J m-2 (1σ) (winter) and 5263 ± 997 J m-2 (summer). The monthly integrated erythemal dose (mED) varied from 59 kJ m-2 (winter) to 162 kJ m-2 (summer). Both of them depended mainly on cos(SZAn) and RPC. The 12-month integrated erythemal dose (12-ED) ranged from 1350 kJ m-2 to 1546 kJ m-2, but it can depend significantly on other atmospheric parameters (perhaps aerosols) not explicitly considered here. © 2018 The American Society of Photobiology.
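
    A minimal sketch of the semiempirical step, regressing DED on the three predictors named in the abstract, is shown below; the data arrays are hypothetical stand-ins for the 5-year ground-based record, and the study's actual fitted coefficients are not reproduced.

```python
import numpy as np

# Fit daily erythemal dose (J/m^2) against total ozone column (DU),
# reflectivity (%) and cos of the noon solar zenith angle.
# All values below are illustrative placeholders.
toc      = np.array([255., 262., 270., 280., 268.])
r_pc     = np.array([12., 35., 8., 50., 22.])
cos_szan = np.array([0.98, 0.92, 0.85, 0.75, 0.95])
ded      = np.array([5100., 4300., 4200., 2600., 4700.])

X = np.column_stack([np.ones_like(toc), toc, r_pc, cos_szan])
coef, *_ = np.linalg.lstsq(X, ded, rcond=None)   # ordinary least squares

def ded_t(toc, r_pc, cos_szan, coef=coef):
    """Theoretical daily erythemal dose for days with no ground data."""
    return coef[0] + coef[1] * toc + coef[2] * r_pc + coef[3] * cos_szan
```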

  18. Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China: A Bootstrap-Data Envelopment Analysis Approach.

    Science.gov (United States)

    Li, Hao; Dong, Siping

    2015-01-01

    China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international world in relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied into afterward research to measure relative efficiency and productivity of Chinese hospitals so as to better serve for efficiency improvement and related decision making. © The Author(s) 2015.

  19. Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems

    Science.gov (United States)

    Demir, I.; Sermet, M. Y.; Sit, M. A.

    2016-12-01

    Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large scale spatial data from various gauges and sensor networks. The collection of environmental data increased demand for applications which are capable of managing and processing large-scale and high-resolution data sets. With the amount and resolution of data sets provided, one of the challenging tasks for organizing and customizing hydrological data sets is delineation of watersheds on demand. Watershed delineation is a process for creating a boundary that represents the contributing area for a specific control point or water outlet, with intent of characterization and analysis of portions of a study area. Although many GIS tools and software for watershed analysis are available on desktop systems, there is a need for web-based and client-side techniques for creating a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrated several watershed delineation techniques on the web with various techniques implemented on the client-side using JavaScript and WebGL, and on the server-side using Python and C++. We also developed a client-side GPGPU (General Purpose Graphical Processing Unit) algorithm to analyze high-resolution terrain data for watershed delineation which allows parallelization using the GPU. The web-based real-time analysis of watershed segmentation can be helpful for decision-makers and interested stakeholders while eliminating the need of installing complex software packages and dealing with large-scale data sets. Utilization of the client-side hardware resources also eliminates the need for servers due to its crowdsourcing nature. Our goal for future work is to improve other hydrologic analysis methods such as rain flow tracking by adapting the presented approaches.

  20. Regulatory Benchmarking

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    2017-01-01

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...

  1. Regulatory Benchmarking

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    2017-01-01

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...

  2. Dose selection based on physiologically based pharmacokinetic (PBPK) approaches.

    Science.gov (United States)

    Jones, Hannah M; Mayawala, Kapil; Poulin, Patrick

    2013-04-01

    Physiologically based pharmacokinetic (PBPK) models are built using differential equations to describe the physiology/anatomy of different biological systems. Readily available in vitro and in vivo preclinical data can be incorporated into these models to not only estimate pharmacokinetic (PK) parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. They provide a mechanistic framework to understand and extrapolate PK and dose across in vitro and in vivo systems and across different species, populations and disease states. Using small molecule and large molecule examples from the literature and our own company, we have shown how PBPK techniques can be utilised for human PK and dose prediction. Such approaches have the potential to increase efficiency, reduce the need for animal studies, replace clinical trials and increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising, however some limitations need to be addressed to realise its application and utility more broadly.
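
    As a minimal illustration of the differential-equation structure of a PBPK model, the sketch below integrates a flow-limited model with two tissues and a blood pool; all flows, volumes, partition coefficients and the clearance value are illustrative, not drug- or study-specific.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal flow-limited PBPK sketch (liver, rest-of-body, venous blood).
Q_liv, Q_rest = 90.0, 210.0               # tissue blood flows (L/h)
V_liv, V_rest, V_blood = 1.8, 38.0, 5.0   # compartment volumes (L)
Kp_liv, Kp_rest = 4.0, 1.5                # tissue:plasma partition coefficients
CL_int = 30.0                             # hepatic intrinsic clearance (L/h)

def pbpk(t, y):
    c_blood, c_liv, c_rest = y
    # Blood receives venous return from tissues and sends arterial blood out
    dc_blood = (Q_liv * c_liv / Kp_liv + Q_rest * c_rest / Kp_rest
                - (Q_liv + Q_rest) * c_blood) / V_blood
    # Liver: perfusion-limited uptake plus metabolic elimination
    dc_liv = (Q_liv * (c_blood - c_liv / Kp_liv)
              - CL_int * c_liv / Kp_liv) / V_liv
    dc_rest = Q_rest * (c_blood - c_rest / Kp_rest) / V_rest
    return [dc_blood, dc_liv, dc_rest]

# 100 mg IV bolus into blood, simulated over 24 h
y0 = [100.0 / V_blood, 0.0, 0.0]
sol = solve_ivp(pbpk, (0.0, 24.0), y0, dense_output=True)
```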

  3. Dose-response curve estimation: a semiparametric mixture approach.

    Science.gov (United States)

    Yuan, Ying; Yin, Guosheng

    2011-12-01

    In the estimation of a dose-response curve, parametric models are straightforward and efficient but subject to model misspecifications; nonparametric methods are robust but less efficient. As a compromise, we propose a semiparametric approach that combines the advantages of parametric and nonparametric curve estimates. In a mixture form, our estimator takes a weighted average of the parametric and nonparametric curve estimates, in which a higher weight is assigned to the estimate with a better model fit. When the parametric model assumption holds, the semiparametric curve estimate converges to the parametric estimate and thus achieves high efficiency; when the parametric model is misspecified, the semiparametric estimate converges to the nonparametric estimate and remains consistent. We also consider an adaptive weighting scheme to allow the weight to vary according to the local fit of the models. We conduct extensive simulation studies to investigate the performance of the proposed methods and illustrate them with two real examples. © 2011, The International Biometric Society.
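
    A rough sketch of the mixture idea, a weighted average of a parametric fit and a nonparametric smoother with the weight driven by goodness of fit, is given below. The published estimator uses a formal assessment of the parametric model (and an adaptive local weight); the simple residual-based weight here is only illustrative, as are the data.

```python
import numpy as np
from scipy.optimize import curve_fit

def emax(d, e0, emax_, ed50):
    """Parametric candidate: Emax dose-response model."""
    return e0 + emax_ * d / (ed50 + d)

def kernel_smooth(d_grid, d, y, bw=0.5):
    """Nadaraya-Watson smoother as the nonparametric estimate."""
    w = np.exp(-0.5 * ((d_grid[:, None] - d[None, :]) / bw) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

# Hypothetical dose-response data
d = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
y = np.array([1.0, 1.8, 2.9, 4.0, 4.9, 5.2])

p, _ = curve_fit(emax, d, y, p0=[1.0, 5.0, 1.0])
grid = np.linspace(d.min(), d.max(), 50)
f_par = emax(grid, *p)                 # parametric curve estimate
f_np = kernel_smooth(grid, d, y)       # nonparametric curve estimate

# Crude fit-based weight: smaller parametric residuals -> more weight on
# the parametric curve (a stand-in for the paper's formal weighting).
rss_par = np.sum((emax(d, *p) - y) ** 2)
rss_np = np.sum((kernel_smooth(d, d, y) - y) ** 2)
w = rss_np / (rss_par + rss_np)
f_semi = w * f_par + (1.0 - w) * f_np  # semiparametric mixture estimate
```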

  4. The Precautionary Principle and statistical approaches to uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2003-01-01

    Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationship; environmental standards; exposure measurement uncertainty; Popper falsification

  5. The Precautionary Principle and Statistical Approaches to Uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2005-01-01

    Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationships; environmental standards; exposure measurement uncertainty; Popper falsification

  6. A mathematical approach to optimal selection of dose values in the additive dose method of EPR dosimetry

    International Nuclear Information System (INIS)

    Hayes, R.B.; Haskell, E.H.; Kenner, G.H.

    1996-01-01

    Additive dose methods commonly used in electron paramagnetic resonance (EPR) dosimetry are time consuming and labor intensive. We have developed a mathematical approach for determining the optimal spacing of applied doses and the number of spectra which should be taken at each dose level. Expected uncertainties in the data points are assumed to be normally distributed with a fixed standard deviation, and linearity of dose response is also assumed. The optimum spacing and number of points necessary for the minimal error can be estimated, as can the likely error in the resulting estimate. When low doses are being estimated for tooth enamel samples, the optimal spacing is shown to be a concentration of points near the zero dose value with fewer spectra taken at a single high dose value within the range of known linearity. Optimization of the analytical process results in increased accuracy and sample throughput.
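
    The intuition behind the concentrated design can be checked with a small Monte Carlo sketch: under an assumed linear response with fixed Gaussian noise, the natural dose is recovered as the dose-axis intercept of a straight-line fit, and its spread is compared for two designs with the same number of spectra. All numerical values below are invented for illustration, not the authors' formalism.

        # Monte Carlo comparison of two additive-dose designs with the same
        # total number of EPR spectra, assuming a linear response and fixed
        # Gaussian measurement noise.  True dose, slope and sigma are assumed.
        import numpy as np

        rng = np.random.default_rng(0)
        true_dose, slope, sigma, n_rep = 0.5, 1.0, 0.05, 5000

        def dose_spread(doses):
            """Std. dev. of the recovered natural dose for a given design."""
            estimates = []
            for _ in range(n_rep):
                signal = slope * (true_dose + doses) + rng.normal(0, sigma, doses.size)
                b, a = np.polyfit(doses, signal, 1)   # signal = a + b * added_dose
                estimates.append(a / b)               # intercept on the dose axis
            return np.std(estimates)

        even     = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=float)
        clustered = np.array([0, 0, 0, 0, 0, 0, 0, 9, 9, 9], dtype=float)  # near zero + one high dose

        print("evenly spaced design :", dose_spread(even))
        print("zero/high-dose design:", dose_spread(clustered))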

  7. OECD/DOE/CEA VVER-1000 coolant transient (V1000CT) benchmark - a consistent approach for assessing coupled codes for RIA analysis

    International Nuclear Information System (INIS)

    Boyan D Ivanov; Kostadin N Ivanov; Eric Royer; Sylvie Aniel; Nikola Kolev; Pavlin Groudev

    2005-01-01

    Full text of publication follows: The Rod Ejection Accident (REA) and Main Steam Line Break (MSLB) are two of the most important Design Basis Accidents (DBA) for the VVER-1000, exhibiting significant localized space-time effects. A consistent approach for assessing coupled three-dimensional (3-D) neutron kinetics/thermal hydraulics codes for these Reactivity Insertion Accidents (RIA) is to first validate the codes using the available plant test (measured) data and then perform cross-code comparative analyses for REA and MSLB scenarios. In the framework of a joint effort between the Nuclear Energy Agency (NEA) of the OECD, the United States Department of Energy (US DOE), and the Commissariat a l'Energie Atomique (CEA), France, a coupled 3-D neutron kinetics/thermal hydraulics benchmark was defined. The benchmark is based on data from Unit 6 of the Bulgarian Kozloduy Nuclear Power Plant (NPP). In performing this work, PSU (USA) and CEA-Saclay (France) collaborated with Bulgarian organizations, in particular the KNPP and the INRNE. The benchmark consists of two phases: Phase 1: Main Coolant Pump Switching On; Phase 2: Coolant Mixing Tests and MSLB. In addition to the measured (experiment) scenario, an extreme calculation scenario was defined for more rigorous testing of 3-D neutronics/thermal-hydraulics techniques: a rod ejection simulation with the control rod ejected in the core sector cooled by the switched-on MCP. Since the previous coupled code benchmarks indicated that further development of the mixing computation models in the integrated codes is necessary, a coolant mixing experiment and MSLB transients were selected for simulation in Phase 2 of the benchmark. The MSLB event is characterized by a large asymmetric cooling of the core, stuck rods and a large primary coolant flow variation. Two scenarios are defined in Phase 2: the first scenario is taken from the current licensing practice and the second one is derived from the original one using aggravating

  8. Bayesian approach in MN low dose of radiation counting

    International Nuclear Information System (INIS)

    Serna Berna, A.; Alcaraz, M.; Acevedo, C.; Navarro, J. L.; Alcanzar, M. D.; Canteras, M.

    2006-01-01

    The micronucleus (MN) assay in lymphocytes is a well established technique for the assessment of genetic damage induced by ionizing radiation. Due to the presence of a natural background of MN, the net MN count is obtained by subtracting this background value from the gross value. When very low doses of radiation are given, the induced MN count is close to, or even lower than, the predetermined background value. Furthermore, the damage distribution induced by the radiation follows a Poisson probability distribution. These two facts make it difficult to obtain the net counting rate in exposed situations. It is possible to overcome this problem using a Bayesian approach, in which the selection of a priori distributions for the background and net counting rate plays an important role. In the present work we present a detailed analysis using Bayesian theory to infer the net counting rate in two different situations: (a) when the background is known for an individual sample, using the exact value for the background and a Jeffreys prior for the net counting rate, and (b) when the background is not known, making use of a population background distribution as the background prior function and a constant prior for the net counting rate. (Author)
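
    A minimal grid-approximation sketch of case (a) is given below: the gross count is modelled as Poisson(background + net) with the background treated as exactly known and a Jeffreys prior on the net rate. The counts and background value are assumed for illustration; the grid posterior only shows the mechanics, not the authors' exact treatment.

        # Case (a): gross MN count ~ Poisson(background + net), background known,
        # Jeffreys prior ~ 1/sqrt(net) on the net rate.  Numbers are illustrative.
        import numpy as np
        from scipy.stats import poisson

        gross_count = 12          # MN observed in the exposed sample (assumed)
        background  = 8.0         # expected background MN count (assumed known)

        net = np.linspace(1e-3, 30, 2000)              # grid for the net rate
        prior = 1.0 / np.sqrt(net)                     # Jeffreys prior for a Poisson mean
        likelihood = poisson.pmf(gross_count, background + net)
        posterior = prior * likelihood
        posterior /= np.trapz(posterior, net)          # normalise on the grid

        post_mean = np.trapz(net * posterior, net)
        cdf = np.cumsum(posterior) * (net[1] - net[0])
        lo, hi = net[np.searchsorted(cdf, 0.025)], net[np.searchsorted(cdf, 0.975)]
        print(f"posterior mean net rate = {post_mean:.2f}, 95% interval = ({lo:.2f}, {hi:.2f})")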

  9. How Benchmarking and Higher Education Came Together

    Science.gov (United States)

    Levy, Gary D.; Ronco, Sharron L.

    2012-01-01

    This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…

  10. Benchmarking: applications to transfusion medicine.

    Science.gov (United States)

    Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M

    2012-10-01

    Benchmarking is a structured, continuous, collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking, focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institution-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Use of benchmark dose-volume histograms for selection of the optimal technique between three-dimensional conformal radiation therapy and intensity-modulated radiation therapy in prostate cancer

    International Nuclear Information System (INIS)

    Luo Chunhui; Yang, Claus Chunli; Narayan, Samir; Stern, Robin L.; Perks, Julian; Goldberg, Zelanna; Ryu, Janice; Purdy, James A.; Vijayakumar, Srinivasan

    2006-01-01

    Purpose: The aim of this study was to develop and validate our own benchmark dose-volume histograms (DVHs) of bladder and rectum for both conventional three-dimensional conformal radiation therapy (3D-CRT) and intensity-modulated radiation therapy (IMRT), and to evaluate quantitatively the benefits of using IMRT vs. 3D-CRT in treating localized prostate cancer. Methods and Materials: During the implementation of IMRT for prostate cancer, our policy was to plan each patient with both 3D-CRT and IMRT. This study included 31 patients with T1b to T2c localized prostate cancer, for whom we completed double-planning using both 3D-CRT and IMRT techniques. The target volumes included prostate, either with or without proximal seminal vesicles. Bladder and rectum DVH data were summarized to obtain an average DVH for each technique and then compared using two-tailed paired t test analysis. Results: For 3D-CRT our bladder doses were as follows: mean 28.8 Gy, v60 16.4%, v70 10.9%; rectal doses were: mean 39.3 Gy, v60 21.8%, v70 13.6%. IMRT plans resulted in similar mean dose values: bladder 26.4 Gy, rectum 34.9 Gy, but lower values of v70 for the bladder (7.8%) and rectum (9.3%). These benchmark DVHs have resulted in a critical evaluation of our 3D-CRT techniques over time. Conclusion: Our institution has developed benchmark DVHs for bladder and rectum based on our clinical experience with 3D-CRT and IMRT. We use these standards as well as differences in individual cases to make decisions on whether patients may benefit from IMRT treatment rather than 3D-CRT
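
    The bladder and rectum metrics quoted above (mean dose, v60, v70) are simple functionals of the cumulative DVH; the sketch below computes them from a per-voxel dose array. The voxel doses are randomly generated stand-ins, not clinical data.

        # DVH summary metrics (mean dose and the volume fractions receiving
        # at least 60 and 70 Gy) computed from per-voxel doses.  Toy data only.
        import numpy as np

        def dvh_metrics(dose_gy):
            dose_gy = np.asarray(dose_gy, dtype=float)
            return {
                "mean_Gy": dose_gy.mean(),
                "V60_%": 100.0 * np.mean(dose_gy >= 60.0),
                "V70_%": 100.0 * np.mean(dose_gy >= 70.0),
            }

        rng = np.random.default_rng(1)
        bladder_dose = rng.gamma(shape=2.0, scale=15.0, size=50000)   # toy voxel doses
        print(dvh_metrics(bladder_dose))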

  12. A review of occupational dose assessment uncertainties and approaches

    International Nuclear Information System (INIS)

    Anderson, R. W.

    2004-01-01

    The Radiological Protection Practitioner (RPP) will spend a considerable proportion of his time predicting or assessing retrospective radiation exposures to occupational personnel for different purposes. The assessments can be for a variety of purposes, such as to predict doses for occupational dose control, or project design purposes or to make retrospective estimates for the dose record, or account for dosemeters which have been lost or damaged. There are other less frequent occasions when dose assessment will be required such as to support legal cases and compensation claims and to provide the detailed dose information for epidemiological studies. It is important that the level of detail, justification and supporting evidence in the dose assessment is suitable for the requirements. So for instance, day to day operational dose assessments often rely mainly on the knowledge of the RPP in discussion with operators whilst at the other end of the spectrum a historical dose assessment for a legal case will require substantial research and supporting evidence for the estimate to withstand forensic challenge. The robustness of the assessment will depend on many factors including a knowledge of the work activities, the radiation dose uptake and field characteristics; all of which are affected by factors such as the time elapsed, the memory of operators and the dosemeters employed. This paper reviews the various options and uncertainties in dose assessments ranging from use of personal dosimetry results to the development of upper bound assessments. The level of assessment, the extent of research and the evidence adduced should then be appropriate to the end use of the estimate. (Author)

  13. Evaluation of various approaches for assessing dose indicators and patient organ doses resulting from radiotherapy cone-beam CT

    International Nuclear Information System (INIS)

    Rampado, Osvaldo; Giglioli, Francesca Romana; Rossetti, Veronica; Ropolo, Roberto; Fiandra, Christian; Ragona, Riccardo

    2016-01-01

    Purpose: The aim of this study was to evaluate various approaches for assessing patient organ doses resulting from radiotherapy cone-beam CT (CBCT), by the use of thermoluminescent dosimeter (TLD) measurements in anthropomorphic phantoms, a Monte Carlo based dose calculation software, and different dose indicators as presently defined. Methods: Dose evaluations were performed on a CBCT Elekta XVI (Elekta, Crawley, UK) for different protocols and anatomical regions. The first part of the study focuses on using PCXMC software (PCXMC 2.0, STUK, Helsinki, Finland) for calculating organ doses, adapting the input parameters to simulate the exposure geometry and beam dose distribution in an appropriate way. The calculated doses were compared to readouts of TLDs placed in an anthropomorphic Rando phantom. After this validation, the software was used for analyzing organ dose variability associated with patients’ differences in size and gender. At the same time, various dose indicators were evaluated: kerma area product (KAP), cumulative air-kerma at the isocenter (K_air), cone-beam dose index, and central cumulative dose. The latter was evaluated in a single phantom and in a stack of three adjacent computed tomography dose index phantoms. Based on the different dose indicators, a set of coefficients was calculated to estimate organ doses for a range of patient morphologies, using their equivalent diameters. Results: Maximum organ doses were about 1 mGy for head and neck and 25 mGy for chest and pelvis protocols. The differences between PCXMC and TLD doses were generally below 10% for organs within the field of view and approximately 15% for organs at the boundaries of the radiation beam. When considering patient size and gender variability, differences in organ doses up to 40% were observed, especially in the pelvic region; for the organs in the thorax, the maximum differences ranged between 20% and 30%. Phantom dose indexes provided better correlation with organ doses

  14. Methods to stimulate national and sub-national benchmarking through international health system performance comparisons: a Canadian approach.

    Science.gov (United States)

    Veillard, Jeremy; Moses McKeag, Alexandra; Tipper, Brenda; Krylova, Olga; Reason, Ben

    2013-09-01

    This paper presents, discusses and evaluates methods used by the Canadian Institute for Health Information to present health system performance international comparisons in ways that facilitate their understanding by the public and health system policy-makers and can stimulate performance benchmarking. We used statistical techniques to normalize the results and present them on a standardized scale facilitating understanding of results. We compared results to the OECD average, and to benchmarks. We also applied various data quality rules to ensure the validity of results. In order to evaluate the impact of the public release of these results, we used quantitative and qualitative methods and documented other types of impact. We were able to present results for performance indicators and dimensions at national and sub-national levels; develop performance profiles for each Canadian province; and show pan-Canadian performance patterns for specific performance indicators. The results attracted significant media attention at national level and reactions from various stakeholders. Other impacts such as requests for additional analysis and improvement in data timeliness were observed. The methods used seemed attractive to various audiences in the Canadian context and achieved the objectives originally defined. These methods could be refined and applied in different contexts. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  15. A Global Vision over Benchmarking Process: Benchmarking Based Enterprises

    OpenAIRE

    Sitnikov, Catalina; Giurca Vasilescu, Laura

    2008-01-01

    Benchmarking uses the knowledge and the experience of others to improve the enterprise. Starting from an analysis of performance, and an outline of the strengths and weaknesses of the enterprise, it should be assessed what must be done in order to improve its activity. Using benchmarking techniques, an enterprise looks at how processes in the value chain are performed. The approach based on the vision “from the whole towards the parts” (a fragmented image of the enterprise’s value chain) redu...

  16. Approach to 3D dose verification by utilizing autoactivation

    Energy Technology Data Exchange (ETDEWEB)

    Nakajima, Yasunori, E-mail: yasunori.nkjm@gmail.com [Tokyo Institute of Technology, Yokohama-shi (Japan); Kohno, Toshiyuki [Tokyo Institute of Technology, Yokohama-shi (Japan); Inaniwa, Taku; Sato, Shinji; Yoshida, Eiji; Yamaya, Taiga [National Institute of Radiological Sciences, Chiba-shi (Japan); Tsuruta, Yuki [Tokyo Institute of Technology, Yokohama-shi (Japan); Sihver, Lembit [Chalmers University of Technology, Gothenburg (Sweden)

    2011-08-21

    To evaluate the deposited dose distribution in a target, we have proposed to utilize the annihilation gamma-rays emitted from the positron emitters distributed in the target irradiated with stable heavy-ion beams. Verification of the one-dimensional (1-D) dose distributions along and perpendicular to a beam axis was achieved through our previous works. The purpose of this work is to verify 3-D dose distributions. As a first attempt, uniform PMMA targets of simple rectangular parallelepiped shape were irradiated, and the annihilation gamma-rays were detected with a PET scanner. By comparing the detected annihilation gamma-ray distributions with the calculated ones, the dose distributions were estimated. As a result, the estimated positions of the distal edges of the dose distributions were in agreement with the measured ones within 1 mm. However, the estimated positions of the proximal edges differed from the measured ones by 5-9 mm depending on the thickness of the irradiation field.

  17. Principles of protection: a formal approach for evaluating dose distributions

    International Nuclear Information System (INIS)

    Wikman-Svahn, Per; Peterson, Martin; Hansson, Sven Ove

    2006-01-01

    One of the central issues in radiation protection consists in determining what weight should be given to individual doses in relation to collective or aggregated doses. A mathematical framework is introduced in which such assessments can be made precisely in terms of comparisons between alternative distributions of individual doses. In addition to evaluation principles that are well known from radiation protection, a series of principles that are derived from parallel discussions in moral philosophy and welfare economics is investigated. A battery of formal properties is then used to investigate the evaluative principles. The results indicate that one of the new principles, bilinear prioritarianism, may be preferable to current practices, since it satisfies efficiency-related properties better without sacrificing other desirable properties

  18. Benchmarking in Foodservice Operations

    National Research Council Canada - National Science Library

    Johnson, Bonnie

    1998-01-01

    The objective of this study was to identify usage of foodservice performance measures, important activities in foodservice benchmarking, and benchmarking attitudes, beliefs, and practices by foodservice directors...

  19. The Hanford Environmental Dose Reconstruction (HEDR) Project: Technical approach

    International Nuclear Information System (INIS)

    Napier, B.A.; Freshley, M.D.; Gilbert, R.O.; Haerer, H.A.; Morgan, L.G.; Rhoads, R.E.; Woodruff, R.K.

    1990-01-01

    Historical measurements and current assessment techniques are being combined to estimate potential radiation doses to people from radioactive releases to the air, the Columbia River, soils, and ground water at the Hanford Site since 1944. Environmental contamination from these releases has been monitored, at varying levels of detail, for 45 yr. Phase I of the Hanford Environmental Dose Reconstruction Project will estimate the magnitude of potential doses, their areal extent, and their associated uncertainties. The Phase I study area comprises 10 counties in eastern Washington and northern Oregon, within a 100-mi radius of the site, including a stretch of the Columbia River that was most significantly affected. These counties contain a range of projected and measured contaminant levels, environmental exposure pathways, and population groups. Phase I dose estimates are being developed for the periods 1944 through 1947 for air pathways and 1964 through 1966 for river pathways. Important radionuclide/pathway combinations include fission products, such as 131 I, in milk for early atmospheric releases and activation products, such as 32 P and 65 Zn, in fish for releases to the river. Potential doses range over several orders of magnitude within the study area. We will expand the time periods and study area in three successive phases, as warranted by results of Phase I

  20. Approaches to reducing photon dose calculation errors near metal implants

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Jessie Y.; Followill, David S.; Howell, Rebecca M.; Mirkovic, Dragan; Kry, Stephen F., E-mail: sfkry@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States); Liu, Xinming [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States); Stingo, Francesco C. [Department of Biostatistics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States)

    2016-09-15

    Purpose: Dose calculation errors near metal implants are caused by limitations of the dose calculation algorithm in modeling tissue/metal interface effects as well as density assignment errors caused by imaging artifacts. The purpose of this study was to investigate two strategies for reducing dose calculation errors near metal implants: implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) dose calculation method and use of metal artifact reduction methods for computed tomography (CT) imaging. Methods: Both error reduction strategies were investigated using a simple geometric slab phantom with a rectangular metal insert (composed of titanium or Cerrobend), as well as two anthropomorphic phantoms (one with spinal hardware and one with dental fillings), designed to mimic relevant clinical scenarios. To assess the dosimetric impact of metal kernels, the authors implemented titanium and silver kernels in a commercial collapsed cone C/S algorithm. To assess the impact of CT metal artifact reduction methods, the authors performed dose calculations using baseline imaging techniques (uncorrected 120 kVp imaging) and three commercial metal artifact reduction methods: Philips Healthcare’s O-MAR, GE Healthcare’s monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI with metal artifact reduction software (MARS) applied. For the simple geometric phantom, radiochromic film was used to measure dose upstream and downstream of metal inserts. For the anthropomorphic phantoms, ion chambers and radiochromic film were used to quantify the benefit of the error reduction strategies. Results: Metal kernels did not universally improve accuracy but rather resulted in better accuracy upstream of metal implants and decreased accuracy directly downstream. For the clinical cases (spinal hardware and dental fillings), metal kernels had very little impact on the dose calculation accuracy (<1.0%). Of the commercial CT artifact

  1. Approaches to reducing photon dose calculation errors near metal implants

    International Nuclear Information System (INIS)

    Huang, Jessie Y.; Followill, David S.; Howell, Rebecca M.; Mirkovic, Dragan; Kry, Stephen F.; Liu, Xinming; Stingo, Francesco C.

    2016-01-01

    Purpose: Dose calculation errors near metal implants are caused by limitations of the dose calculation algorithm in modeling tissue/metal interface effects as well as density assignment errors caused by imaging artifacts. The purpose of this study was to investigate two strategies for reducing dose calculation errors near metal implants: implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) dose calculation method and use of metal artifact reduction methods for computed tomography (CT) imaging. Methods: Both error reduction strategies were investigated using a simple geometric slab phantom with a rectangular metal insert (composed of titanium or Cerrobend), as well as two anthropomorphic phantoms (one with spinal hardware and one with dental fillings), designed to mimic relevant clinical scenarios. To assess the dosimetric impact of metal kernels, the authors implemented titanium and silver kernels in a commercial collapsed cone C/S algorithm. To assess the impact of CT metal artifact reduction methods, the authors performed dose calculations using baseline imaging techniques (uncorrected 120 kVp imaging) and three commercial metal artifact reduction methods: Philips Healthcare’s O-MAR, GE Healthcare’s monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI with metal artifact reduction software (MARS) applied. For the simple geometric phantom, radiochromic film was used to measure dose upstream and downstream of metal inserts. For the anthropomorphic phantoms, ion chambers and radiochromic film were used to quantify the benefit of the error reduction strategies. Results: Metal kernels did not universally improve accuracy but rather resulted in better accuracy upstream of metal implants and decreased accuracy directly downstream. For the clinical cases (spinal hardware and dental fillings), metal kernels had very little impact on the dose calculation accuracy (<1.0%). Of the commercial CT artifact

  2. Benchmarking and Performance Measurement.

    Science.gov (United States)

    Town, J. Stephen

    This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…

  3. Benchmarking in the Netherlands

    International Nuclear Information System (INIS)

    1999-01-01

    In two articles an overview is given of the activities in the Dutch industry and energy sector with respect to benchmarking. In benchmarking, the operational processes of competing businesses are compared in order to improve one's own performance. Benchmark covenants for energy efficiency between the Dutch government and industrial sectors have contributed to a growth in the number of benchmark surveys in the energy-intensive industry in the Netherlands. However, some doubt the effectiveness of the benchmark studies

  4. Case-mix adjustment approach to benchmarking prevalence rates of nosocomial infection in hospitals in Cyprus and Greece.

    Science.gov (United States)

    Kritsotakis, Evangelos I; Dimitriadis, Ioannis; Roumbelaki, Maria; Vounou, Emelia; Kontou, Maria; Papakyriakou, Panikos; Koliou-Mazeri, Maria; Varthalitis, Ioannis; Vrouchos, George; Troulakis, George; Gikas, Achilleas

    2008-08-01

    To examine the effect of heterogeneous case mix for a benchmarking analysis and interhospital comparison of the prevalence rates of nosocomial infection. Cross-sectional survey. Eleven hospitals located in Cyprus and in the region of Crete in Greece. The survey included all inpatients in the medical, surgical, pediatric, and gynecology-obstetrics wards, as well as those in intensive care units. Centers for Disease Control and Prevention criteria were used to define nosocomial infection. The information collected for all patients included demographic characteristics, primary admission diagnosis, Karnofsky functional status index, Charlson comorbidity index, McCabe-Jackson severity of illness classification, use of antibiotics, and prior exposures to medical and surgical risk factors. Outcome data were also recorded for all patients. Case mix-adjusted rates were calculated by using a multivariate logistic regression model for nosocomial infection risk and an indirect standardization method. Results: The overall prevalence rate of nosocomial infection was 7.0% (95% confidence interval, 5.9%-8.3%) among 1,832 screened patients. Significant variation in nosocomial infection rates was observed across hospitals (range, 2.2%-9.6%). Logistic regression analysis indicated that the mean predicted risk of nosocomial infection across hospitals ranged from 3.7% to 10.3%, suggesting considerable variation in patient risk. Case mix-adjusted rates ranged from 2.6% to 12.4%, and the relative ranking of hospitals was affected by case-mix adjustment in 8 cases (72.8%). Nosocomial infection was significantly and independently associated with mortality (adjusted odds ratio, 3.6 [95% confidence interval, 2.1-6.1]). The first attempt to rank the risk of nosocomial infection in these regions demonstrated the importance of accounting for heterogeneous case mix before attempting interhospital comparisons.
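
    The indirect standardization step can be sketched as follows: a pooled logistic model supplies each patient's expected infection risk, and a hospital's case mix-adjusted rate is its observed-to-expected ratio multiplied by the overall rate. The covariates and simulated data below are hypothetical stand-ins for the study's risk factors.

        # Indirect standardization of hospital infection rates:
        # adjusted rate = (observed / expected) * overall rate, with expected
        # risks taken from a pooled logistic model.  Data are simulated.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n = 2000
        df = pd.DataFrame({
            "hospital": rng.integers(0, 5, n),
            "age": rng.normal(60, 15, n),
            "comorbidity": rng.poisson(1.5, n),
        })
        logit = -4 + 0.03 * df.age + 0.4 * df.comorbidity
        df["infected"] = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression(max_iter=1000).fit(df[["age", "comorbidity"]], df["infected"])
        df["expected"] = model.predict_proba(df[["age", "comorbidity"]])[:, 1]

        overall_rate = df["infected"].mean()
        adjusted = (df.groupby("hospital")["infected"].sum()
                    / df.groupby("hospital")["expected"].sum()) * overall_rate
        print(adjusted.round(3))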

  5. Benchmarking in Czech Higher Education

    Directory of Open Access Journals (Sweden)

    Plaček Michal

    2015-12-01

    Full Text Available The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Based on an analysis of the current situation and existing needs in the Czech Republic, as well as on a comparison with international experience, recommendations for public policy are made, which lie in the design of a model of a collaborative benchmarking for Czech economics and management in higher-education programs. Because the fully complex model cannot be implemented immediately – which is also confirmed by structured interviews with academics who have practical experience with benchmarking –, the final model is designed as a multi-stage model. This approach helps eliminate major barriers to the implementation of benchmarking.

  6. Application of the ELDO approach to assess cumulative eye lens doses for interventional cardiologists

    International Nuclear Information System (INIS)

    Farah, J.; Jacob, S.; Clairand, I.; Struelens, L.; Vanhavere, F.; Auvinen, A.; Koukorava, C.; Schnelzer, M.

    2015-01-01

    In preparation of a large European epidemiological study on the relation between eye lens dose and the occurrence of lens opacities, the European ELDO project focused on the development of practical methods to estimate retrospectively cumulative eye lens dose for interventional medical professionals exposed to radiation. The present paper applies one of the ELDO approaches, correlating eye lens dose to whole-body doses, to assess cumulative eye lens dose for 14 different Finnish interventional cardiologists for whom annual whole-body dose records were available for their entire working period. The estimated cumulative left and right eye lens dose ranged from 8 to 264 mSv and 6 to 225 mSv, respectively. In addition, calculations showed annual eye lens doses sometimes exceeding the new ICRP annual limit of 20 mSv. The work also highlights the large uncertainties associated with the application of such an approach proving the need for dedicated dosimetry systems in the routine monitoring of the eye lens dose. (authors)
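
    The core of such a retrospective reconstruction reduces to scaling recorded whole-body doses by an eye-lens conversion factor and summing over the monitored years; the sketch below shows that arithmetic only. The annual records and the single conversion factor are invented, and the actual ELDO coefficients depend on workplace practice and period.

        # Cumulative eye-lens dose reconstructed from annual whole-body records.
        # Records and conversion factor are assumed values for illustration.
        annual_hp10_mSv = {2005: 1.8, 2006: 2.4, 2007: 1.1, 2008: 3.0}   # assumed dose record
        eye_lens_per_hp10 = 1.5                                          # assumed conversion factor

        cumulative_mSv = sum(d * eye_lens_per_hp10 for d in annual_hp10_mSv.values())
        print(f"estimated cumulative eye-lens dose: {cumulative_mSv:.1f} mSv")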

  7. Non-parametric identification of multivariable systems : a local rational modeling approach with application to a vibration isolation benchmark

    NARCIS (Netherlands)

    Voorhoeve, R.J.; van der Maas, A.; Oomen, T.A.J.

    2018-01-01

    Frequency response function (FRF) identification is often used as a basis for control systems design and as a starting point for subsequent parametric system identification. The aim of this paper is to develop a multiple-input multiple-output (MIMO) local parametric modeling approach for FRF

  8. On the development and benchmarking of an approach to model gas transport in fractured media with immobile water storage

    Science.gov (United States)

    Harp, D. R.; Ortiz, J. P.; Pandey, S.; Karra, S.; Viswanathan, H. S.; Stauffer, P. H.; Anderson, D. N.; Bradley, C. R.

    2017-12-01

    In unsaturated fractured media, the rate of gas transport is much greater than that of liquid transport in many applications (e.g., soil vapor extraction operations, methane leaks from hydraulic fracking, shallow CO2 transport from geologic sequestration operations, and later-time radionuclide gas transport from underground nuclear explosions). However, the relatively immobile pore water can inhibit or promote gas transport for soluble constituents by providing storage. In scenarios with constant pressure gradients, the gas transport will be retarded. In scenarios with reversing pressure gradients (i.e., barometric pressure variations), pore water storage can enhance gas transport by providing a ratcheting mechanism. Recognizing the computational efficiency that can be gained using a single-phase model and the necessity of considering pore water storage, we develop a Richards'-equation solution approach that includes kinetic dissolution/volatilization of constituents. Henry's Law governs the equilibrium gaseous/aqueous phase partitioning in the approach. The approach is implemented in a development branch of the PFLOTRAN simulator. We verify the approach with analytical solutions of: (1) 1D gas diffusion, (2) 1D gas advection, (3) sinusoidal barometric pumping of a fracture, and (4) gas transport along a fracture with uniform flow and diffusive walls. We demonstrate the retardation of gas transport in cases with constant pressure gradients and the enhancement of gas transport with reversing pressure gradients. The figure presents the verification of our approach against the analytical solution of barometric pumping of a fracture from Nilson et al. (1991), where the horizontal axis is the distance into the matrix block from the fracture.
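
    The retardation effect under a constant pressure gradient can be illustrated with a back-of-the-envelope equilibrium calculation (not the PFLOTRAN implementation): a soluble gas partitioning into immobile pore water is slowed by roughly R = 1 + theta_w / (theta_g * H), with H the dimensionless gas/aqueous Henry constant. The porosity split and H below are assumed values.

        # Equilibrium retardation factor for a soluble gas in partially
        # saturated porosity; all inputs are assumed illustrative values.
        theta_gas   = 0.15      # gas-filled porosity (assumed)
        theta_water = 0.10      # water-filled (immobile) porosity (assumed)
        H_dimless   = 0.03      # Henry constant, gas over aqueous concentration (assumed)

        R = 1.0 + theta_water / (theta_gas * H_dimless)
        print(f"retardation factor under a constant pressure gradient: R = {R:.1f}")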

  9. Aquatic Life Benchmarks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...

  10. Using the latent class approach to cluster firms in benchmarking: An application to the US electricity transmission industry

    Directory of Open Access Journals (Sweden)

    Manuel Llorca

    2014-03-01

    Full Text Available In this paper we advocate using the latent class model (LCM) approach to control for technological differences in traditional efficiency analysis of regulated electricity networks. Our proposal relies on the fact that latent class models are designed to cluster firms by uncovering differences in technology parameters. Moreover, it can be viewed as a supervised method for clustering data that takes into account the same (production or cost) relationship that is analysed later, often using nonparametric frontier techniques. The simulation exercises show that the proposed approach outperforms other sample selection procedures. The proposed methodology is illustrated with an application to a sample of US electricity transmission firms for the period 2001–2009.

  11. Radiation Detection Computational Benchmark Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operation performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations were assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for

  12. Benchmarking in digital circuit design automation

    NARCIS (Netherlands)

    Jozwiak, L.; Gawlowski, D.M.; Slusarczyk, A.S.

    2008-01-01

    This paper focuses on benchmarking, which is the main experimental approach to the design method and EDA-tool analysis, characterization and evaluation. We discuss the importance and difficulties of benchmarking, as well as the recent research effort related to it. To resolve several serious

  13. Dynamic behaviour of a planar micro-beam loaded by a fluid-gap: Analytical and numerical approach in a high frequency range, benchmark solutions

    Science.gov (United States)

    Novak, A.; Honzik, P.; Bruneau, M.

    2017-08-01

    Miniaturized vibrating MEMS devices, active (receivers or emitters) or passive devices, and their use for either new applications (hearing, meta-materials, consumer devices,…) or metrological purposes under non-standard conditions, are involved today in several acoustic domains. More in-depth characterisations than the classical ones available until now are needed. In this context, the paper presents analytical and numerical approaches for describing the behaviour of three kinds of planar micro-beams of rectangular shape (suspended rigid or clamped elastic planar beam) loaded by a backing cavity or a fluid-gap, surrounded by very thin slits, and excited by an incident acoustic field. The analytical approach accounts for the coupling between the vibrating structure and the acoustic field in the backing cavity, the thermal and viscous diffusion processes in the boundary layers in the slits and the cavity, the modal behaviour of the vibrating structure, and the non-uniformity of the acoustic field in the backing cavity, which is modelled using an integral formulation with a suitable Green's function. Benchmark solutions are proposed in terms of beam motion (from which the sensitivity, input impedance, and pressure transfer function can be calculated). A numerical implementation (FEM) is handled against which the analytical results are tested.

  14. Benchmarking for Higher Education.

    Science.gov (United States)

    Jackson, Norman, Ed.; Lund, Helen, Ed.

    The chapters in this collection explore the concept of benchmarking as it is being used and developed in higher education (HE). Case studies and reviews show how universities in the United Kingdom are using benchmarking to aid in self-regulation and self-improvement. The chapters are: (1) "Introduction to Benchmarking" (Norman Jackson…

  15. New approach for food allergy management using low-dose oral food challenges and low-dose oral immunotherapies.

    Science.gov (United States)

    Yanagida, Noriyuki; Okada, Yu; Sato, Sakura; Ebisawa, Motohiro

    2016-04-01

    A number of studies have suggested that a large subset of children (approximately 70%) who react to unheated milk or egg can tolerate extensively heated forms of these foods. A diet that includes baked milk or egg is well tolerated and appears to accelerate the development of regular milk or egg tolerance when compared with strict avoidance. However, the indications for an oral food challenge (OFC) using baked products are limited for patients with high specific IgE values or large skin prick test diameters. Oral immunotherapies (OITs) are becoming increasingly popular for the management of food allergies. However, the reported efficacy of OIT is not satisfactory, given the high frequency of symptoms and requirement for long-term therapy. With food allergies, removing the need to eliminate a food that could be consumed in low doses could significantly improve quality of life. This review discusses the importance of an OFC and OIT that use low doses of causative foods as the target volumes. Utilizing an OFC or OIT with a low dose as the target volume could be a novel approach for accelerating the tolerance to causative foods. Copyright © 2015 Japanese Society of Allergology. Production and hosting by Elsevier B.V. All rights reserved.

  16. Optimal medication dosing from suboptimal clinical examples: a deep reinforcement learning approach.

    Science.gov (United States)

    Nemati, Shamim; Ghassemi, Mohammad M; Clifford, Gari D

    2016-08-01

    Misdosing medications with sensitive therapeutic windows, such as heparin, can place patients at unnecessary risk, increase length of hospital stay, and lead to wasted hospital resources. In this work, we present a clinician-in-the-loop sequential decision making framework, which provides an individualized dosing policy adapted to each patient's evolving clinical phenotype. We employed retrospective data from the publicly available MIMIC II intensive care unit database, and developed a deep reinforcement learning algorithm that learns an optimal heparin dosing policy from sample dosing trials and their associated outcomes in large electronic medical records. Using separate training and testing datasets, our model was observed to be effective in proposing heparin doses that resulted in better expected outcomes than the clinical guidelines. Our results demonstrate that a sequential modeling approach, learned from retrospective data, could potentially be used at the bedside to derive individualized patient dosing policies.

  17. Erosion of a confined stratified layer by a vertical jet – Detailed assessment of a CFD approach against the OECD/NEA PSI benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Kelm, S., E-mail: s.kelm@fz-juelich.de [Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Kapulla, R., E-mail: ralf.kapulla@psi.ch [Paul Scherrer Institute, 5232 Villigen PSI (Switzerland); Allelein, H.-J., E-mail: allelein@lrst.rwth-aachen.de [Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); RWTH Aachen University, 52080 Aachen (Germany)

    2017-02-15

    Highlights: • Systematic validation of a U-RANS approach, capable of being applied at containment scale. • Validation against measured and derived point-wise and field data. • Validation by means of transported quantities (concentration) but also the underlying flow field and turbulent kinetic energy. • U-RANS approach yields overall consistent and plausible results. • But unexpected difference between SST and k–ε identified for free-stream flow. - Abstract: Recently, a blind CFD benchmark exercise was conducted by the OECD/NEA (2013–2014) based on an experiment in the PANDA facility at the Paul Scherrer Institute (PSI) in Switzerland, investigating the turbulent erosion of a stratified helium-rich layer in the upper region of the test vessel by means of a vertical air-helium jet impinging from below. In addition to the ‘classical’ pointwise measurements available for similar experiments conducted in the past, significant additional effort was spent on the experimental characterization of the underlying flow field and turbulent quantities by means of particle image velocimetry (PIV) for the benchmark. This data is well suited for a detailed assessment of the driving jet flow and its interaction with the stratified layer. Both are essential in order to avoid the mutual cancellation of different errors, which is possible if validation is performed in a global manner. Different impacts on the simulation results, in particular on the jet profile and on the mixing progress, are discussed in this paper. A systematic validation is carried out based on measured and derived quantities. It is identified that e.g. the mesh resolution in the jet and mixing zone has only a minor impact, while small changes in the turbulence modeling strategy or the chosen model constants, like Sc_t, significantly affect the simulation results. Finally, the chosen unsteady RANS model represents the mixing process consistently in the transient progression and instantaneous flow variables, while an unexpected

  18. Benchmarking multimedia performance

    Science.gov (United States)

    Zandi, Ahmad; Sudharsanan, Subramania I.

    1998-03-01

    With the introduction of faster processors and special instruction sets tailored to multimedia, a number of exciting applications are now feasible on the desktop. Among these is DVD playback, consisting, among other things, of MPEG-2 video and Dolby Digital audio or MPEG-2 audio. Other multimedia applications such as video conferencing and speech recognition are also becoming popular on computer systems. In view of this tremendous interest in multimedia, a group of major computer companies has formed the Multimedia Benchmarks Committee as part of the Standard Performance Evaluation Corp. to address the performance issues of multimedia applications. The approach is multi-tiered, with three tiers of fidelity from minimal to fully compliant. In each case the fidelity of the bitstream reconstruction as well as the quality of the video or audio output are measured and the system is classified accordingly. At the next step the performance of the system is measured. In many multimedia applications, such as DVD playback, the application needs to be run at a specific rate. In this case the measurement of the excess processing power makes all the difference. All these make a system-level, application-based multimedia benchmark very challenging. Several ideas and methodologies for each aspect of the problem will be presented and analyzed.

  19. An approach to routine individual internal dose monitoring at the object 'Shelter' personnel considering uncertainties

    International Nuclear Information System (INIS)

    Mel'nichuk, D.V.; Bondarenko, O.O.; Medvedjev, S.Yu.

    2002-01-01

    An approach to the organisation of routine individual internal dose monitoring for personnel of the Object 'Shelter' is presented in this work, which takes individualised uncertainties into account. In this respect, two methods of effective dose assessment based on bioassay are considered: (1) a traditional indirect method, in which the results of workplace monitoring are not taken into account, and (2) a combined method, in which both the results of bioassay measurements and workplace monitoring are considered

  20. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  1. Radiation dose optimization research: Exposure technique approaches in CR imaging – A literature review

    International Nuclear Information System (INIS)

    Seeram, Euclid; Davidson, Rob; Bushong, Stewart; Swan, Hans

    2013-01-01

    The purpose of this paper is to review the literature on exposure technique approaches in Computed Radiography (CR) imaging as a means of radiation dose optimization in CR imaging. Specifically, the review assessed three approaches: optimization of kVp, optimization of mAs, and optimization of the Exposure Indicator (EI) in practice. Only papers dating back to 2005 were described in this review. The major themes, patterns, and common findings from the literature reviewed show that important features of radiation dose management strategies for digital radiography include identification of the EI as a dose control mechanism and as a “surrogate for dose management”. In addition, the use of the EI has been viewed as an opportunity for dose optimization. Furthermore, optimization research has focussed mainly on optimizing the kVp in CR imaging as a means of implementing the ALARA philosophy, and studies have concentrated mainly on chest imaging using different CR systems such as those commercially available from Fuji, Agfa, Kodak, and Konica-Minolta. These studies have produced “conflicting results”. In addition, a common pattern was the use of automatic exposure control (AEC) and the measurement of constant effective dose, and the use of a dose-area product (DAP) meter

  2. Update on the Code Intercomparison and Benchmark for Muon Fluence and Absorbed Dose Induced by an 18 GeV Electron Beam After Massive Iron Shielding

    Energy Technology Data Exchange (ETDEWEB)

    Fasso, A. [SLAC; Ferrari, A. [CERN; Ferrari, A. [HZDR, Dresden; Mokhov, N. V. [Fermilab; Mueller, S. E. [HZDR, Dresden; Nelson, W. R. [SLAC; Roesler, S. [CERN; Sanami, t.; Striganov, S. I. [Fermilab; Versaci, R. [Unlisted, CZ

    2016-12-01

    In 1974, Nelson, Kase and Svensson published an experimental investigation on muon shielding around SLAC high-energy electron accelerators [1]. They measured muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beam dump and attenuated in a thick steel shielding. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes, and with the SLAC data.

  3. Why we need new approaches to low-dose risk modeling

    International Nuclear Information System (INIS)

    Alvarez, J.L.; Seiler, F.A.

    1996-01-01

    The linear no-threshold model for radiation effects was introduced as a conservative model for the design of radiation protection programs. The model has persisted not only as the basis for such programs, but has come to be treated as a dogma and is often confused with scientific fact. In this examination a number of serious problems with the linear no-threshold model of radiation carcinogenesis were demonstrated, many of them invalidating the hypothesis. It was shown that the relative risk formalism did not approach 1 as the dose approached zero. When mortality ratios were used instead, the data in the region below 0.3 Sv were systematically below the predictions of the linear model. It was also shown that the data above 0.3 Sv were of little use in formulating a model at low doses. In addition, these data are valid only for doses accumulated at high dose rates, and there is no scientific justification for using the model in low-dose, low-dose-rate extrapolations for purposes of radiation protection. Further examination of model fits to the Japanese survivor data was attempted. Several such models were fit to the data, including an unconstrained linear, a linear-square root, and a Weibull model, all of which fit the data better than the relative risk, linear no-threshold model. These fits were used to demonstrate that the linear model systematically overestimates the risk at low doses in the Japanese survivor data set. It is recommended here that an unbiased re-analysis of the data be undertaken and the results used to construct a new model, based on all pertinent data. This model could then form the basis for managing radiation risks in the appropriate regions of dose and dose rate
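
    The kind of model comparison described above can be sketched by fitting an unconstrained linear model and a linear-square-root model to the same excess-risk data and comparing residual sums of squares; the data points below are invented for illustration, not the Japanese survivor data.

        # Fit two candidate dose-response shapes to the same (illustrative)
        # excess-risk data and compare residual sums of squares.
        import numpy as np
        from scipy.optimize import curve_fit

        dose_sv = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 1.0, 2.0])
        excess  = np.array([-0.002, 0.000, 0.004, 0.010, 0.022, 0.050, 0.105])

        def linear(d, a, b):       return a + b * d
        def lin_sqrt(d, a, b, c):  return a + b * d + c * np.sqrt(d)

        for name, f, p0 in [("linear", linear, [0.0, 0.05]),
                            ("linear-sqrt", lin_sqrt, [0.0, 0.05, 0.0])]:
            popt, _ = curve_fit(f, dose_sv, excess, p0=p0)
            rss = np.sum((excess - f(dose_sv, *popt)) ** 2)
            print(f"{name:12s}  params={np.round(popt, 4)}  RSS={rss:.2e}")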

  4. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models

  5. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  6. Transcriptional profiling of the dose response: a more powerful approach for characterizing drug activities.

    Directory of Open Access Journals (Sweden)

    Rui-Ru Ji

    2009-09-01

    Full Text Available The dose response curve is the gold standard for measuring the effect of a drug treatment, but is rarely used in genomic scale transcriptional profiling due to perceived obstacles of cost and analysis. One barrier to examining transcriptional dose responses is that existing methods for microarray data analysis can identify patterns, but provide no quantitative pharmacological information. We developed analytical methods that identify transcripts responsive to dose, calculate classical pharmacological parameters such as the EC50, and enable an in-depth analysis of coordinated dose-dependent treatment effects. The approach was applied to a transcriptional profiling study that evaluated four kinase inhibitors (imatinib, nilotinib, dasatinib and PD0325901 across a six-logarithm dose range, using 12 arrays per compound. The transcript responses proved a powerful means to characterize and compare the compounds: the distribution of EC50 values for the transcriptome was linked to specific targets, dose-dependent effects on cellular processes were identified using automated pathway analysis, and a connection was seen between EC50s in standard cellular assays and transcriptional EC50s. Our approach greatly enriches the information that can be obtained from standard transcriptional profiling technology. Moreover, these methods are automated, robust to non-optimized assays, and could be applied to other sources of quantitative data.
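
    Extracting a classical pharmacological parameter from a transcript's dose series amounts to fitting a sigmoidal curve; the sketch below fits a four-parameter Hill function and reports the EC50. The concentrations, expression values and parameter bounds are illustrative assumptions, not data or methods from the cited study.

        # Fit a four-parameter Hill curve to one transcript's dose-response
        # readings and report the EC50.  All values are illustrative.
        import numpy as np
        from scipy.optimize import curve_fit

        conc = np.logspace(-3, 3, 12)                         # nominal doses (arbitrary units)
        expr = 1.0 + 4.0 * conc**1.2 / (conc**1.2 + 5.0**1.2)  # synthetic "fold change"
        expr += np.random.default_rng(3).normal(0, 0.1, conc.size)

        def hill(c, bottom, top, ec50, n):
            return bottom + (top - bottom) * c**n / (c**n + ec50**n)

        popt, _ = curve_fit(hill, conc, expr, p0=[1.0, 5.0, 1.0, 1.0],
                            bounds=(0, np.inf), maxfev=20000)
        print(f"fitted EC50 = {popt[2]:.2f}, Hill slope = {popt[3]:.2f}")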

  7. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics

    Science.gov (United States)

    Claudio-Campos, Karla; Rivera-Miranda, Giselle; Bermúdez-Bosch, Luis; Renta, Jessicca Y.; Cadilla, Carmen L.; Cruz, Iadelisse; Feliu, Juan F.; Vergara, Cunegundo; Ruaño, Gualberto

    2016-01-01

    Aim This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. Patients & Methods A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. Results The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in ideal dose as compared with only 29% when using the clinical non-genetic algorithm (p < 0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Conclusions Results supported our rationale to incorporate individuals' genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. Trial Registration ClinicalTrials.gov NCT01318057 PMID:26745506

  8. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics.

    Directory of Open Access Journals (Sweden)

    Jorge Duconge

    Full Text Available This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in ideal dose as compared with only 29% when using the clinical non-genetic algorithm (p < 0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Results supported our rationale to incorporate individuals' genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. ClinicalTrials.gov NCT01318057.
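
    The dosing algorithm described above is, at its core, a multiple linear regression of effective dose on genotype, admixture and clinical covariates. The sketch below illustrates that structure only; the column names (age, weight_kg, amerindian_ancestry, cyp2c9_variants, vkorc1_variants, effective_dose_mg_day) are hypothetical stand-ins, not the study's actual variables, and the validation metrics simply mirror those reported (R2, MAE, mean bias).

```python
# Minimal sketch of an admixture-adjusted dosing regression, assuming a pandas
# DataFrame with hypothetical columns (not the study's actual variable set).
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_absolute_error

predictors = ["age", "weight_kg", "amerindian_ancestry",
              "cyp2c9_variants", "vkorc1_variants"]

def fit_dosing_model(derivation: pd.DataFrame) -> LinearRegression:
    """Fit the regression on the derivation cohort."""
    model = LinearRegression()
    model.fit(derivation[predictors], derivation["effective_dose_mg_day"])
    return model

def external_validation(model: LinearRegression, cohort: pd.DataFrame) -> dict:
    """Evaluate on an independent cohort with the metrics used in the abstract."""
    predicted = model.predict(cohort[predictors])
    observed = cohort["effective_dose_mg_day"].to_numpy(float)
    return {"R2": r2_score(observed, predicted),
            "MAE_mg_day": mean_absolute_error(observed, predicted),
            "mean_bias_pct": 100.0 * np.mean((predicted - observed) / observed)}

# Usage (with hypothetical derivation/validation DataFrames):
# model = fit_dosing_model(derivation_df)
# print(external_validation(model, validation_df))
```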

  9. Dose assessment and approach to the safety for the public in the emergency. Proceedings

    International Nuclear Information System (INIS)

    Nakajima, Toshiyuki

    1994-03-01

    This issue is a collection of the papers presented at the 21st NIRS seminar on Dose Assessment and Approach to the Safety for the Public in the Emergency. Sixteen of the presented papers are indexed individually. (J.P.N.)

  10. Implementation of the systems approach to improve a pharmacist-managed vancomycin dosing service.

    Science.gov (United States)

    Gagnon, David J; Roberts, Russel; Sylvia, Lynne

    2014-12-01

    Quality improvements achieved by applying the systems approach to assess the clinical effectiveness, operational efficiency, and financial feasibility of a pharmacist-managed vancomycin dosing service are described. Faced with increased patient volumes and resource demands, the pharmacy department at Tufts Medical Center conducted an evaluation of its adult inpatient vancomycin dosing service using the systems approach, which emphasizes multidisciplinary assessment of system inputs, processes, and outcomes and consensus-building methods to identify needed changes and recommended action steps. A multidisciplinary committee composed of representatives of the medical center's pharmacy, internal medicine, infectious diseases, nursing, phlebotomy, and clinical laboratory services was assembled; in a series of three moderated monthly sessions, committee members deliberated and ultimately reached consensus on a list of action items. Relative to a concurrent intradepartmental assessment of the vancomycin dosing service based solely on pharmacist feedback, the systems approach identified a greater number and wider array of needed improvements in key program areas. Quality improvements implemented as a direct result of the systems-based analysis included a policy change authorizing pharmacists to order serum vancomycin determinations without physician cosignature and inclusion of a vancomycin dosing algorithm in the institutional antibiotic dosing guide. Future changes based on deliverable action items will result in a structured process to help direct program resources toward the patients most in need of pharmacist-managed vancomycin dosing services. The systems approach allowed for a comprehensive multidisciplinary evaluation of the service, as indicated by the identification of process improvements not identified by the department of pharmacy alone. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  11. MCNP neutron benchmarks

    International Nuclear Information System (INIS)

    Hendricks, J.S.; Whalen, D.J.; Cardon, D.A.; Uhle, J.L.

    1991-01-01

    Over 50 neutron benchmark calculations have recently been completed as part of an ongoing program to validate the MCNP Monte Carlo radiation transport code. The new and significant aspects of this work are as follows: These calculations are the first attempt at a validation program for MCNP and the first official benchmarking of version 4 of the code. We believe the chosen set of benchmarks is a comprehensive set that may be useful for benchmarking other radiation transport codes and data libraries. These calculations provide insight into how well neutron transport calculations can be expected to model a wide variety of problems

  12. SP2Bench: A SPARQL Performance Benchmark

    Science.gov (United States)

    Schmidt, Michael; Hornung, Thomas; Meier, Michael; Pinkel, Christoph; Lausen, Georg

    A meaningful analysis and comparison of both existing storage schemes for RDF data and evaluation approaches for SPARQL queries necessitates a comprehensive and universal benchmark platform. We present SP2Bench, a publicly available, language-specific performance benchmark for the SPARQL query language. SP2Bench is settled in the DBLP scenario and comprises a data generator for creating arbitrarily large DBLP-like documents and a set of carefully designed benchmark queries. The generated documents mirror vital key characteristics and social-world distributions encountered in the original DBLP data set, while the queries implement meaningful requests on top of this data, covering a variety of SPARQL operator constellations and RDF access patterns. In this chapter, we discuss requirements and desiderata for SPARQL benchmarks and present the SP2Bench framework, including its data generator, benchmark queries and performance metrics.

  13. Benchmarking monthly homogenization algorithms

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data
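
    Two of the performance metrics named above, the centered root mean square error against the true homogeneous series and the error in linear trend estimates, are straightforward to compute. The following is a minimal sketch assuming the homogenized and true series are aligned one-dimensional monthly arrays; it is not the HOME benchmark's own scoring code.

```python
# Minimal sketch of two of the metrics described above, assuming `homogenized`
# and `truth` are aligned 1-D arrays of monthly values.
import numpy as np

def centered_rmse(homogenized, truth):
    """RMSE after removing each series' own mean (anomaly-based comparison)."""
    h = homogenized - np.mean(homogenized)
    t = truth - np.mean(truth)
    return np.sqrt(np.mean((h - t) ** 2))

def trend_error(homogenized, truth, months_per_unit=120):
    """Difference in fitted linear trends, expressed per `months_per_unit`
    time steps (120 months = one decade)."""
    x = np.arange(len(truth))
    slope_h = np.polyfit(x, homogenized, 1)[0]
    slope_t = np.polyfit(x, truth, 1)[0]
    return (slope_h - slope_t) * months_per_unit
```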

  14. A simplified approach to characterizing a kilovoltage source spectrum for accurate dose computation

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, Yannick; Kouznetsov, Alexei; Tambasco, Mauro [Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy and Department of Oncology, University of Calgary and Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada)

    2012-06-15

    Purpose: To investigate and validate the clinical feasibility of using half-value layer (HVL) and peak tube potential (kVp) for characterizing a kilovoltage (kV) source spectrum for the purpose of computing kV x-ray dose accrued from imaging procedures. To use this approach to characterize a Varian® On-Board Imager® (OBI) source and perform experimental validation of a novel in-house hybrid dose computation algorithm for kV x-rays. Methods: We characterized the spectrum of an imaging kV x-ray source using the HVL and the kVp as the sole beam quality identifiers, using third-party freeware Spektr to generate the spectra. We studied the sensitivity of our dose computation algorithm to uncertainties in the beam's HVL and kVp by systematically varying these spectral parameters. To validate our approach experimentally, we characterized the spectrum of a Varian® OBI system by measuring the HVL using a Farmer-type Capintec ion chamber (0.06 cc) in air and compared dose calculations using our computationally validated in-house kV dose calculation code to measured percent depth-dose and transverse dose profiles for 80, 100, and 125 kVp open beams in a homogeneous phantom and a heterogeneous phantom comprising tissue, lung, and bone equivalent materials. Results: The sensitivity analysis of the beam quality parameters (i.e., HVL, kVp, and field size) on dose computation accuracy shows that typical measurement uncertainties in the HVL and kVp (±0.2 mm Al and ±2 kVp, respectively) source characterization parameters lead to dose computation errors of less than 2%. Furthermore, for an open beam with no added filtration, HVL variations affect dose computation accuracy by less than 1% for a 125 kVp beam when field size is varied from 5 × 5 cm² to 40 × 40 cm². The central axis depth dose calculations and experimental measurements for the 80, 100, and 125 kVp energies agreed within
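
    Since the method characterizes the spectrum from the measured HVL, a small sketch of how a first HVL can be extracted from ion-chamber readings taken with increasing added Al filtration may be helpful. This is a generic log-linear interpolation between the two thicknesses that bracket 50% transmission, with illustrative readings; it is not the authors' measurement protocol.

```python
# Minimal sketch: estimate the first HVL from chamber readings taken with
# increasing added Al filtration, interpolating ln(transmission) between the two
# thicknesses that bracket 50% transmission. Readings below are illustrative.
import numpy as np

def first_hvl(al_mm, readings):
    """al_mm: added Al thickness (mm), first entry must be 0 mm;
    readings: chamber signal in the same order (monotonically decreasing)."""
    al_mm = np.asarray(al_mm, dtype=float)
    trans = np.asarray(readings, dtype=float) / readings[0]   # normalize to 0 mm
    above = np.where(trans >= 0.5)[0][-1]                     # last point >= 50%
    below = above + 1
    slope = (np.log(trans[below]) - np.log(trans[above])) / (al_mm[below] - al_mm[above])
    return al_mm[above] + (np.log(0.5) - np.log(trans[above])) / slope

print(first_hvl([0.0, 1.0, 2.0, 3.0, 4.0],
                [100.0, 76.0, 59.0, 47.0, 38.0]))   # ~2.7 mm Al
```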

  15. The role of benchmarking for yardstick competition

    International Nuclear Information System (INIS)

    Burns, Phil; Jenkins, Cloda; Riechmann, Christoph

    2005-01-01

    With the increasing interest in yardstick regulation, there is a need to understand the most appropriate method for realigning tariffs at the outset. Benchmarking is the tool used for such realignment and is therefore a necessary first-step in the implementation of yardstick competition. A number of concerns have been raised about the application of benchmarking, making some practitioners reluctant to move towards yardstick based regimes. We assess five of the key concerns often discussed and find that, in general, these are not as great as perceived. The assessment is based on economic principles and experiences with applying benchmarking to regulated sectors, e.g. in the electricity and water industries in the UK, The Netherlands, Austria and Germany in recent years. The aim is to demonstrate that clarity on the role of benchmarking reduces the concern about its application in different regulatory regimes. We find that benchmarking can be used in regulatory settlements, although the range of possible benchmarking approaches that are appropriate will be small for any individual regulatory question. Benchmarking is feasible as total cost measures and environmental factors are better defined in practice than is commonly appreciated and collusion is unlikely to occur in environments with more than 2 or 3 firms (where shareholders have a role in monitoring and rewarding performance). Furthermore, any concern about companies under-recovering costs is a matter to be determined through the regulatory settlement and does not affect the case for using benchmarking as part of that settlement. (author)

  16. Benchmarking the energy efficiency of commercial buildings

    International Nuclear Information System (INIS)

    Chung, William; Hui, Y.V.; Lam, Y. Miu

    2006-01-01

    Benchmarking energy-efficiency is an important tool to promote the efficient use of energy in commercial buildings. Benchmarking models are mostly constructed in a simple benchmark table (percentile table) of energy use, which is normalized with floor area and temperature. This paper describes a benchmarking process for energy efficiency by means of multiple regression analysis, where the relationship between energy-use intensities (EUIs) and the explanatory factors (e.g., operating hours) is developed. Using the resulting regression model, these EUIs are then normalized by removing the effect of deviance in the significant explanatory factors. The empirical cumulative distribution of the normalized EUI gives a benchmark table (or percentile table of EUI) for benchmarking an observed EUI. The advantage of this approach is that the benchmark table represents a normalized distribution of EUI, taking into account all the significant explanatory factors that affect energy consumption. An application to supermarkets is presented to illustrate the development and the use of the benchmarking method
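
    The normalization step described above can be sketched in a few lines: regress EUI on the explanatory factors, subtract each building's predicted deviation from the sample-average factor levels, and rank the normalized EUIs to obtain the percentile benchmark. The column names below (eui, operating_hours, occupancy) are illustrative assumptions, not the paper's variable set.

```python
# Minimal sketch of regression-based EUI normalization and percentile benchmarking.
import numpy as np
import pandas as pd

def benchmark_eui(df: pd.DataFrame, factors=("operating_hours", "occupancy")):
    """Regress EUI on explanatory factors, remove each building's predicted
    deviation from the average factor levels, then rank by percentile."""
    X = np.column_stack([np.ones(len(df))] + [df[f].to_numpy(float) for f in factors])
    y = df["eui"].to_numpy(float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    # effect of each building's deviation from the sample-average factor levels
    deviation = (X[:, 1:] - X[:, 1:].mean(axis=0)) @ beta[1:]
    normalized = y - deviation
    percentile = pd.Series(normalized).rank(pct=True) * 100
    return df.assign(normalized_eui=normalized, percentile=percentile)
```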

  17. Benchmark analysis of MCNP trademark ENDF/B-VI iron

    International Nuclear Information System (INIS)

    Court, J.D.; Hendricks, J.S.

    1994-12-01

    The MCNP ENDF/B-VI iron cross-section data was subjected to four benchmark studies as part of the Hiroshima/Nagasaki dose re-evaluation for the National Academy of Science and the Defense Nuclear Agency. The four benchmark studies were: (1) the iron sphere benchmarks from the Lawrence Livermore Pulsed Spheres; (2) the Oak Ridge National Laboratory Fusion Reactor Shielding Benchmark; (3) a 76-cm diameter iron sphere benchmark done at the University of Illinois; (4) the Oak Ridge National Laboratory Benchmark for Neutron Transport through Iron. MCNP4A was used to model each benchmark and computational results from the ENDF/B-VI iron evaluations were compared to ENDF/B-IV, ENDF/B-V, the MCNP Recommended Data Set (which includes Los Alamos National Laboratory Group T-2 evaluations), and experimental data. The results show that the ENDF/B-VI iron evaluations are as good as, or better than, previous data sets

  18. Benchmarking af kommunernes sagsbehandling

    DEFF Research Database (Denmark)

    Amilon, Anna

    From 2007, Ankestyrelsen (the Danish National Social Appeals Board) is to carry out benchmarking of the quality of municipal case processing. The purpose of the benchmarking is to develop the design of the practice reviews with a view to better follow-up and to improve municipal case processing. This working paper discusses methods for benchmarking...

  19. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...

  20. The Drill Down Benchmark

    NARCIS (Netherlands)

    P.A. Boncz (Peter); T. Rühl (Tim); F. Kwakkel

    1998-01-01

    Data Mining places specific requirements on DBMS query performance that cannot be evaluated satisfactorily using existing OLAP benchmarks. The DD Benchmark - defined here - provides a practical case and yardstick to explore how well a DBMS is able to support Data Mining applications. It

  1. Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators

    Directory of Open Access Journals (Sweden)

    Zaharchenko Lolita A.

    2013-12-01

    Full Text Available The article analyses the evolution and possibilities of applying benchmarking in the telecommunication sphere. It studies the essence of benchmarking by generalising the approaches of different scientists to the definition of this notion. In order to improve the activity of telecommunication operators, the article identifies the benchmarking technology and the main factors that determine the success of an operator in the modern market economy, as well as the mechanism of benchmarking and the component stages of carrying out benchmarking by a telecommunication operator. It analyses the telecommunication market and identifies the dynamics of its development and tendencies in the changing composition of telecommunication operators and providers. Having generalised the existing experience of benchmarking application, the article identifies the main types of benchmarking of telecommunication operators by the following features: by the level of conduct (branch, inter-branch and international benchmarking); by relation to participation in the conduct (competitive and joint); and with respect to the enterprise environment (internal and external).

  2. The use of dose-response data in a margin of exposure approach to carcinogenic risk assessment for genotoxic chemicals in food.

    Science.gov (United States)

    Benford, Diane J

    2016-05-01

    Genotoxic substances are generally not permitted for deliberate use in food production. However, an appreciable number of known or suspected genotoxic substances occur unavoidably in food, e.g. from natural occurrence, environmental contamination and generation during cooking and processing. Over the past decade a margin of exposure (MOE) approach has increasingly been used in assessing the exposure to substances in food that are genotoxic and carcinogenic. The MOE is defined as a reference point on the dose-response curve (e.g. a benchmark dose lower confidence limit derived from a rodent carcinogenicity study) divided by the estimated human intake. A small MOE indicates a higher concern than a very large MOE. Whilst the MOE cannot be directly equated to risk, it supports prioritisation of substances for further research or for possible regulatory action, and provides a basis for communicating to the public. So far, the MOE approach has been confined to substances for which carcinogenicity data are available. In the absence of carcinogenicity data, evidence of genotoxicity is used only in hazard identification. The challenge to the genetic toxicology community is to develop approaches for characterising risk to human health based on data from genotoxicity studies. In order to achieve wide acceptance, it would be important to further address the issues that have been discussed in the context of dose-response modelling of carcinogenicity data in order to assign levels of concern to particular MOE values, and also whether it is possible to make generic conclusions on how potency in genotoxicity assays relates to carcinogenic potency. © Crown copyright 2015.
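
    The MOE calculation itself is simple arithmetic, as the minimal sketch below shows. The BMDL and intake values are illustrative placeholders; the 10,000 screening level used in the example is the value commonly cited (e.g., by EFSA) as indicating low concern when the reference point is a BMDL10 from an animal study.

```python
# Minimal sketch of the MOE calculation: reference point (e.g., a BMDL10 from a
# rodent carcinogenicity study, mg/kg bw/day) divided by estimated human intake.
# Input values are illustrative, not an assessment of any particular substance.
def margin_of_exposure(bmdl_mg_per_kg_day: float, intake_mg_per_kg_day: float) -> float:
    return bmdl_mg_per_kg_day / intake_mg_per_kg_day

moe = margin_of_exposure(bmdl_mg_per_kg_day=0.31, intake_mg_per_kg_day=4.0e-5)
flag = "lower priority" if moe >= 10_000 else "potential concern"
print(f"MOE = {moe:,.0f} -> {flag}")
```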

  3. How Activists Use Benchmarks

    DEFF Research Database (Denmark)

    Seabrooke, Leonard; Wigan, Duncan

    2015-01-01

    Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views...... are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested...... interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained....

  4. EGS4 benchmark program

    International Nuclear Information System (INIS)

    Yasu, Y.; Hirayama, H.; Namito, Y.; Yashiro, S.

    1995-01-01

    This paper proposes the EGS4 Benchmark Suite, which consists of three programs called UCSAMPL4, UCSAMPL4I and XYZDOS. The paper also evaluates optimization methods of recent RISC/UNIX systems, such as IBM, HP, DEC, Hitachi and Fujitsu, for the benchmark suite. When particular compiler options and math libraries were included in the evaluation process, systems performed significantly better. The observed performance of some of the RISC/UNIX systems was beyond that of some so-called mainframes from IBM, Hitachi or Fujitsu. The computer performance of the EGS4 Code System on an HP9000/735 (99 MHz) was defined to be the unit of EGS4 performance (EGS4 Unit). The EGS4 Benchmark Suite was also run on various PCs, such as Pentium, i486 and DEC Alpha machines. The performance of recent fast PCs reaches that of recent RISC/UNIX systems. The benchmark programs were also evaluated in correlation with an industry benchmark program, namely SPECmark. (author)

  5. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  6. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  7. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  8. A novel approach to pharmacodynamic assessment of antimicrobial agents: new insights to dosing regimen design.

    Directory of Open Access Journals (Sweden)

    Vincent H Tam

    Full Text Available Pharmacodynamic modeling has been increasingly used as a decision support tool to guide dosing regimen selection, both in the drug development and clinical settings. Killing by antimicrobial agents has traditionally been classified categorically as concentration-dependent (which would favor less fractionated regimens) or time-dependent (for which more frequent dosing is preferred). While intuitive and useful to explain empiric data, a more informative approach is necessary to provide a robust assessment of pharmacodynamic profiles in situations other than the extremes of the spectrum (e.g., agents which exhibit partial concentration-dependent killing). A quantitative approach to describe the interaction of an antimicrobial agent and a pathogen is proposed to fill this unmet need. A hypothetical antimicrobial agent with linear pharmacokinetics is used for illustrative purposes. A non-linear functional form (sigmoid Emax) of killing consisting of 3 parameters is used. Using different parameter values in conjunction with the relative growth rate of the pathogen and antimicrobial agent concentration ranges, various conventional pharmacodynamic surrogate indices (e.g., AUC/MIC, Cmax/MIC, %T>MIC) could be satisfactorily linked to outcomes. In addition, the dosing intensity represented by the average kill rate of a dosing regimen can be derived, which could be used for quantitative comparison. The relevance of our approach is further supported by experimental data from our previous investigations using a variety of gram-negative bacteria and antimicrobial agents (moxifloxacin, levofloxacin, gentamicin, amikacin and meropenem). The pharmacodynamic profiles of a wide range of antimicrobial agents can be assessed by a more flexible computational tool to support dosing selection.
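
    The quantitative interaction described above can be sketched as a sigmoid-Emax kill rate driven by a simple one-compartment concentration profile, with the dosing intensity expressed as the kill rate averaged over the dosing interval. All pharmacokinetic and pharmacodynamic parameter values in the sketch are arbitrary placeholders, not fitted values from the cited experiments.

```python
# Minimal sketch: a sigmoid-Emax kill rate driven by a one-compartment PK profile,
# averaged over the dosing interval to give the "dosing intensity" described above.
# All parameter values are arbitrary placeholders.
import numpy as np

def concentration(t, dose_mg, vd_l=20.0, half_life_h=4.0):
    """One-compartment IV bolus concentration (mg/L) at time t after the dose."""
    ke = np.log(2) / half_life_h
    return (dose_mg / vd_l) * np.exp(-ke * t)

def kill_rate(conc, emax=2.0, ec50=1.0, hill=1.5):
    """Sigmoid Emax kill rate (1/h) as a function of drug concentration."""
    return emax * conc**hill / (ec50**hill + conc**hill)

def average_kill_rate(dose_mg, interval_h):
    """Kill rate averaged over one dosing interval (the dosing intensity)."""
    t = np.linspace(0.0, interval_h, 1000)
    return np.trapz(kill_rate(concentration(t, dose_mg)), t) / interval_h

# Same total daily dose, fractionated differently:
print(average_kill_rate(1000.0, 24.0))   # once daily
print(average_kill_rate(250.0, 6.0))     # four times daily
```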

  9. Tourism Destination Benchmarking: Evaluation and Selection of the Benchmarking Partners

    Directory of Open Access Journals (Sweden)

    Luštický Martin

    2012-03-01

    Full Text Available Tourism development has an irreplaceable role in the regional policy of almost all countries. This is due to its undeniable benefits for the local population with regard to the economic, social and environmental spheres. Tourist destinations compete for visitors in the tourism market and subsequently get into a relatively sharp competitive struggle. The main goal of regional governments and destination management institutions is to succeed in this struggle by increasing the competitiveness of their destination. The quality of strategic planning and final strategies is a key factor of competitiveness. Even though the tourism sector is not a typical field where benchmarking methods are widely used, such approaches can be successfully applied. The paper focuses on a key phase of the benchmarking process, which lies in the search for suitable referencing partners. The partners are consequently selected to meet general requirements that ensure the quality of strategies. Following from this, some specific characteristics are developed according to the SMART approach. The paper tests this procedure with an expert evaluation of eight selected regional tourism strategies of regions in the Czech Republic, Slovakia and Great Britain. In this way it validates the selected criteria in the frame of the international environment. Hence, it makes it possible to find the strengths and weaknesses of the selected strategies and at the same time facilitates the discovery of suitable benchmarking partners.

  10. IMRT dose fractionation for head and neck cancer: Variation in current approaches will make standardisation difficult

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Kean F. (Academic Dept. of Radiation Oncology, Univ. of Manchester, Manchester (United Kingdom)); Fowler, Jack F. (Dept. of Human Oncology and Medical Physics, Univ. of Wisconsin, Wisconsin (United States)); Sykes, Andrew J.; Yap, Beng K.; Lee, Lip W.; Slevin, Nick J. (Dept. of Clinical Oncology, Christie Hospital NHS Foundation Trust, Manchester (United Kingdom))

    2009-04-15

    Introduction. Altered fractionation has demonstrated clinical benefits compared to the conventional 2 Gy/day standard of 70 Gy. When using synchronous chemotherapy, there is uncertainty about optimum fractionation. IMRT with its potential for Simultaneous Integrated Boost (SIB) adds further to this uncertainty. This survey will examine international practice of IMRT fractionation and suggest possible reasons for diversity in approach. Material and methods. Fourteen international cancer centres were surveyed for IMRT dose/fractionation practised in each centre. Results. Twelve different types of dose fractionation were reported. Conventional 70-72 Gy (daily 2 Gy/fraction) was used in 3/14 centres with concurrent chemotherapy while 11/14 centres used altered fractionation. Two centres used >1 schedule. Reported schedules and number of centres included 6 fractions/week DAHANCA regime (3), modest hypofractionation (≤2.2 Gy/fraction) (3), dose-escalated hypofractionation (≥2.3 Gy/fraction) (4), hyperfractionation (1), continuous acceleration (1) and concomitant boost (1). Reasons for dose fractionation variability include (i) dose escalation; (ii) total irradiated volume; (iii) number of target volumes; (iv) synchronous systemic treatment; (v) shorter overall treatment time; (vi) resources availability; (vii) longer time on treatment couch; (viii) variable GTV margins; (ix) confidence in treatment setup; (x) late tissue toxicity and (xi) use of lower neck anterior fields. Conclusions. This variability in IMRT fractionation makes any meaningful comparison of treatment results difficult. Some standardization is needed particularly for design of multi-centre randomized clinical trials.
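
    One conventional way to compare the schedules listed above, not used in the survey itself but standard in fractionation analysis, is the biologically effective dose, BED = n·d·(1 + d/(α/β)), and its 2 Gy equivalent, EQD2. The sketch below compares a conventional schedule with two hypofractionated ones; the schedules and the α/β value of 10 Gy are illustrative, not recommendations.

```python
# Minimal sketch: comparing fractionation schedules via the biologically effective
# dose, BED = n*d*(1 + d/(alpha/beta)), and its 2 Gy-equivalent EQD2.
# Schedules and alpha/beta are illustrative placeholders.
def bed(n_fractions, dose_per_fraction, alpha_beta):
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

def eqd2(n_fractions, dose_per_fraction, alpha_beta):
    return bed(n_fractions, dose_per_fraction, alpha_beta) / (1 + 2.0 / alpha_beta)

for name, n, d in [("conventional 35 x 2.0 Gy", 35, 2.0),
                   ("moderate hypofractionation 30 x 2.2 Gy", 30, 2.2),
                   ("dose-escalated SIB 30 x 2.4 Gy", 30, 2.4)]:
    print(f"{name}: BED10 = {bed(n, d, 10):.1f} Gy, EQD2 = {eqd2(n, d, 10):.1f} Gy")
```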

  11. Methods for extracting dose response curves from radiation therapy data. I. A unified approach

    International Nuclear Information System (INIS)

    Herring, D.F.

    1980-01-01

    This paper discusses an approach to fitting models to radiation therapy data in order to extract dose response curves for tumor local control and normal tissue damage. The approach is based on the method of maximum likelihood and is illustrated by several examples. A general linear logistic equation which leads to the Ellis nominal standard dose (NSD) equation is discussed; the fit of this equation to experimental data for mouse foot skin reactions produced by fractionated irradiation is described. A logistic equation based on the concept that normal tissue reactions are associated with the surviving fraction of cells is also discussed, and the fit of this equation to the same set of mouse foot skin reaction data is also described. These two examples illustrate the importance of choosing a model based on underlying mechanisms when one seeks to attach biological significance to a model's parameters
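
    The maximum-likelihood fitting idea can be illustrated with a small sketch: a logistic dose-response curve fitted to quantal (responders out of total) data by minimizing the binomial negative log-likelihood. The dose levels and counts are invented for illustration, and the simple dose-only model omits the fractionation terms discussed in the paper.

```python
# Minimal sketch: maximum-likelihood fit of a logistic dose-response curve to
# quantal (responders / total) data. Doses and counts are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

doses      = np.array([40.0, 50.0, 60.0, 70.0, 80.0])   # Gy, illustrative
responders = np.array([2, 5, 12, 18, 24])
totals     = np.array([25, 25, 25, 25, 25])

def neg_log_likelihood(params):
    b0, b1 = params
    p = np.clip(expit(b0 + b1 * doses), 1e-12, 1 - 1e-12)
    return -np.sum(responders * np.log(p) + (totals - responders) * np.log(1 - p))

fit = minimize(neg_log_likelihood, x0=[-5.0, 0.1], method="Nelder-Mead")
b0, b1 = fit.x
print(f"Estimated D50 = {-b0 / b1:.1f} Gy")   # dose giving 50% response probability
```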

  12. Biological dosimetry - a Bayesian approach in the presentation of the uncertainty of the estimated dose in cases of exposure to low dose radiation

    International Nuclear Information System (INIS)

    Di Giorgio, Marina; Zaretzky, A.

    2010-01-01

    Biodosimetry laboratory experience has shown that there are limitations in the existing statistical methodology. Statistical difficulties generally occur due to the low number of aberrations, leading to large uncertainties in dose estimation. Some problems derive from limitations of the classical statistical methodology, which requires that chromosome aberration yields be considered as something fixed and consequently provides a deterministic dose estimate and associated confidence limits. On the other hand, recipients of biological dosimetry reports, including medical doctors, regulators and the patients themselves, may have a limited comprehension of statistics and of the reports provided to them. Thus, the objective of the present paper is to use a Bayesian approach to present the uncertainty in the estimated dose to which a person could have been exposed, in cases of low-dose (occupational) radiation exposure. Such a methodology will allow biodosimetrists to adopt a probabilistic approach to cytogenetic data analysis. At present, classical statistics allows one to produce a confidence interval for the reported dose, with a lower limit that may not detach from zero. In this situation it becomes difficult to make decisions, as they could affect the labor activities of the worker if an exposure exceeding the occupational dose limits is inferred. The proposed Bayesian approach is applied to an occupational exposure scenario to contribute to taking the appropriate radiation protection measures. (authors) [es
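
    A grid-based sketch of the Bayesian idea is shown below: a Poisson likelihood for the observed dicentric count, a linear-quadratic calibration curve and a flat prior over dose, from which a posterior mean and interval follow. The calibration coefficients and counts are placeholders, not a laboratory's actual calibration data.

```python
# Minimal sketch of a Bayesian dose estimate from a dicentric count: Poisson
# likelihood with a linear-quadratic calibration curve y = c + a*D + b*D^2
# (aberrations per cell) and a flat prior on dose. Coefficients are placeholders.
import numpy as np
from scipy.stats import poisson

c, a, b = 0.001, 0.02, 0.06           # per-cell background, linear, quadratic terms
n_cells, observed_dics = 1000, 8      # scored cells and observed dicentrics

dose_grid = np.linspace(0.0, 2.0, 2001)                   # Gy
expected = n_cells * (c + a * dose_grid + b * dose_grid**2)
posterior = poisson.pmf(observed_dics, expected)           # flat prior -> likelihood
posterior /= np.trapz(posterior, dose_grid)

mean_dose = np.trapz(dose_grid * posterior, dose_grid)
cdf = np.cumsum(posterior) * (dose_grid[1] - dose_grid[0])
lo, hi = dose_grid[np.searchsorted(cdf, 0.025)], dose_grid[np.searchsorted(cdf, 0.975)]
print(f"Posterior mean dose = {mean_dose:.2f} Gy, 95% interval = ({lo:.2f}, {hi:.2f}) Gy")
```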

  13. A theoretical approach to the problem of dose-volume constraint estimation and their impact on the dose-volume histogram selection

    International Nuclear Information System (INIS)

    Schinkel, Colleen; Stavrev, Pavel; Stavreva, Nadia; Fallone, B. Gino

    2006-01-01

    This paper outlines a theoretical approach to the problem of estimating and choosing dose-volume constraints. Following this approach, a method of choosing dose-volume constraints based on biological criteria is proposed. This method is called "reverse normal tissue complication probability (NTCP) mapping into dose-volume space" and may be used as general guidance for the problem of dose-volume constraint estimation. Dose-volume histograms (DVHs) are randomly simulated, and those resulting in clinically acceptable levels of complication, such as an NTCP of 5±0.5%, are selected and averaged, producing a mean DVH that is proven to result in the same level of NTCP. The points from the averaged DVH are proposed to serve as physical dose-volume constraints. The population-based critical volume and Lyman NTCP models with parameter sets taken from literature sources were used for the NTCP estimation. The impact of the prescribed value of the maximum dose to the organ, Dmax, on the averaged DVH and the dose-volume constraint points is investigated. Constraint points for 16 organs are calculated. The impact of the number of constraints to be fulfilled, based on the likelihood that a DVH satisfying them will result in an acceptable NTCP, is also investigated. It is theoretically proven that radiation treatment optimization based on physical objective functions can sufficiently well restrict the dose to the organs at risk, resulting in sufficiently low NTCP values, through the employment of several appropriate dose-volume constraints. At the same time, the pure physical approach to optimization is self-restrictive due to the preassignment of acceptable NTCP levels, thus excluding possible better solutions to the problem.
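
    The screening of simulated DVHs against an NTCP target of 5±0.5% can be sketched with the Lyman model evaluated through a generalized equivalent uniform dose. The parameter values (TD50, m, n) and the example DVH below are placeholders rather than the literature parameter sets used in the paper.

```python
# Minimal sketch of a Lyman (LKB) NTCP evaluation used to screen simulated DVHs:
# generalized equivalent uniform dose from a differential DVH, then a probit link.
# Parameter values are placeholders, not organ-specific fits.
import numpy as np
from scipy.stats import norm

def lyman_ntcp(dose_bins_gy, frac_volume, td50=65.0, m=0.14, n=0.25):
    """dose_bins_gy / frac_volume: differential DVH (fractional volume per dose bin)."""
    v = np.asarray(frac_volume, dtype=float)
    v = v / v.sum()
    geud = np.sum(v * np.asarray(dose_bins_gy, dtype=float) ** (1.0 / n)) ** n
    t = (geud - td50) / (m * td50)
    return norm.cdf(t)

# Accept a simulated DVH only if it lands in the clinically acceptable band:
ntcp = lyman_ntcp([10, 30, 50, 70], [0.4, 0.3, 0.2, 0.1])
accepted = abs(ntcp - 0.05) <= 0.005
print(f"NTCP = {ntcp:.3f}, accepted = {accepted}")
```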

  14. The conversion of exposures due to radon into the effective dose: the epidemiological approach

    Energy Technology Data Exchange (ETDEWEB)

    Beck, T.R. [Federal Office for Radiation Protection, Berlin (Germany)

    2017-11-15

    The risks and dose conversion coefficients for residential and occupational exposures due to radon were determined by applying the epidemiological risk models to ICRP representative populations. The dose conversion coefficient for residential radon was estimated at 1.6 mSv year⁻¹ per 100 Bq m⁻³ (3.6 mSv per WLM), which is significantly lower than the corresponding value derived from the biokinetic and dosimetric models. The dose conversion coefficient for occupational exposures, applying the risk models for miners, was estimated at 14 mSv per WLM, which is in good accordance with the results of the dosimetric models. To resolve the discrepancy regarding residential radon, the ICRP approaches for the determination of risks and doses were reviewed. It could be shown that ICRP overestimates the risk for lung cancer caused by residential radon. This can be attributed to a wrong population weighting of the radon-induced risks in its epidemiological approach. With the approach in this work, the average risks for lung cancer were determined, taking into account the age-specific risk contributions of all individuals in the population. As a result, a lower risk coefficient for residential radon was obtained. The results from the ICRP biokinetic and dosimetric models for both the occupationally exposed working-age population and the whole population exposed to residential radon can be brought into better accordance with the corresponding results of the epidemiological approach if the respective relative radiation detriments and a radiation-weighting factor for alpha particles of about ten are used. (orig.)
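
    With the residential dose conversion coefficient quoted above, converting an annual average radon concentration into an annual effective dose is simple arithmetic, as in the short sketch below; the 300 Bq m⁻³ concentration is only an illustrative input.

```python
# Simple arithmetic using the residential dose conversion coefficient quoted above
# (1.6 mSv per year per 100 Bq/m3); the concentration value is illustrative.
def annual_effective_dose_msv(radon_bq_m3: float, msv_per_100bq: float = 1.6) -> float:
    return radon_bq_m3 / 100.0 * msv_per_100bq

print(annual_effective_dose_msv(300.0))   # 300 Bq/m3 -> 4.8 mSv per year
```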

  15. The conversion of exposures due to radon into the effective dose: the epidemiological approach

    International Nuclear Information System (INIS)

    Beck, T.R.

    2017-01-01

    The risks and dose conversion coefficients for residential and occupational exposures due to radon were determined by applying the epidemiological risk models to ICRP representative populations. The dose conversion coefficient for residential radon was estimated at 1.6 mSv year⁻¹ per 100 Bq m⁻³ (3.6 mSv per WLM), which is significantly lower than the corresponding value derived from the biokinetic and dosimetric models. The dose conversion coefficient for occupational exposures, applying the risk models for miners, was estimated at 14 mSv per WLM, which is in good accordance with the results of the dosimetric models. To resolve the discrepancy regarding residential radon, the ICRP approaches for the determination of risks and doses were reviewed. It could be shown that ICRP overestimates the risk for lung cancer caused by residential radon. This can be attributed to a wrong population weighting of the radon-induced risks in its epidemiological approach. With the approach in this work, the average risks for lung cancer were determined, taking into account the age-specific risk contributions of all individuals in the population. As a result, a lower risk coefficient for residential radon was obtained. The results from the ICRP biokinetic and dosimetric models for both the occupationally exposed working-age population and the whole population exposed to residential radon can be brought into better accordance with the corresponding results of the epidemiological approach if the respective relative radiation detriments and a radiation-weighting factor for alpha particles of about ten are used. (orig.)

  16. Benchmarking and the laboratory

    Science.gov (United States)

    Galloway, M; Nadin, L

    2001-01-01

    This article describes how benchmarking can be used to assess laboratory performance. Two benchmarking schemes are reviewed, the Clinical Benchmarking Company's Pathology Report and the College of American Pathologists' Q-Probes scheme. The Clinical Benchmarking Company's Pathology Report is undertaken by staff based in the clinical management unit, Keele University with appropriate input from the professional organisations within pathology. Five annual reports have now been completed. Each report is a detailed analysis of 10 areas of laboratory performance. In this review, particular attention is focused on the areas of quality, productivity, variation in clinical practice, skill mix, and working hours. The Q-Probes scheme is part of the College of American Pathologists programme in studies of quality assurance. The Q-Probes scheme and its applicability to pathology in the UK is illustrated by reviewing two recent Q-Probe studies: routine outpatient test turnaround time and outpatient test order accuracy. The Q-Probes scheme is somewhat limited by the small number of UK laboratories that have participated. In conclusion, as a result of the government's policy in the UK, benchmarking is here to stay. Benchmarking schemes described in this article are one way in which pathologists can demonstrate that they are providing a cost effective and high quality service. Key Words: benchmarking • pathology PMID:11477112

  17. Developing integrated benchmarks for DOE performance measurement

    Energy Technology Data Exchange (ETDEWEB)

    Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.

    1992-09-30

    The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome information in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which then could become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.

  18. Shielding benchmark problems, (2)

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Shin, Kazuo; Tada, Keiko.

    1980-02-01

    Shielding benchmark problems prepared by Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design in the Atomic Energy Society of Japan were compiled by Shielding Laboratory in Japan Atomic Energy Research Institute. Fourteen shielding benchmark problems are presented newly in addition to twenty-one problems proposed already, for evaluating the calculational algorithm and accuracy of computer codes based on discrete ordinates method and Monte Carlo method and for evaluating the nuclear data used in codes. The present benchmark problems are principally for investigating the backscattering and the streaming of neutrons and gamma rays in two- and three-dimensional configurations. (author)

  19. Impact of imaging approach on radiation dose and associated cancer risk in children undergoing cardiac catheterization.

    Science.gov (United States)

    Hill, Kevin D; Wang, Chu; Einstein, Andrew J; Januzis, Natalie; Nguyen, Giao; Li, Jennifer S; Fleming, Gregory A; Yoshizumi, Terry K

    2017-04-01

    To quantify the impact of image optimization on absorbed radiation dose and associated risk in children undergoing cardiac catheterization. Various imaging and fluoroscopy system technical parameters, including camera magnification, source-to-image distance, collimation, antiscatter grids, beam quality, and pulse rates, all affect radiation dose but have not been well studied in younger children. We used anthropomorphic phantoms (ages: newborn and 5 years old) to measure surface radiation exposure from various imaging approaches and estimated absorbed organ doses and effective doses (ED) using Monte Carlo simulations. Models developed in the National Academies' Biological Effects of Ionizing Radiation VII report were used to compare an imaging protocol optimized for dose reduction versus suboptimal imaging (+20 cm source-to-image-distance, +1 magnification setting, no collimation) on lifetime attributable risk (LAR) of cancer. For the newborn and 5-year-old phantoms, respectively, ED changes were as follows: +157% and +232% for an increase from 6-inch to 10-inch camera magnification; +61% and +59% for a 20 cm increase in source-to-image-distance; -42% and -48% with addition of 1-inch periphery collimation; -31% and -46% with removal of the antiscatter grid. Compared with an optimized protocol, suboptimal imaging increased ED by 2.75-fold (newborn) and fourfold (5 years old). Estimated cancer LAR from 30 min of posteroanterior fluoroscopy using optimized versus suboptimal imaging, respectively, was 0.42% versus 1.23% (newborn female), 0.20% versus 0.53% (newborn male), 0.47% versus 1.70% (5-year-old female) and 0.16% versus 0.69% (5-year-old male). Radiation-related risks to children undergoing cardiac catheterization can be substantial but are markedly reduced with an optimized imaging approach. © 2016 Wiley Periodicals, Inc.

  20. Field evaluations of the VDmax approach for substantiation of a 25 kGy sterilization dose and its application to other preselected doses

    International Nuclear Information System (INIS)

    Kowalski, John B.; Herring, Craig; Baryschpolec, Lisa; Reger, John; Patel, Jay; Feeney, Mary; Tallentire, Alan

    2002-01-01

    The International and European standards for radiation sterilization require evidence of the effectiveness of a minimum sterilization dose of 25 kGy but do not provide detailed guidance on how this evidence can be generated. An approach, designated VDmax, has recently been described and computer evaluated to provide safe and unambiguous substantiation of a 25 kGy sterilization dose. The approach has been further developed into a practical method, which has been subjected to field evaluations at three manufacturing facilities which produce different types of medical devices. The three facilities each used a different overall evaluation strategy: Facility A used VDmax for quarterly dose audits; Facility B compared VDmax and Method 1 in side-by-side parallel experiments; and Facility C, a new facility at start-up, used VDmax for initial substantiation of 25 kGy and subsequent quarterly dose audits. A common element at all three facilities was the use of 10 product units for irradiation in the verification dose experiment. The field evaluations of the VDmax method were successful at all three facilities; they included many different types of medical devices/product families with a wide range of average bioburden and sample item portion values used in the verification dose experiments. Overall, around 500 verification dose experiments were performed and no failures were observed. In the side-by-side parallel experiments, the outcomes of the VDmax experiments were consistent with the outcomes observed with Method 1. The VDmax approach has been extended to sterilization doses >25 and <25 kGy; however, use of the VDmax method for doses other than 25 kGy must await controlled field evaluations and the development of appropriate specifications/standards.

  1. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  2. Benchmarking Swiss electricity grids

    International Nuclear Information System (INIS)

    Walti, N.O.; Weber, Ch.

    2001-01-01

    This extensive article describes a pilot benchmarking project initiated by the Swiss Association of Electricity Enterprises that assessed 37 Swiss utilities. The data collected from these utilities on a voluntary basis included data on technical infrastructure, investments and operating costs. These various factors are listed and discussed in detail. The assessment methods and rating mechanisms that provided the benchmarks are discussed, and the results of the pilot study are presented, which are to form the basis of benchmarking procedures for the grid regulation authorities under Switzerland's planned electricity market law. Examples of the practical use of the benchmarking methods are given, and cost-efficiency questions still open in the area of investment and operating costs are listed. Prefaces by the Swiss Association of Electricity Enterprises and the Swiss Federal Office of Energy complete the article.

  3. Benchmarking and Regulation

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...

  4. Financial Integrity Benchmarks

    Data.gov (United States)

    City of Jackson, Mississippi — This dataset compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measures the City's debt ratio and bond ratings...

  5. Benchmarking in Foodservice Operations

    National Research Council Canada - National Science Library

    Johnson, Bonnie

    1998-01-01

    .... The design of this study included two parts: (1) eleven expert panelists involved in a Delphi technique to identify and rate importance of foodservice performance measures and rate the importance of benchmarking activities, and (2...

  6. MFTF TOTAL benchmark

    International Nuclear Information System (INIS)

    Choy, J.H.

    1979-06-01

    A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) develop and implement programs to simulate MFTF usage of the data base

  7. Accelerator shielding benchmark problems

    International Nuclear Information System (INIS)

    Hirayama, H.; Ban, S.; Nakamura, T.

    1993-01-01

    Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)

  8. Shielding benchmark problems

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Kawai, Masayoshi; Nakazawa, Masaharu.

    1978-09-01

    Shielding benchmark problems were prepared by the Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design of the Atomic Energy Society of Japan, and compiled by the Shielding Laboratory of Japan Atomic Energy Research Institute. Twenty-one kinds of shielding benchmark problems are presented for evaluating the calculational algorithm and the accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method and for evaluating the nuclear data used in the codes. (author)

  9. Benchmarking energy performance of residential buildings using two-stage multifactor data envelopment analysis with degree-day based simple-normalization approach

    International Nuclear Information System (INIS)

    Wang, Endong; Shen, Zhigang; Alp, Neslihan; Barry, Nate

    2015-01-01

    Highlights: • Two-stage DEA model is developed to benchmark building energy efficiency. • Degree-day based simple normalization is used to neutralize the climatic noise. • Results of a real case study validated the benefits of this new model. - Abstract: Being able to identify detailed meta factors of energy performance is essential for creating effective residential energy-retrofitting strategies. Compared to other benchmarking methods, nonparametric multifactor DEA (data envelopment analysis) is capable of discriminating scale factors from management factors to reveal more details that better guide retrofitting practices. A two-stage DEA energy benchmarking method is proposed in this paper. This method includes (1) a first-stage meta DEA, which integrates common degree-day metrics to neutralize the noise that exogenous climatic variables introduce into energy use; and (2) a second-stage Tobit regression for further detailed efficiency analysis. A case study involving 3-year longitudinal panel data on 189 residential buildings indicated that the proposed method has advantages over existing methods in terms of its efficiency in data processing and results interpretation. The results of the case study also demonstrated high consistency with existing linear-regression-based DEA.
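
    The two-stage structure described above lends itself to a compact numerical sketch. The snippet below is illustrative only and is not the authors' implementation: it normalizes annual energy use by heating degree-days and then computes an input-oriented CCR DEA efficiency score for each building by solving the standard envelopment linear program with SciPy. All data values, variable names and the choice of a single input and two outputs are assumptions made for the example; the second-stage Tobit regression is only indicated in a comment.

```python
# Minimal sketch of the two-stage idea (not the authors' code): degree-day
# normalization of energy use, then an input-oriented CCR DEA efficiency
# score per building solved as a linear program with SciPy.
import numpy as np
from scipy.optimize import linprog

# Hypothetical panel data: annual energy use (kWh), heating degree-days,
# floor area (m2) and occupants for a handful of buildings.
energy = np.array([21000.0, 18500.0, 30200.0, 26400.0])
hdd    = np.array([3100.0, 2900.0, 3400.0, 3200.0])
area   = np.array([140.0, 120.0, 210.0, 160.0])
occup  = np.array([3.0, 2.0, 5.0, 4.0])

# Degree-day normalization to damp climatic noise before the DEA stage.
norm_energy = energy / hdd            # kWh per degree-day (the single input)
outputs = np.vstack([area, occup])    # services delivered (the outputs)

def ccr_efficiency(k, x, y):
    """Input-oriented CCR efficiency of unit k.
    min theta  s.t.  sum_j lam_j * x_j <= theta * x_k,
                     sum_j lam_j * y_j >= y_k,  lam >= 0."""
    n = x.shape[0]
    # Decision variables: [theta, lam_1 .. lam_n]
    c = np.zeros(n + 1); c[0] = 1.0
    # Input constraint: sum_j lam_j x_j - theta x_k <= 0
    A_in = np.concatenate(([-x[k]], x)).reshape(1, -1)
    # Output constraints: -sum_j lam_j y_j <= -y_k
    A_out = np.hstack([np.zeros((y.shape[0], 1)), -y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([[0.0], -y[:, k]])
    bounds = [(0.0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

scores = [ccr_efficiency(k, norm_energy, outputs) for k in range(len(energy))]
print(np.round(scores, 3))   # 1.0 marks the efficient frontier
# Stage 2 (not shown): regress the scores on management variables with a
# Tobit model to separate scale effects from management effects.
```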

  10. Approach to non-human species radiation dose assessment in the republic of Korea

    International Nuclear Information System (INIS)

    Keum, D. K.; Jun, I.; Lim, K. M.; Choi, Y. H.

    2011-01-01

    This paper describes the approach to non-human species radiation dose assessment in Korea. As tentative reference organisms, one plant and seven animals were selected based on the new International Commission on Radiological Protection recommendation issued in 2007, and the size of the selected organisms was determined from the corresponding Korean endemic species. A set of 25 radionuclides was considered as potential source terms causing radiological damage to organisms. External and internal dose conversion coefficients for the selected organisms and radionuclides were calculated with a uniform isotropic model or by Monte Carlo simulation. Concentration ratios of some endemic species are being measured in laboratory experiments, in parallel with a review of existing data. (authors)

  11. Low dose MDCT of the wrist-An ex vivo approach

    International Nuclear Information System (INIS)

    Bolte, H.; Sattler, E.-M.; Jahnke, T.; Roeger, I.; Biederer, J.; Jochens, A.; Dischinger, J.; Schuenke, M.; Sedlmair, M.; Heller, M.

    2011-01-01

    The primary objective of this study was to evaluate whether good image quality can be maintained in multidetector computed tomography (MDCT) of the wrist while the radiation dose is substantially reduced. In a second step, the single parameter change allowing the best trade-off between dose reduction and image quality was to be identified. Twenty wrist specimens were examined with a 16-slice MDCT using different parameter combinations: 120 and 100 kV; 100, 70 and 40 electronic mA s; pitch factors 0.9 and 1.5. Images were reconstructed in four standard planes (slice thickness 1.0 mm, increment 0.5 mm, hard kernel), resulting in a total of 960 images. Two observers evaluated image quality in a blinded and randomized consensus scheme. Detail quality of corticalis, spongiosa, articular surface and soft tissues was graded on a four-point scale (1 = excellent, 2 = good, 3 = sufficient, 4 = poor). The scan protocol with the best trade-off between radiation exposure and image quality had a parameter constellation of 100 kV, 70 electronic mA s (78 effective mA s) and a pitch of 0.9 (DLP 63 mGy cm). This represented a dose reduction of 55%. A decrease in voltage alone led to a dose reduction of 36% without any loss of image quality. An increase of the pitch factor to 1.5 and a decrease from 70 to 40 mA s caused the most distinct impairment of image quality. In MDCT of the wrist, good image quality could be maintained while the radiation dose was considerably reduced. A reduction of voltage offers the best result for a single parameter change.

  12. A Voxel-Based Approach to Explore Local Dose Differences Associated With Radiation-Induced Lung Damage

    Energy Technology Data Exchange (ETDEWEB)

    Palma, Giuseppe [Institute of Biostructure and Bioimaging, National Research Council, Naples (Italy); Monti, Serena [IRCCS SDN, Naples (Italy); D' Avino, Vittoria [Institute of Biostructure and Bioimaging, National Research Council, Naples (Italy); Conson, Manuel [Institute of Biostructure and Bioimaging, National Research Council, Naples (Italy); Department of Advanced Biomedical Sciences, Federico II University School of Medicine, Naples (Italy); Liuzzi, Raffaele [Institute of Biostructure and Bioimaging, National Research Council, Naples (Italy); Pressello, Maria Cristina [Department of Health Physics, S. Camillo-Forlanini Hospital, Rome (Italy); Donato, Vittorio [Department of Radiation Oncology, S. Camillo-Forlanini Hospital, Rome (Italy); Deasy, Joseph O. [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, NY (United States); Quarantelli, Mario [Institute of Biostructure and Bioimaging, National Research Council, Naples (Italy); Pacelli, Roberto [Institute of Biostructure and Bioimaging, National Research Council, Naples (Italy); Department of Advanced Biomedical Sciences, Federico II University School of Medicine, Naples (Italy); Cella, Laura, E-mail: laura.cella@cnr.it [Institute of Biostructure and Bioimaging, National Research Council, Naples (Italy)

    2016-09-01

    Purpose: To apply a voxel-based (VB) approach aimed at exploring local dose differences associated with late radiation-induced lung damage (RILD). Methods and Materials: An interinstitutional database of 98 patients who were Hodgkin lymphoma (HL) survivors treated with postchemotherapy supradiaphragmatic radiation therapy was analyzed in the study. Eighteen patients experienced late RILD, classified according to the Radiation Therapy Oncology Group scoring system. Each patient's computed tomographic (CT) scan was normalized to a single reference case anatomy (common coordinate system, CCS) through a log-diffeomorphic approach. The obtained deformation fields were used to map the dose of each patient into the CCS. The coregistration robustness and the dose mapping accuracy were evaluated by geometric and dose scores. Two different statistical mapping schemes for nonparametric multiple permutation inference on dose maps were applied, and the corresponding P<.05 significance lung subregions were generated. A receiver operating characteristic (ROC)-based test was performed on the mean dose extracted from each subregion. Results: The coregistration process resulted in a geometrically robust and accurate dose warping. A significantly higher dose was consistently delivered to RILD patients in voxel clusters near the peripheral medial-basal portion of the lungs. The area under the ROC curves (AUC) from the mean dose of the voxel clusters was higher than the corresponding AUC derived from the total lung mean dose. Conclusions: We implemented a framework including a robust registration process and a VB approach accounting for the multiple comparison problem in dose-response modeling, and applied it to a cohort of HL survivors to explore a local dose–RILD relationship in the lungs. Patients with RILD received a significantly greater dose in parenchymal regions where low doses (∼6 Gy) were delivered. Interestingly, the relation between differences in the high-dose
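
    The statistical mapping step described above, nonparametric permutation inference on spatially normalized dose maps, can be illustrated with a small sketch. The code below is not the authors' pipeline: it computes a voxel-wise two-sample t map between hypothetical RILD and control groups and derives a family-wise-error-corrected threshold from the permutation distribution of the maximum statistic. Group sizes, voxel counts and dose values are simulated assumptions.

```python
# Minimal sketch (not the authors' pipeline): voxel-wise two-sample t maps
# on spatially normalized dose distributions, with a max-statistic
# permutation correction for multiple comparisons.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dose maps already warped to a common coordinate system:
# 18 RILD patients and 80 controls, flattened to 1,000 voxels each.
dose_rild = rng.gamma(shape=4.0, scale=3.0, size=(18, 1000))
dose_ctrl = rng.gamma(shape=4.0, scale=2.8, size=(80, 1000))

def tmap(a, b):
    """Voxel-wise two-sample t statistic (pooled variance)."""
    na, nb = len(a), len(b)
    va, vb = a.var(axis=0, ddof=1), b.var(axis=0, ddof=1)
    sp = np.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (a.mean(axis=0) - b.mean(axis=0)) / (sp * np.sqrt(1 / na + 1 / nb))

t_obs = tmap(dose_rild, dose_ctrl)

# Permutation null of the maximum t over voxels (controls family-wise error).
pooled = np.vstack([dose_rild, dose_ctrl])
n_rild = len(dose_rild)
max_null = []
for _ in range(2000):
    perm = rng.permutation(len(pooled))
    max_null.append(tmap(pooled[perm[:n_rild]], pooled[perm[n_rild:]]).max())
thresh = np.quantile(max_null, 0.95)

cluster = t_obs > thresh   # voxels where the RILD group received a higher dose
print(f"t threshold at P<.05 (corrected): {thresh:.2f}, "
      f"significant voxels: {cluster.sum()}")
```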

  13. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems Using a Bottom-Up Approach and Installer Survey

    Energy Technology Data Exchange (ETDEWEB)

    Ardani, Kristen [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Feldman, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, Sean [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-11-01

    This report presents results from the first U.S. Department of Energy (DOE) sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs—often referred to as “business process” or “soft” costs—for residential and commercial photovoltaic (PV) systems. Annual expenditure and labor-hour-productivity data are analyzed to benchmark 2010 soft costs related to the DOE priority areas of (1) customer acquisition; (2) permitting, inspection, and interconnection; (3) installation labor; and (4) installer labor for arranging third-party financing. Annual expenditure and labor-hour data were collected from 87 PV installers. After eliminating outliers, the survey sample consists of 75 installers, representing approximately 13% of all residential PV installations and 4% of all commercial installations added in 2010. Including assumed permitting fees, in 2010 the average soft costs benchmarked in this analysis total $1.50/W for residential systems (ranging from $0.66/W to $1.66/W between the 20th and 80th percentiles). For commercial systems, the median 2010 benchmarked soft costs (including assumed permitting fees) are $0.99/W for systems smaller than 250 kW (ranging from $0.51/W to $1.45/W between the 20th and 80th percentiles) and $0.25/W for systems larger than 250 kW (ranging from $0.17/W to $0.78/W between the 20th and 80th percentiles). Additional soft costs not benchmarked in the present analysis (e.g., installer profit, overhead, financing, and contracting) are significant and would add to these figures. The survey results provide a benchmark for measuring—and helping to accelerate—progress over the next decade toward achieving the DOE SunShot Initiative’s soft-cost-reduction targets. We conclude that the selected non-hardware business processes add considerable cost to U.S. PV systems, constituting 23% of residential PV system price, 17% of small commercial system price, and 5% of large commercial system price (in 2010

  14. The KMAT: Benchmarking Knowledge Management.

    Science.gov (United States)

    de Jager, Martha

    Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…

  15. EPRI depletion benchmark calculations using PARAGON

    International Nuclear Information System (INIS)

    Kucukboyaci, Vefa N.

    2015-01-01

    Highlights: • PARAGON depletion calculations are benchmarked against the EPRI reactivity decrement experiments. • Benchmarks cover a wide range of enrichments, burnups, cooling times, and burnable absorbers, and different depletion and storage conditions. • Results from PARAGON-SCALE scheme are more conservative relative to the benchmark data. • ENDF/B-VII based data reduces the excess conservatism and brings the predictions closer to benchmark reactivity decrement values. - Abstract: In order to conservatively apply burnup credit in spent fuel pool criticality analyses, code validation for both fresh and used fuel is required. Fresh fuel validation is typically done by modeling experiments from the “International Handbook.” A depletion validation can determine a bias and bias uncertainty for the worth of the isotopes not found in the fresh fuel critical experiments. Westinghouse’s burnup credit methodology uses PARAGON™ (Westinghouse 2-D lattice physics code) and its 70-group cross-section library, which have been benchmarked, qualified, and licensed both as a standalone transport code and as a nuclear data source for core design simulations. A bias and bias uncertainty for the worth of depletion isotopes, however, are not available for PARAGON. Instead, the 5% decrement approach for depletion uncertainty is used, as set forth in the Kopp memo. Recently, EPRI developed a set of benchmarks based on a large set of power distribution measurements to ascertain reactivity biases. The depletion reactivity has been used to create 11 benchmark cases for 10, 20, 30, 40, 50, and 60 GWd/MTU and 3 cooling times 100 h, 5 years, and 15 years. These benchmark cases are analyzed with PARAGON and the SCALE package and sensitivity studies are performed using different cross-section libraries based on ENDF/B-VI.3 and ENDF/B-VII data to assess that the 5% decrement approach is conservative for determining depletion uncertainty

  16. Benchmarking the Netherlands. Benchmarking for growth

    International Nuclear Information System (INIS)

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy. In other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs. Prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc) sense, in other words. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity growth. Throughout

  17. Benchmarking the Netherlands. Benchmarking for growth

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy. In other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs. Prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc) sense, in other words. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity

  18. Hanford Site Composite Analysis Technical Approach Description: Groundwater Pathway Dose Calculation.

    Energy Technology Data Exchange (ETDEWEB)

    Morgans, D. L. [CH2M Hill Plateau Remediation Company, Richland, WA (United States); Lindberg, S. L. [Intera Inc., Austin, TX (United States)

    2017-09-20

    The purpose of this technical approach document (TAD) is to document the assumptions, equations, and methods used to perform the groundwater pathway radiological dose calculations for the revised Hanford Site Composite Analysis (CA). DOE M 435.1-1 states, “The composite analysis results shall be used for planning, radiation protection activities, and future use commitments to minimize the likelihood that current low-level waste disposal activities will result in the need for future corrective or remedial actions to adequately protect the public and the environment.”

  19. Benchmarking in Mobarakeh Steel Company

    Directory of Open Access Journals (Sweden)

    Sasan Ghasemi

    2008-05-01

    Full Text Available Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan’s Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.

  20. Benchmarking in Mobarakeh Steel Company

    OpenAIRE

    Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati

    2008-01-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...

  1. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  2. Statistical approaches to forecast gamma dose rates by using measurements from the atmosphere

    International Nuclear Information System (INIS)

    Jeong, H.J.; Hwang, W. T.; Kim, E.H.; Han, M.H.

    2008-01-01

    In this paper, the results obtained by inter-comparing several statistical techniques for estimating gamma dose rates, such as an exponential moving average model, a seasonal exponential smoothing model and an artificial neural networks model, are reported. Seven years of gamma dose rate data measured in Daejeon City, Korea, were divided into two parts to develop the models and to validate the effectiveness of the predictions generated by the techniques mentioned above. The artificial neural networks model shows the best forecasting capability among the three statistical models. The reason why the artificial neural networks model provides superior predictions to the other models would be its capacity for non-linear approximation. To replace gamma dose rates when missing data occur in an environmental monitoring system, the moving average model and the seasonal exponential smoothing model can be preferable because they are faster and easier to apply than the artificial neural networks model. These kinds of statistical approaches will be helpful for real-time control of radioactive emissions or for an environmental quality assessment. (authors)
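
    Two of the compared forecasters, the exponential moving average and a seasonal exponential smoothing model, are simple enough to sketch directly. The snippet below is illustrative rather than the study's code: it hand-rolls both one-step-ahead forecasters on a simulated hourly gamma dose-rate series with an assumed daily cycle, the kind of gap-filling use the abstract mentions. The series, smoothing constants and seasonal period are assumptions.

```python
# Minimal sketch (illustrative, not the study's code): an exponential moving
# average and a simple additive seasonal exponential smoother applied to a
# simulated hourly gamma dose-rate series, for one-step-ahead forecasting.
import numpy as np

rng = np.random.default_rng(1)
season = 24  # hypothetical daily cycle in hourly readings
t = np.arange(24 * 60)
dose_rate = 120 + 5 * np.sin(2 * np.pi * t / season) + rng.normal(0, 1.5, t.size)  # nGy/h

def ema_forecast(y, alpha=0.3):
    """One-step-ahead exponential moving average forecasts."""
    f = np.empty_like(y)
    f[0] = y[0]
    for i in range(1, len(y)):
        f[i] = alpha * y[i - 1] + (1 - alpha) * f[i - 1]
    return f

def seasonal_es_forecast(y, m, alpha=0.3, gamma=0.1):
    """One-step-ahead additive seasonal exponential smoothing (no trend)."""
    level = y[:m].mean()
    seas = y[:m] - level          # initial seasonal indices
    f = np.empty_like(y)
    for i in range(len(y)):
        f[i] = level + seas[i % m]                                  # forecast
        level = alpha * (y[i] - seas[i % m]) + (1 - alpha) * level  # update level
        seas[i % m] = gamma * (y[i] - level) + (1 - gamma) * seas[i % m]
    return f

for name, f in [("EMA", ema_forecast(dose_rate)),
                ("Seasonal ES", seasonal_es_forecast(dose_rate, season))]:
    rmse = np.sqrt(np.mean((dose_rate[season:] - f[season:]) ** 2))
    print(f"{name}: one-step RMSE = {rmse:.2f} nGy/h")
```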

  3. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    Science.gov (United States)

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents: food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework, the Key Events Dose-Response Framework (KEDRF), for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  4. RISC-RAD. A European integrated approach to the problem of low doses

    International Nuclear Information System (INIS)

    Meunier, A.; Sabatier, L.; Atkinson, M.; Paretzke, H.; Bouffler, S.; Mullenders, L.

    2007-01-01

    Complete text of publication follows. Funded by the European Commission in the framework of a dedicated programme supporting research in the nuclear sector (FP6 Euratom), the RISC-RAD project undertakes experimental and modelling studies ultimately to improve low dose radiation cancer risk assessment by exploring and providing evidence for the most appropriate radiation cancer risk projection and interpolation models. It started on 1st January 2004 and is running until 31st October 2008. It mobilizes a consortium of 31 partners and is coordinated by Dr. Laure Sabatier from the French atomic energy commission. Indeed the project represents an unprecedented attempt to integrate horizontally the research on the effects of low doses of IR at the European level. A multipartner project supporting objective-driven research, RISC-RAD aims to contribute to bridging the remaining gap in scientific knowledge about the effects of low doses of ionizing radiation. It spans a large part of the research spectrum, including many topics addressed during the LOWRAD2007 conference. This presentation intends to give an account of the integrative aspects of the project, insights into the innovative solutions found to approach a complex and controversial scientific topic like the biological effects of low doses of ionizing radiation, and links with some areas of social studies on science. The concept of 'integration' implies the development of a new kind of activity in the research field, which crosses its traditional boundaries: controversies of several kinds must temporarily be overcome within the project management board in order to define and follow a common strategy. Among them, how to reconcile the creative part of fundamental research with compliance with strict project planning rules has come up as a debate which questions the best way a significant collective and coordinated action can address the issue of low dose cancer risk assessment in the long term. The knowledge and

  5. Comparison of linear and nonlinear programming approaches for "worst case dose" and "minmax" robust optimization of intensity-modulated proton therapy dose distributions.

    Science.gov (United States)

    Zaghian, Maryam; Cao, Wenhua; Liu, Wei; Kardar, Laleh; Randeniya, Sharmalee; Mohan, Radhe; Lim, Gino

    2017-03-01

    Robust optimization of intensity-modulated proton therapy (IMPT) takes uncertainties into account during spot weight optimization and leads to dose distributions that are resilient to uncertainties. Previous studies demonstrated benefits of linear programming (LP) for IMPT in terms of delivery efficiency by considerably reducing the number of spots required for the same quality of plans. However, a reduction in the number of spots may lead to loss of robustness. The purpose of this study was to evaluate and compare the performance in terms of plan quality and robustness of two robust optimization approaches using LP and nonlinear programming (NLP) models. The so-called "worst case dose" and "minmax" robust optimization approaches and the conventional planning target volume (PTV)-based optimization approach were applied to designing IMPT plans for five patients: two with prostate cancer, one with skull-base cancer, and two with head and neck cancer. For each approach, both LP and NLP models were used. Thus, for each case, six sets of IMPT plans were generated and assessed: LP-PTV-based, NLP-PTV-based, LP-worst case dose, NLP-worst case dose, LP-minmax, and NLP-minmax. The four robust optimization methods behaved differently from patient to patient, and no method emerged as superior to the others in terms of nominal plan quality and robustness against uncertainties. The plans generated using LP-based robust optimization were more robust regarding patient setup and range uncertainties than were those generated using NLP-based robust optimization for the prostate cancer patients. However, the robustness of plans generated using NLP-based methods was superior for the skull-base and head and neck cancer patients. Overall, LP-based methods were suitable for the less challenging cancer cases in which all uncertainty scenarios were able to satisfy tight dose constraints, while NLP performed better in more difficult cases in which most uncertainty scenarios were hard to meet
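
    The "minmax" idea can be written compactly as a linear program: choose nonnegative spot weights that minimize the worst-case absolute deviation from the prescribed dose over a set of uncertainty scenarios. The sketch below is a toy illustration, not the planning-system formulation used in the study; the dose-influence matrices, scenario perturbations and prescription are all simulated assumptions.

```python
# Minimal sketch (illustrative only) of the "minmax" robust idea as an LP:
# choose nonnegative spot weights w to minimize the worst-case absolute
# deviation from the prescribed target dose over a few uncertainty scenarios.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
n_vox, n_spots, n_scen = 40, 12, 5
d_target = np.full(n_vox, 60.0)          # Gy, hypothetical target prescription
# Hypothetical dose-influence matrices D_s (nominal plus perturbed scenarios).
D_nom = rng.uniform(0.5, 2.0, size=(n_vox, n_spots))
scenarios = [D_nom * (1 + rng.normal(0, 0.05, size=D_nom.shape))
             for _ in range(n_scen)]

# Variables: [w_1 .. w_n_spots, t]; objective: minimize t.
c = np.concatenate([np.zeros(n_spots), [1.0]])
A_ub, b_ub = [], []
for D in scenarios:
    # D w - d_target <= t  and  d_target - D w <= t  (elementwise)
    A_ub.append(np.hstack([D, -np.ones((n_vox, 1))]));  b_ub.append(d_target)
    A_ub.append(np.hstack([-D, -np.ones((n_vox, 1))])); b_ub.append(-d_target)
A_ub = np.vstack(A_ub)
b_ub = np.concatenate(b_ub)
bounds = [(0, None)] * (n_spots + 1)     # spot weights and t are nonnegative

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
w, worst_dev = res.x[:n_spots], res.x[-1]
print(f"worst-case voxel deviation over all scenarios: {worst_dev:.2f} Gy")
```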

  6. Deviating From the Benchmarks

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela

    This paper studies three related questions: To what extent do otherwise similar startups employ different quantities and qualities of human capital at the moment of entry? How persistent are initial human capital choices over time? And how does deviating from human capital benchmarks influence firm......, founders' human capital, and the ownership structure of startups (solo entrepreneurs versus entrepreneurial teams). We then study the survival implications of exogenous deviations from these benchmarks, based on spline models for survival data. Our results indicate that (especially negative) deviations from...... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be a close to irreversible

  7. A first-principles approach to total-dose hardness assurance

    International Nuclear Information System (INIS)

    Fleetwood, D.M.

    1995-01-01

    A first-principles approach to radiation hardness assurance was described that provides the technical background to the present US and European total-dose radiation hardness assurance test methods for MOS technologies, TM 1019.4 and BS 22900. These test methods could not have been developed otherwise, as their existence depends not on a wealth of empirical comparisons of IC data from ground and space testing, but on a fundamental understanding of MOS defect growth and annealing processes. Rebound testing should become less of a problem for advanced MOS small-signal electronics technologies for systems with total dose requirements below 50-100 krad(SiO2) because of trends toward much thinner gate oxides. For older technologies with thicker gate oxides and for power devices, rebound testing is unavoidable without detailed characterization studies to assess the impact of interface traps on device response in space. The QML approach is promising for future hardened technologies. A sufficient understanding of process effects on radiation hardness has been developed that should make it possible to reduce testing costs for hardened parts in the future. Finally, it is hoped that the above discussions have demonstrated that the foundation for cost-effective hardness assurance tests is laid with studies of the basic mechanisms of radiation effects. Without a diligent assessment of new radiation effects mechanisms in future technologies, one cannot be assured that the present generation of radiation test standards will continue to apply

  8. HPCG Benchmark Technical Specification

    Energy Technology Data Exchange (ETDEWEB)

    Heroux, Michael Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States); Luszczek, Piotr [Univ. of Tennessee, Knoxville, TN (United States)

    2013-10-01

    The High Performance Conjugate Gradient (HPCG) benchmark [cite SNL, UTK reports] is a tool for ranking computer systems based on a simple additive Schwarz, symmetric Gauss-Seidel preconditioned conjugate gradient solver. HPCG is similar to the High Performance Linpack (HPL), or Top 500, benchmark [1] in its purpose, but HPCG is intended to better represent how today’s applications perform. In this paper we describe the technical details of HPCG: how it is designed and implemented, what code transformations are permitted and how to interpret and report results.
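
    The numerical kernel that HPCG exercises, a conjugate gradient iteration preconditioned by a symmetric Gauss-Seidel sweep, can be sketched in a few lines. The code below is only a dense, single-process illustration of that kernel on a 1-D Laplacian stand-in; it is not the benchmark's reference implementation, which operates on a sparse 27-point stencil with halo exchanges and reports a GFLOP/s rating.

```python
# Minimal sketch of the numerical kernel HPCG exercises (not the reference
# implementation): conjugate gradient preconditioned by one symmetric
# Gauss-Seidel sweep, applied to a small 1-D Laplacian test matrix.
import numpy as np

def sym_gauss_seidel(A, r):
    """Apply M^{-1} r for M = (D+L) D^{-1} (D+U): forward solve, scale, back solve."""
    D = np.diag(np.diag(A))
    L = np.tril(A, -1)
    U = np.triu(A, 1)
    y = np.linalg.solve(D + L, r)        # forward sweep
    return np.linalg.solve(D + U, D @ y) # backward sweep

def pcg(A, b, tol=1e-10, maxit=500):
    x = np.zeros_like(b)
    r = b - A @ x
    z = sym_gauss_seidel(A, r)
    p = z.copy()
    rz = r @ z
    for it in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, it + 1
        z = sym_gauss_seidel(A, r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit

# 1-D Laplacian as a stand-in for HPCG's 3-D 27-point stencil.
n = 200
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x, iters = pcg(A, b)
print(f"converged in {iters} iterations, residual {np.linalg.norm(b - A @ x):.2e}")
```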

  9. Benchmarking for Best Practice

    CERN Document Server

    Zairi, Mohamed

    1998-01-01

    Benchmarking for Best Practice uses up-to-the-minute case-studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. It is also an ideal textbook on the applications of TQM since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area l

  10. Benchmarking Danish Industries

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette

    2003-01-01

    compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless...... not public. The survey is a cooperative project "Benchmarking Danish Industries" with CIP/Aalborg University, the Danish Technological University, the Danish Technological Institute and Copenhagen Business School as consortia partners. The project has been funded by the Danish Agency for Trade and Industry...

  11. [Do you mean benchmarking?].

    Science.gov (United States)

    Bonnet, F; Solignac, S; Marty, J

    2008-03-01

    The purpose of benchmarking is to establish improvement processes by comparing activities to quality standards. The proposed methodology is illustrated by benchmarking business cases performed inside healthcare facilities on items such as nosocomial diseases or the organization of surgery facilities. Moreover, the authors have built a specific graphic tool, enhanced with balance score numbers and mappings, so that the comparison between different anesthesia and intensive care services that are willing to start an improvement program is easy and relevant. This ready-made application is even more accurate insofar as detailed tariffs of activities are implemented.

  12. RB reactor benchmark cores

    International Nuclear Information System (INIS)

    Pesic, M.

    1998-01-01

    A selected set of the RB reactor benchmark cores is presented in this paper. The first results of validation of the well-known Monte Carlo MCNP code and the adjoining neutron cross section libraries are given. They confirm the idea behind the proposal of the new U-D2O criticality benchmark system and support the intention to include this system in the next edition of the recent OECD/NEA project, the International Handbook of Evaluated Criticality Safety Experiments, in the near future. (author)

  13. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems, Using a Bottom-Up Approach and Installer Survey - Second Edition

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, B.; Ardani, K.; Feldman, D.; Citron, R.; Margolis, R.; Zuboy, J.

    2013-10-01

    This report presents results from the second U.S. Department of Energy (DOE) sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs -- often referred to as 'business process' or 'soft' costs -- for U.S. residential and commercial photovoltaic (PV) systems. In service to DOE's SunShot Initiative, annual expenditure and labor-hour-productivity data are analyzed to benchmark 2012 soft costs related to (1) customer acquisition and system design and (2) permitting, inspection, and interconnection (PII). We also include an in-depth analysis of costs related to financing, overhead, and profit. Soft costs are both a major challenge and a major opportunity for reducing PV system prices and stimulating SunShot-level PV deployment in the United States. The data and analysis in this series of benchmarking reports are a step toward the more detailed understanding of PV soft costs required to track and accelerate these price reductions.

  14. Benchmarking and Performance Management

    Directory of Open Access Journals (Sweden)

    Adrian TANTAU

    2010-12-01

    Full Text Available The relevance of the chosen topic is explained by the meaning of the firm efficiency concept: firm efficiency means the revealed performance (how well the firm performs in the actual market environment) given the basic characteristics of the firms and their markets that are expected to drive their profitability (firm size, market power, etc.). This complex and relative performance could be due to such things as product innovation, management quality and work organization; other factors can also play a role even if they are not directly observed by the researcher. The critical need for management individuals and groups to continuously improve their firm/company’s efficiency and effectiveness, and the need for managers to know the success factors and competitiveness determinants, consequently determine what performance measures are most critical in assessing their firm’s overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking firm level performance are critical interdependent activities. Firm level variables, used to infer performance, are often interdependent due to operational reasons. Hence, the managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm level performance using financial ratios and other types of profitability measures. It uses econometric models to describe performance and then proposes a method to forecast and benchmark performance.

  15. Surveys and Benchmarks

    Science.gov (United States)

    Bers, Trudy

    2012-01-01

    Surveys and benchmarks continue to grow in importance for community colleges in response to several factors. One is the press for accountability, that is, for colleges to report the outcomes of their programs and services to demonstrate their quality and prudent use of resources, primarily to external constituents and governing boards at the state…

  16. Assessing doses to terrestrial wildlife at a radioactive waste disposal site: inter-comparison of modelling approaches.

    Science.gov (United States)

    Johansen, M P; Barnett, C L; Beresford, N A; Brown, J E; Černe, M; Howard, B J; Kamboj, S; Keum, D-K; Smodiš, B; Twining, J R; Vandenhove, H; Vives i Batlle, J; Wood, M D; Yu, C

    2012-06-15

    Radiological doses to terrestrial wildlife were examined in this model inter-comparison study that emphasised factors causing variability in dose estimation. The study participants used varying modelling approaches and information sources to estimate dose rates and tissue concentrations for a range of biota types exposed to soil contamination at a shallow radionuclide waste burial site in Australia. Results indicated that the dominant factor causing variation in dose rate estimates (up to three orders of magnitude on mean total dose rates) was the soil-to-organism transfer of radionuclides that included variation in transfer parameter values as well as transfer calculation methods. Additional variation was associated with other modelling factors including: how participants conceptualised and modelled the exposure configurations (two orders of magnitude); which progeny to include with the parent radionuclide (typically less than one order of magnitude); and dose calculation parameters, including radiation weighting factors and dose conversion coefficients (typically less than one order of magnitude). Probabilistic approaches to model parameterisation were used to encompass and describe variable model parameters and outcomes. The study confirms the need for continued evaluation of the underlying mechanisms governing soil-to-organism transfer of radionuclides to improve estimation of dose rates to terrestrial wildlife. The exposure pathways and configurations available in most current codes are limited when considering instances where organisms access subsurface contamination through rooting, burrowing, or using different localised waste areas as part of their habitual routines. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.
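
    The dominant source of variation identified above, soil-to-organism transfer, can be illustrated with a small probabilistic sketch. The code below is not any participant's assessment code: it propagates a lognormally distributed concentration ratio through the simple chain soil activity x concentration ratio x dose conversion coefficient to show how transfer variability alone spreads the internal dose rate over roughly two orders of magnitude. All parameter values are hypothetical.

```python
# Minimal sketch (not a participant's code) of the probabilistic dose chain
# the study compares: internal dose rate = soil activity * soil-to-organism
# concentration ratio (CR) * dose conversion coefficient (DCC), with a
# lognormal CR to show how transfer dominates the spread of the estimate.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

soil_activity = 5_000.0      # Bq/kg dry soil, hypothetical contamination level
dcc_internal = 1.4e-3        # uGy/h per Bq/kg fresh mass, hypothetical DCC
# Hypothetical lognormal CR (fresh-mass organism / dry-mass soil):
# geometric mean 0.1 with a geometric standard deviation of 5.
cr = rng.lognormal(mean=np.log(0.1), sigma=np.log(5.0), size=n)

organism_activity = soil_activity * cr            # Bq/kg in the organism
dose_rate = organism_activity * dcc_internal      # uGy/h internal dose rate

lo, med, hi = np.percentile(dose_rate, [5, 50, 95])
print(f"internal dose rate: median {med:.2f} uGy/h, "
      f"5th-95th percentile {lo:.2f}-{hi:.2f} uGy/h "
      f"(span of about {hi / lo:.0f}x from CR variability alone)")
```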

  17. Assessing doses to terrestrial wildlife at a radioactive waste disposal site: Inter-comparison of modelling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Johansen, M.P., E-mail: mathew.johansen@ansto.gov.au [Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC, NSW, 2232 (Australia); Barnett, C.L., E-mail: clb@ceh.ac.uk [Centre for Ecology and Hydrology, Lancaster (United Kingdom); Beresford, N.A., E-mail: nab@ceh.ac.uk [Centre for Ecology and Hydrology, Lancaster (United Kingdom); Brown, J.E., E-mail: justin.brown@nrpa.no [Norwegian Radiation Protection Authority, Oesteraas (Norway); Cerne, M., E-mail: marko.cerne@ijs.si [Jozef Stefan Institute, Ljubljana (Slovenia); Howard, B.J., E-mail: bjho@ceh.ac.uk [Centre for Ecology and Hydrology, Lancaster (United Kingdom); Kamboj, S., E-mail: skamboj@anl.gov [Argonne National Laboratory, IL (United States); Keum, D.-K., E-mail: dkkeum@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Smodis, B. [Jozef Stefan Institute, Ljubljana (Slovenia); Twining, J.R., E-mail: jrt@ansto.gov.au [Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC, NSW, 2232 (Australia); Vandenhove, H., E-mail: hvandenh@sckcen.be [Belgian Nuclear Research Centre, Mol (Belgium); Vives i Batlle, J., E-mail: jvbatll@sckcen.be [Belgian Nuclear Research Centre, Mol (Belgium); Wood, M.D., E-mail: m.d.wood@salford.ac.uk [University of Salford, Manchester (United Kingdom); Yu, C., E-mail: cyu@anl.gov [Argonne National Laboratory, IL (United States)

    2012-06-15

    Radiological doses to terrestrial wildlife were examined in this model inter-comparison study that emphasised factors causing variability in dose estimation. The study participants used varying modelling approaches and information sources to estimate dose rates and tissue concentrations for a range of biota types exposed to soil contamination at a shallow radionuclide waste burial site in Australia. Results indicated that the dominant factor causing variation in dose rate estimates (up to three orders of magnitude on mean total dose rates) was the soil-to-organism transfer of radionuclides that included variation in transfer parameter values as well as transfer calculation methods. Additional variation was associated with other modelling factors including: how participants conceptualised and modelled the exposure configurations (two orders of magnitude); which progeny to include with the parent radionuclide (typically less than one order of magnitude); and dose calculation parameters, including radiation weighting factors and dose conversion coefficients (typically less than one order of magnitude). Probabilistic approaches to model parameterisation were used to encompass and describe variable model parameters and outcomes. The study confirms the need for continued evaluation of the underlying mechanisms governing soil-to-organism transfer of radionuclides to improve estimation of dose rates to terrestrial wildlife. The exposure pathways and configurations available in most current codes are limited when considering instances where organisms access subsurface contamination through rooting, burrowing, or using different localised waste areas as part of their habitual routines. - Highlights: • Assessment of modelled dose rates to terrestrial biota from radionuclides. • The substantial variation among current approaches is quantifiable. • The dominant variable was soil

  18. Low dose radiation effects: an integrative european approach (Risc-Rad Project) coordinated by the Cea

    International Nuclear Information System (INIS)

    Sabatier, L.

    2006-01-01

    RISC-RAD (Radiosensitivity of Individuals and Susceptibility to Cancer induced by ionizing Radiations) is an Integrated Project funded by the European Commission under the 6th Framework Programme / EURATOM. RISC-RAD started on 1st January 2004 for a duration of four years. Coordinated by CEA (Dr Laure Sabatier), it involves 11 European countries (Austria, Denmark, Finland, France, Germany, Ireland, Italy, the Netherlands, Spain, Sweden and the United Kingdom) and 29 research institutions. Objectives: Exposures to low and protracted doses of ionizing radiation are very frequent in the normal living environment, at work places, in industry and in medicine. The effects of these exposures on human health cannot be reliably assessed by epidemiological methods, nor are they thoroughly understood by biologists. The RISC-RAD project proposes to help bridge this gap in scientific knowledge about these effects. To achieve this goal, a necessary key step is to understand the basic mechanisms by which radiation induces cancer. Studying this multistage process in an integrated way, the project offers a new biological approach characterised by a clear-cut and objective-driven scientific policy: the project is focused on the effects of low doses (less than 100 mSv) and protracted doses of radiation. It aims at identifying new parameters that take into account the differences in radiation responses between individuals. A group of modelers works closely with the experimental teams in order to better quantify the risks associated with low and protracted doses. Research work is divided into five work packages interacting closely with each other. WP1 is dedicated to DNA damage. Ionizing radiation (IR) produces a broad spectrum of base modifications and DNA strand breaks of different kinds, among which double-strand breaks and 'clustered damage' are thought to be a major feature in the biological effectiveness of IR. The aim of Work Package 1 is to improve understanding of the initial DNA damage induced by

  19. Benchmarking i den offentlige sektor

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels

    2008-01-01

    In this article, we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then explain in more detail what benchmarking is, taking four different applications of benchmarking as a starting point. The regulation of utility companies will be addressed, after which...

  20. Eye lens dosimetry for fluoroscopically guided clinical procedures: practical approaches to protection and dose monitoring

    International Nuclear Information System (INIS)

    Martin, Colin J.

    2016-01-01

    Doses to the eye lenses of clinicians undertaking fluoroscopically guided procedures can exceed the annual dose limit of 20 mSv, so optimisation of radiation protection is essential. Ceiling-suspended shields and disposable radiation absorbing pads can reduce eye dose by factors of 2-7. Lead glasses that shield against exposures from the side can lower doses by 2.5-4.5 times. Training in effective use of protective devices is an essential element in achieving good protection and acceptable eye doses. Effective methods for dose monitoring are required to identify protection issues. Dosemeters worn adjacent to the eye provide the better option for interventional clinicians, but an unprotected dosemeter worn at the neck will give an indication of eye dose that is adequate for most interventional staff. Potential requirements for protective devices and dose monitoring can be determined from risk assessments using generic values for dose linked to examination workload. (author)

  1. Cloud benchmarking for performance

    OpenAIRE

    Varghese, Blesson; Akgun, Ozgur; Miguel, Ian; Thai, Long; Barker, Adam

    2014-01-01

    Date of Acceptance: 20/09/2014 How can applications be deployed on the cloud to achieve maximum performance? This question has become significant and challenging with the availability of a wide variety of Virtual Machines (VMs) with different performance capabilities in the cloud. The above question is addressed by proposing a six step benchmarking methodology in which a user provides a set of four weights that indicate how important each of the following groups: memory, processor, computa...

  2. Continual reassessment method for dose escalation clinical trials in oncology: a comparison of prior skeleton approaches using AZD3514 data.

    Science.gov (United States)

    James, Gareth D; Symeonides, Stefan N; Marshall, Jayne; Young, Julia; Clack, Glen

    2016-08-31

    The continual reassessment method (CRM) requires an underlying model of the dose-toxicity relationship ("prior skeleton"), and there is limited guidance on what this should be when little is known about this association. In this manuscript, the impact of applying the CRM with different prior skeleton approaches and the 3 + 3 method is compared in terms of the ability to determine the true maximum tolerated dose (MTD) and the number of patients allocated to sub-optimal and toxic doses. Post-hoc dose-escalation analyses were performed on real-life clinical trial data on an early oncology compound (AZD3514), using the 3 + 3 method and the CRM with six different prior skeleton approaches. All methods correctly identified the true MTD. The 3 + 3 method allocated six patients to both sub-optimal and toxic doses. All CRM approaches allocated four patients to sub-optimal doses. No patients were allocated to toxic doses by the sigmoidal approach, two by the conservative approach and five by the other approaches. Prior skeletons for the CRM for phase 1 clinical trials are proposed in this manuscript and applied to a real clinical trial dataset. Highly accurate initial skeleton estimates may not be essential to determine the true MTD, and, as expected, all CRM methods out-performed the 3 + 3 method. There were differences in performance between skeletons. The choice of skeleton should depend on whether minimizing the number of patients allocated to suboptimal or toxic doses is more important. NCT01162395, trial date of first registration: July 13, 2010.
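
    The mechanics of updating a prior skeleton can be shown with a minimal one-parameter power-model CRM. The sketch below is illustrative only: the skeleton, the normal prior on the model parameter and the toxicity outcomes are hypothetical and do not come from the AZD3514 trial. After each cohort, the posterior toxicity probability at each dose is recomputed and the next cohort is assigned to the dose closest to the target rate.

```python
# Minimal sketch of a one-parameter power-model CRM (illustrative; skeleton
# values, prior and toxicity data are hypothetical, not the AZD3514 trial).
import numpy as np

skeleton = np.array([0.05, 0.10, 0.20, 0.30, 0.45])  # prior guess of DLT prob per dose
target = 0.25                                         # target toxicity probability

# Model: p_i(a) = skeleton_i ** exp(a), with a ~ N(0, 1.34^2), a common default prior.
a_grid = np.linspace(-4, 4, 801)
prior = np.exp(-0.5 * (a_grid / 1.34) ** 2)

def posterior_tox(doses_given, dlt, skeleton, a_grid, prior):
    """Posterior mean toxicity probability at each dose level (grid integration)."""
    like = np.ones_like(a_grid)
    for d, y in zip(doses_given, dlt):
        p = skeleton[d] ** np.exp(a_grid)
        like *= p ** y * (1 - p) ** (1 - y)
    post = like * prior
    post /= post.sum()
    p_dose = skeleton[None, :] ** np.exp(a_grid)[:, None]   # shape (grid, dose)
    return (post[:, None] * p_dose).sum(axis=0)

# Hypothetical accrual: three cohorts at dose levels 0, 1, 2 with observed DLTs.
doses_given = [0, 0, 0, 1, 1, 1, 2, 2, 2]
dlt         = [0, 0, 0, 0, 0, 1, 1, 0, 1]

p_hat = posterior_tox(doses_given, dlt, skeleton, a_grid, prior)
next_dose = int(np.argmin(np.abs(p_hat - target)))
print("posterior DLT estimates:", np.round(p_hat, 3))
print("recommended next dose level:", next_dose)
```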

  3. Benchmarking reference services: an introduction.

    Science.gov (United States)

    Marshall, J G; Buchanan, H S

    1995-01-01

    Benchmarking is based on the common sense idea that someone else, either inside or outside of libraries, has found a better way of doing certain things and that your own library's performance can be improved by finding out how others do things and adopting the best practices you find. Benchmarking is one of the tools used for achieving continuous improvement in Total Quality Management (TQM) programs. Although benchmarking can be done on an informal basis, TQM puts considerable emphasis on formal data collection and performance measurement. Used to its full potential, benchmarking can provide a common measuring stick to evaluate process performance. This article introduces the general concept of benchmarking, linking it whenever possible to reference services in health sciences libraries. Data collection instruments that have potential application in benchmarking studies are discussed and the need to develop common measurement tools to facilitate benchmarking is emphasized.

  4. Benchmark validation of statistical models: Application to mediation analysis of imagery and memory.

    Science.gov (United States)

    MacKinnon, David P; Valente, Matthew J; Wurpts, Ingrid C

    2018-03-29

    This article describes benchmark validation, an approach to validating a statistical model. According to benchmark validation, a valid model generates estimates and research conclusions consistent with a known substantive effect. Three types of benchmark validation, (a) benchmark value, (b) benchmark estimate, and (c) benchmark effect, are described and illustrated with examples. Benchmark validation methods are especially useful for statistical models with assumptions that are untestable or very difficult to test. Benchmark effect validation methods were applied to evaluate statistical mediation analysis in eight studies using the established effect that increasing mental imagery improves recall of words. Statistical mediation analysis led to conclusions about mediation that were consistent with established theory that increased imagery leads to increased word recall. Benchmark validation based on established substantive theory is discussed as a general way to investigate characteristics of statistical models and a complement to mathematical proof and statistical simulation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
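
    The statistical model being validated, a single-mediator model whose indirect effect is the product of coefficients a*b, can be sketched briefly. The code below uses simulated data rather than the eight imagery/recall studies, and implements the product-of-coefficients estimate with a first-order (Sobel) standard error; variable names and effect sizes are assumptions.

```python
# Minimal sketch of the single-mediator model underlying the benchmark-effect
# check (simulated data, not the eight imagery/recall studies): X -> M -> Y,
# with the mediated effect estimated as the product of coefficients a*b.
import numpy as np

rng = np.random.default_rng(3)
n = 300
x = rng.integers(0, 2, n).astype(float)      # imagery instruction (0/1)
m = 0.6 * x + rng.normal(0, 1, n)            # reported imagery use (mediator)
y = 0.5 * m + 0.1 * x + rng.normal(0, 1, n)  # words recalled (standardized)

def ols(y, X):
    """OLS coefficients and standard errors with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

(_, a), (_, se_a) = ols(m, x)                                    # a path: X -> M
(_, b, c_prime), (_, se_b, _) = ols(y, np.column_stack([m, x]))  # b and direct paths

ab = a * b
se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)    # Sobel first-order SE
print(f"a = {a:.3f}, b = {b:.3f}, mediated effect a*b = {ab:.3f} "
      f"(95% CI {ab - 1.96 * se_ab:.3f} to {ab + 1.96 * se_ab:.3f}), "
      f"direct c' = {c_prime:.3f}")
```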

  5. MO-C-17A-10: Comparison of Dose Deformable Accumulation by Using Parallel and Serial Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Z; Li, M; Wong, J [Morristown Medical Center, Morristown, NJ (United States)

    2014-06-15

    Purpose: The uncertainty of dose accumulation over multiple CT datasets with deformable fusion may have a significant impact on clinical decisions. In this study, we investigate the difference between two dose summation approaches involving deformable fusion. Methods: Five patients, four external beam and one brachytherapy (BT), were chosen for the study. The BT patient was treated with CT-based HDR. The CT image sets acquired in the image-guidance process (8-11 CTs/patient) were used to determine the dose delivered to the four external beam patients (prostate, pelvis, lung, and head and neck). For the HDR patient (cervix), five CT image sets and the corresponding BT plans were used. In total, 44 CT datasets and RT dose/plans were imported into the image fusion software MiM (6.0.4) for analysis. For each of the five clinical cases, the dose from each fraction was accumulated onto the primary CT dataset using both the parallel and serial approaches. The dose-volume histograms (DVH) for the CTV and selected organs-at-risk (OAR) were generated. The D95(CTV), OAR(mean) and OAR(max) for the four external beam cases, and the D90(CTV) and the max dose to bladder and rectum for the BT case, were compared. Results: For the four external beam patients, the differences in D95(CTV) were <1.2% PD between the parallel and the serial approaches. The differences in OAR(mean) and OAR(max) ranged from 0 to 3.7% and <1% PD, respectively. For the HDR patient, the dose difference for D90 was 11% PD, while those for the max dose to bladder and rectum were 11.5% and 23.3%, respectively. Conclusion: For external beam treatments, the parallel and serial approaches differ by <5%, probably because the tumor volume and OARs change little from fraction to fraction. For the brachytherapy case, a >10% dose difference between the two approaches was observed, as significant volume changes of the tumor and OARs occurred among treatment fractions.
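
    The difference between the two accumulation orders can be illustrated with a 1-D toy model. The sketch below is conceptual and is not MiM's implementation: in the "parallel" order each fraction dose is pulled back to the primary anatomy through its own direct deformation, while in the "serial" order the fraction-to-fraction deformations are composed first. With the smooth, mutually consistent deformations assumed here the two sums agree up to the resampling error that accumulates along the serial chain; all deformations and doses are hypothetical.

```python
# Conceptual 1-D sketch (not MiM's implementation) contrasting the two
# accumulation orders: "parallel" warps each fraction dose straight to the
# primary anatomy, "serial" chains fraction-to-fraction deformations first.
import numpy as np

grid = np.linspace(0.0, 10.0, 201)            # primary (reference) coordinates, cm
n_frac = 5
rng = np.random.default_rng(4)

# Hypothetical per-fraction dose defined on each fraction's own anatomy (2 Gy peak).
def fraction_dose(x):
    return 2.0 * np.exp(-0.5 * ((x - 5.0) / 1.0) ** 2)

# Hypothetical small deformations: primary -> fraction k (parallel approach)
# and fraction k-1 -> fraction k (serial approach), here simple drifts.
shifts = rng.normal(0.0, 0.15, n_frac)                               # cm per fraction
phi_direct = [grid + shifts[:k + 1].sum() for k in range(n_frac)]    # primary -> frac k
psi_pair   = [grid + shifts[k] for k in range(n_frac)]               # frac k-1 -> frac k

# Parallel accumulation: pull each fraction dose back through its direct map.
dose_parallel = sum(np.interp(phi, grid, fraction_dose(grid)) for phi in phi_direct)

# Serial accumulation: compose the pairwise maps, then pull each dose back.
dose_serial = np.zeros_like(grid)
coords = grid.copy()
for k in range(n_frac):
    coords = np.interp(coords, grid, psi_pair[k])     # compose primary -> frac k
    dose_serial += np.interp(coords, grid, fraction_dose(grid))

# Residual difference reflects resampling error accumulating along the chain.
print(f"max |parallel - serial| = {np.abs(dose_parallel - dose_serial).max():.4f} Gy")
```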

  6. Evaluating secondary neutron doses of a refined shielded design for a medical cyclotron using the TLD approach

    International Nuclear Information System (INIS)

    Lin, Jye-Bin; Tseng, Hsien-Chun; Liu, Wen-Shan; Lin, Ding-Bang; Hsieh, Teng-San; Chen, Chien-Yi

    2013-01-01

An increasing number of cyclotrons have been installed at medical centers in Taiwan to generate radiopharmaceutical products. An operating cyclotron generates immense amounts of secondary neutrons from reactions such as the 18 O(p, n) 18 F reaction used in the production of FDG. This intense radiation can be hazardous to public health, particularly to medical personnel. To increase the yield of 18 F-FDG from 4200 GBq in 2005 to 48,600 GBq in 2011, Chung Shan Medical University Hospital (CSMUH) prolonged the irradiation time without changing the target or target current to meet the requirements for 18 F production. The CSMUH has redesigned the CTI Radioisotope Delivery System shield. The newly designed cyclotron rooms have increased the need for data on possible secondary neutron doses. This work aims to evaluate secondary neutron doses at a CTI cyclotron center using thermoluminescent dosimeters (TLD-600). Two-dimensional neutron dose maps indicated that neutron doses were high where neutrons leaked through the self-shielded blocks and through the L-shaped concrete shield in the vault room. These neutron doses varied markedly among locations close to the H 2 18 O target. The Monte Carlo simulation and minimum detectable dose are also discussed and demonstrate the reliability of the TLD-600 approach. The findings can be adopted by medical centers to identify radioactive hot spots and develop radiation protection measures. - Highlights: • Neutron doses were verified using the TLD approach. • Neutron doses have increased at cyclotron centers. • The revised L-shaped shield effectively suppresses the neutrons. • Neutron dose can be attenuated to 1.13×10 6 %
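
    As a side note, converting a TLD reading to dose is conceptually a background subtraction followed by multiplication with a calibration factor. The following minimal sketch is not from the paper; the reading, background and calibration constant are invented for illustration.

    ```python
    def tld_neutron_dose(reading_nC, background_nC, calibration_mSv_per_nC):
        """Convert a net TLD-600 light output (charge) to a neutron dose using
        a source-specific calibration factor; all numbers are illustrative."""
        net = max(reading_nC - background_nC, 0.0)
        return net * calibration_mSv_per_nC

    # Example: hypothetical reading of 12.4 nC, background 0.3 nC,
    # calibration 0.05 mSv per nC -> about 0.6 mSv
    print(tld_neutron_dose(12.4, 0.3, 0.05))
    ```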

  7. The metabolomic approach identifies a biological signature of low-dose chronic exposure to Cesium 137

    International Nuclear Information System (INIS)

    Grison, S.; Grandcolas, L.; Martin, J.C.

    2012-01-01

Reports have described apparent biological effects of 137 Cs (the most persistent dispersed radionuclide) irradiation in people living in Chernobyl-contaminated territory. The sensitive analytical technology described here should now help assess the relation of this contamination to the observed effects. A rat model chronically exposed to 137 Cs through drinking water was developed to identify biomarkers of radiation-induced metabolic disorders, and the biological impact was evaluated by a metabolomic approach that allowed us to detect several hundred metabolites in biofluids and assess their association with disease states. After collection of plasma and urine from contaminated and non-contaminated rats at the end of the 9-month contamination period, analysis with a liquid chromatography coupled to mass spectrometry (LC-MS) system detected 742 features in urine and 1309 in plasma. Biostatistical discriminant analysis extracted a subset of 26 metabolite signals (2 urinary, 4 plasma non-polar, and 19 plasma polar metabolites) that in combination were able to predict from 68% up to 94% of the contaminated rats, depending on the prediction method used, with a misclassification rate as low as 5.3%. The difference in this metabolic score between the contaminated and non-contaminated rats was highly significant (P=0.019 after ANOVA cross-validation). In conclusion, our proof-of-principle study demonstrated for the first time the usefulness of a metabolomic approach for addressing biological effects of chronic low-dose contamination. We can conclude that a metabolomic signature discriminated 137 Cs-contaminated from control animals in our model. Further validation is nevertheless required together with full annotation of the metabolic indicators. (author)

  8. Benchmarking HIV health care

    DEFF Research Database (Denmark)

    Podlekareva, Daria; Reekie, Joanne; Mocroft, Amanda

    2012-01-01

    ABSTRACT: BACKGROUND: State-of-the-art care involving the utilisation of multiple health care interventions is the basis for an optimal long-term clinical prognosis for HIV-patients. We evaluated health care for HIV-patients based on four key indicators. METHODS: Four indicators of health care we...... document pronounced regional differences in adherence to guidelines and can help to identify gaps and direct target interventions. It may serve as a tool for assessment and benchmarking the clinical management of HIV-patients in any setting worldwide....

  9. Benchmarking Cloud Storage Systems

    OpenAIRE

    Wang, Xing

    2014-01-01

    With the rise of cloud computing, many cloud storage systems like Dropbox, Google Drive and Mega have been built to provide decentralized and reliable file storage. It is thus of prime importance to know their features, performance, and the best way to make use of them. In this context, we introduce BenchCloud, a tool designed as part of this thesis to conveniently and efficiently benchmark any cloud storage system. First, we provide a study of six commonly-used cloud storage systems to ident...

  10. The COST Benchmark

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Tiesyte, Dalia; Tradisauskas, Nerius

    2006-01-01

    An infrastructure is emerging that enables the positioning of populations of on-line, mobile service users. In step with this, research in the management of moving objects has attracted substantial attention. In particular, quite a few proposals now exist for the indexing of moving objects...... takes into account that the available positions of the moving objects are inaccurate, an aspect largely ignored in previous indexing research. The concepts of data and query enlargement are introduced for addressing inaccuracy. As proof of concepts of the benchmark, the paper covers the application...

  11. A Benchmarking System for Domestic Water Use

    Directory of Open Access Journals (Sweden)

    Dexter V. L. Hunt

    2014-05-01

Full Text Available The national demand for water in the UK is predicted to increase, exacerbated by a growing UK population and home-grown demands for energy and food. When set against the context of overstretched existing supply sources vulnerable to droughts, particularly in increasingly dense city centres, the delicate balance of matching minimal demands with resource-secure supplies becomes critical. When making changes to "internal" demands, the role of technological efficiency and user behaviour cannot be ignored, yet existing benchmarking systems traditionally do not consider the latter. This paper investigates the practicalities of adopting a domestic benchmarking system (using a band rating) that allows individual users to assess their current water use performance against what is possible. The benchmarking system allows users to achieve higher benchmarks through any approach that reduces water consumption. The sensitivity of water use benchmarks is investigated by making changes to user behaviour and technology. The impacts of adopting localised supplies (i.e., rainwater harvesting (RWH) and grey water (GW)) and of including "external" gardening demands are investigated. This includes the impacts (in isolation and in combination) of the following: occupancy rates (1 to 4); roof size (12.5 m2 to 100 m2); garden size (25 m2 to 100 m2); and geographical location (North West, Midlands and South East, UK) with yearly temporal effects (i.e., rainfall and temperature). Lessons learnt from analysis of the proposed benchmarking system are presented throughout this paper, in particular its compatibility with the existing Code for Sustainable Homes (CSH) accreditation system. Conclusions are subsequently drawn for the robustness of the proposed system.
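
    A band-rating scheme of the kind described can be illustrated with a few lines of code. The sketch below is only a toy version: the band thresholds and the function name are invented and do not correspond to the paper's actual benchmark values.

    ```python
    def water_benchmark_band(litres_per_person_per_day, thresholds=(80, 100, 120, 150)):
        """Assign a hypothetical band rating (A best .. E worst) to per-capita
        consumption; the threshold values are illustrative, not the paper's."""
        bands = "ABCDE"
        for band, limit in zip(bands, thresholds):
            if litres_per_person_per_day <= limit:
                return band
        return bands[-1]

    # A household of 3 using 310 L/day -> ~103 L/person/day -> band "C"
    print(water_benchmark_band(310 / 3))
    ```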

  12. A new approach to the estimation of radiopharmaceutical radiation dose distributions

    International Nuclear Information System (INIS)

    Hetherington, E.L.R.; Wood, N.R.

    1975-03-01

For a photon energy of 150 keV, the Monte Carlo technique of photon history simulation was used to obtain estimates of the dose distribution in a human phantom for three activity distributions relevant to diagnostic nuclear medicine. In this preliminary work, the number of photon histories considered was insufficient to produce complete dose contours, and the dose distributions are presented in the form of colour-coded diagrams. The distributions obtained illustrate an important deficiency in the MIRD Schema for dose estimation. Although the Schema uses the same mathematical technique for calculating photon doses, the results are obtained as average values for the whole body and for complete organs. It is shown that the actual dose distributions, particularly those for the whole body, may differ significantly from the average value calculated using the MIRD Schema and published absorbed fractions. (author)
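
    The photon-history idea can be conveyed with a deliberately crude one-dimensional sketch: sample an exponential free path for each photon and tally where it interacts. This toy example ignores scattering and energy transport entirely and uses invented attenuation and geometry values; it is only meant to illustrate the Monte Carlo sampling step, not the phantom calculation described above.

    ```python
    import random

    def photon_histories_1d(n_photons, mu_cm, slab_cm, bins=10, seed=1):
        """Toy 1-D Monte Carlo: sample exponential interaction depths for
        monoenergetic photons entering a slab and tally the bin in which each
        photon interacts (full absorption at the first interaction)."""
        random.seed(seed)
        tally = [0] * bins
        width = slab_cm / bins
        for _ in range(n_photons):
            depth = random.expovariate(mu_cm)    # free path length in cm
            if depth < slab_cm:
                tally[int(depth // width)] += 1  # deposit in the bin of interaction
        return tally

    # Illustrative attenuation coefficient and slab thickness
    print(photon_histories_1d(100000, mu_cm=0.15, slab_cm=30.0))
    ```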

  13. Benchmarking HRA methods against different NPP simulator data

    International Nuclear Information System (INIS)

    Petkov, Gueorgui; Filipov, Kalin; Velev, Vladimir; Grigorov, Alexander; Popov, Dimiter; Lazarov, Lazar; Stoichev, Kosta

    2008-01-01

The paper presents both international and Bulgarian experience in assessing HRA methods and their underlying models, and approaches for their validation and verification by benchmarking HRA methods against different NPP simulator data. The organization, status, methodology and outlook of the studies are described

  14. 75 FR 66057 - Waybill Data Released in Three-Benchmark Rail Rate Proceedings

    Science.gov (United States)

    2010-10-27

    ... (CSX Transp. II), 584 F.3d 1076 (DC Cir. 2009), the Board modified its simplified rail rate guidelines...- Benchmark approach for smaller rail rate disputes. The Three-Benchmark method compares a challenged rate of...: The RSAM and R/VC >180 benchmarks. See Rate Guidelines--Non-Coal Proceedings, (Rate Guidelines) 1 S.T...

  15. Probabilistic approach to external cloud dose calculations using onsite meteorological data

    International Nuclear Information System (INIS)

    Strenge, D.L.; Watson, E.C.; Bander, T.J.; Kennedy, W.E.

    1976-01-01

    A method is described for calculation of external total body and skin doses from accidental atmospheric releases of radionuclides based on hourly onsite meteorological data. The method involves calculation of dose values from a finite size cloud for each hourly observation for a given radionuclide inventory. These values are then used to determine the probability of occurrence of dose levels for specified release times ranging from one hour to 30 days
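
    A minimal sketch of the probabilistic step is given below: given a hypothetical sequence of per-hour dose values (one per meteorological observation), form running sums over a chosen release duration and report the fraction of windows exceeding each dose level. The synthetic hourly doses and thresholds are illustrative only.

    ```python
    import numpy as np

    def exceedance_probability(hourly_doses, release_hours, dose_levels):
        """For a sequence of per-hour dose values, form all running sums over
        a window of `release_hours` and report the fraction of windows that
        exceed each dose level."""
        doses = np.asarray(hourly_doses, dtype=float)
        window = np.convolve(doses, np.ones(release_hours), mode="valid")
        return {level: float(np.mean(window > level)) for level in dose_levels}

    # Example: synthetic hourly doses for one year, a 24 h release, thresholds in mSv
    rng = np.random.default_rng(0)
    hourly = rng.lognormal(mean=-3.0, sigma=1.0, size=24 * 365)
    print(exceedance_probability(hourly, 24, [1.0, 2.0, 5.0]))
    ```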

  16. A simplified approach for exit dose in vivo measurements in radiotherapy and its clinical application

    International Nuclear Information System (INIS)

    Banjade, D.P.; Shukri, A.; Tajuddin, A.A.; Shrestha, S.L.; Bhat, M.

    2002-01-01

This is a study using LiF:Mg,Ti thermoluminescent dosimeter (TLD) rods in phantoms to investigate the effect of the lack of backscatter on exit dose. Comparing the measured dose with the anticipated dose calculated using the tissue maximum ratio (TMR) or percentage depth dose (PDD) gives rise to a correction factor. This correction factor may be applied to in-vivo dosimetry results to derive the true dose to a point within the patient. Measurements in a specially designed humanoid breast phantom, as well as in patients undergoing radiotherapy treatment, were also performed. TLDs with a reproducibility within ±3% (1 SD) were irradiated in a series of measurements for 6 and 10 MV photon beams from a medical linear accelerator. The measured exit doses for the different phantom thicknesses for the 6 MV beam were found to be lower by 10.9 to 14.0% than the dose derived from theoretical estimation (normalized dose at d max ). The same measurements for the 10 MV beam were lower by 9.0 to 13.5%. The variation of the measured exit dose with field size was found to be within 2.5%. The exit doses with backscatter material added from 2 mm up to 15 cm show a gradual increase, and the saturated values agree within 1.5% with the expected results for both beams. The measured exit doses in the humanoid breast phantom as well as in the clinical trial on patients undergoing radiotherapy also agreed with the predicted results based on the phantom measurements. The authors' viewpoint is that this technique provides sufficient information to design exit-surface bolus to restore the build-down effect in cases where part of the exit surface is being considered as a target volume. It indicates that the technique could be translated to in vivo dose measurements, which may be a valuable step of quality assurance in clinical practice. Copyright (2002) Australasian College of Physical Scientists and Engineers in Medicine
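
    The correction factor itself is a simple ratio of the measured exit dose to the dose predicted from the dose at d max and the TMR at the exit depth. The sketch below uses invented numbers chosen to be of the same order as the reductions reported above; it is not the authors' code.

    ```python
    def exit_dose_correction_factor(measured_exit_dose, dose_at_dmax, tmr_at_exit_depth):
        """Ratio of the TLD-measured exit dose to the dose predicted from the
        dose at d_max and the tissue-maximum ratio at the exit depth; values
        below 1 reflect the missing backscatter at the exit surface."""
        predicted = dose_at_dmax * tmr_at_exit_depth
        return measured_exit_dose / predicted

    # Hypothetical 6 MV example: predicted 0.62 Gy, measured 0.545 Gy -> ~0.88,
    # i.e. roughly a 12% reduction, of the same order as the 10.9-14.0% quoted above.
    print(exit_dose_correction_factor(0.545, 1.0, 0.62))
    ```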

  17. Core Benchmarks Descriptions

    International Nuclear Information System (INIS)

    Pavlovichev, A.M.

    2001-01-01

Current regulations for the design of new fuel cycles for nuclear power installations require a calculational justification performed with certified computer codes. This guarantees that the calculational results obtained will be within the limits of the declared uncertainties indicated in the certificate issued by Gosatomnadzor of the Russian Federation (GAN) for the corresponding computer code. A formal justification of the declared uncertainties is the comparison of calculational results obtained by a commercial code with the results of experiments, or of calculational tests computed with an uncertainty defined by certified precision codes of the MCU type or similar. The current level of international cooperation is enlarging the bank of experimental and calculational benchmarks acceptable for the certification of commercial codes used for the design of fuel loadings with MOX fuel. In particular, work is practically finished on forming the list of calculational benchmarks for the certification of the code TVS-M as applied to MOX fuel assembly calculations. The results of these activities are presented

  18. A benchmarking study

    Directory of Open Access Journals (Sweden)

    H. Groessing

    2015-02-01

Full Text Available A benchmark study for permeability measurement is presented. Past studies by other research groups, which focused on the reproducibility of 1D permeability measurements, showed high standard deviations of the obtained permeability values (25%), even though a defined test rig with required specifications was used. Within this study, the reproducibility of capacitive in-plane permeability measurement systems was benchmarked by comparing results from two research sites using this technology. The reproducibility was compared using a glass fibre woven textile and a carbon fibre non-crimp fabric (NCF). These two material types were considered because of the different electrical properties of glass and carbon with respect to the dielectric capacitive sensors of the permeability measurement systems. In order to determine the unsaturated permeability characteristics as a function of fibre volume content, the measurements were executed at three different fibre volume contents with five repetitions each. It was found that the stability and reproducibility of the presented in-plane permeability measurement system is very good in the case of the glass fibre woven textiles. This is true for the comparison of the repetition measurements as well as for the comparison between the two different permeameters. These positive results were confirmed by a comparison with permeability values of the same textile obtained with an older-generation permeameter applying the same measurement technology. It was also shown that a correct determination of the grammage and the material density is crucial for correct correlation of measured permeability values and fibre volume contents.

  19. Benchmarking Using Basic DBMS Operations

    Science.gov (United States)

    Crolotte, Alain; Ghazal, Ahmad

The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, the TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally, we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.

  20. Benchmarking & European Sustainable Transport Policies

    DEFF Research Database (Denmark)

    Gudmundsson, H.

    2003-01-01

Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts to...... contribution to the discussions within the EU-sponsored BEST Thematic Network (Benchmarking European Sustainable Transport), which ran from 2000 to 2003....

  1. Benchmarking in Czech Higher Education

    OpenAIRE

    Plaček Michal; Ochrana František; Půček Milan

    2015-01-01

    The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Base...

  2. Power reactor pressure vessel benchmarks

    International Nuclear Information System (INIS)

    Rahn, F.J.

    1978-01-01

    A review is given of the current status of experimental and calculational benchmarks for use in understanding the radiation embrittlement effects in the pressure vessels of operating light water power reactors. The requirements of such benchmarks for application to pressure vessel dosimetry are stated. Recent developments in active and passive neutron detectors sensitive in the ranges of importance to embrittlement studies are summarized and recommendations for improvements in the benchmark are made. (author)

  3. Low dose effects of ionizing radiations in in vitro and in vivo biological systems: a multi-scale approach study

    International Nuclear Information System (INIS)

    Antoccia, A.; Berardinelli, F.; Argazzi, E.; Balata, M.; Bedogni, R.

    2011-01-01

Long-term biological effects of low-dose radiation are little known today, and the carcinogenic risk is estimated on the assumption that risk remains linearly proportional to the radiation dose down to low-dose levels. However, in the last 20 years this hypothesis has gradually come to appear in contrast with a large body of experimental evidence, which has shown a plethora of non-linear phenomena (including hypersensitivity and induced radioresistance, adaptive response, and non-targeted phenomena like the bystander effect and genomic instability) occurring after low-dose irradiation. These phenomena might imply a non-linear behaviour of cancer risk curves in the low-dose region and question the validity of the Linear No-Threshold (LNT) model currently used for cancer risk assessment through extrapolation from existing high-dose data. Moreover, only limited information is available regarding the effects induced in cryopreserved cells by multi-year background radiation exposure, which might lead to an accumulation of radiation damage due to the inhibition of cellular repair mechanisms. In this framework, the multi-year Excalibur (Exposure effects at low doses of ionizing radiation in biological culture) experiment, funded by INFN-CNS5, has undertaken a multi-scale investigation of the biological effects induced in in vitro and in vivo biological systems, in culture and cryopreserved conditions, as a function of radiation quality (X/γ-rays, protons, He-4 ions of various energies) and dose, with particular emphasis on the low-dose region and non-linear phenomena, in terms of different biological endpoints.

  4. Standard Guide for Benchmark Testing of Light Water Reactor Calculations

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This guide covers general approaches for benchmarking neutron transport calculations in light water reactor systems. A companion guide (Guide E2005) covers use of benchmark fields for testing neutron transport calculations and cross sections in well controlled environments. This guide covers experimental benchmarking of neutron fluence calculations (or calculations of other exposure parameters such as dpa) in more complex geometries relevant to reactor surveillance. Particular sections of the guide discuss: the use of well-characterized benchmark neutron fields to provide an indication of the accuracy of the calculational methods and nuclear data when applied to typical cases; and the use of plant specific measurements to indicate bias in individual plant calculations. Use of these two benchmark techniques will serve to limit plant-specific calculational uncertainty, and, when combined with analytical uncertainty estimates for the calculations, will provide uncertainty estimates for reactor fluences with ...

  5. Benchmarking for the Effective Use of Student Evaluation Data

    Science.gov (United States)

    Smithson, John; Birks, Melanie; Harrison, Glenn; Nair, Chenicheri Sid; Hitchins, Marnie

    2015-01-01

    Purpose: The purpose of this paper is to examine current approaches to interpretation of student evaluation data and present an innovative approach to developing benchmark targets for the effective and efficient use of these data. Design/Methodology/Approach: This article discusses traditional approaches to gathering and using student feedback…

  6. MOx Depletion Calculation Benchmark

    International Nuclear Information System (INIS)

    San Felice, Laurence; Eschbach, Romain; Dewi Syarifah, Ratna; Maryam, Seif-Eddine; Hesketh, Kevin

    2016-01-01

Under the auspices of the NEA Nuclear Science Committee (NSC), the Working Party on Scientific Issues of Reactor Systems (WPRS) has been established to study reactor physics, fuel performance, radiation transport and shielding, and the uncertainties associated with modelling of these phenomena in present and future nuclear power systems. The WPRS has different expert groups to cover a wide range of scientific issues in these fields. The Expert Group on Reactor Physics and Advanced Nuclear Systems (EGRPANS) was created in 2011 to perform specific tasks associated with reactor physics aspects of present and future nuclear power systems. EGRPANS provides expert advice to the WPRS and the nuclear community on the development needs (data and methods, validation experiments, scenario studies) for different reactor systems, and also provides specific technical information regarding core reactivity characteristics, including fuel depletion effects; core power/flux distributions; and core dynamics and reactivity control. In 2013 EGRPANS published a report that investigated fuel depletion effects in a Pressurised Water Reactor (PWR), entitled 'International Comparison of a Depletion Calculation Benchmark on Fuel Cycle Issues', NEA/NSC/DOC(2013), which documented a benchmark exercise for UO 2 fuel rods. This report documents a complementary benchmark exercise that focused on PuO 2 /UO 2 Mixed Oxide (MOX) fuel rods. The results are especially relevant to the back end of the fuel cycle, including irradiated fuel transport, reprocessing, interim storage and waste repositories. Saint-Laurent B1 (SLB1) was the first French reactor to use MOX assemblies. SLB1 is a 900 MWe PWR with 30% MOX fuel loading. The standard MOX assemblies used in the Saint-Laurent B1 reactor include three zones with different plutonium enrichments: high Pu content (5.64%) in the central zone, medium Pu content (4.42%) in the intermediate zone, and low Pu content (2.91%) in the peripheral zone

  7. Benchmarking Academic Anatomic Pathologists

    Directory of Open Access Journals (Sweden)

    Barbara S. Ducatman MD

    2016-10-01

    Full Text Available The most common benchmarks for faculty productivity are derived from Medical Group Management Association (MGMA or Vizient-AAMC Faculty Practice Solutions Center ® (FPSC databases. The Association of Pathology Chairs has also collected similar survey data for several years. We examined the Association of Pathology Chairs annual faculty productivity data and compared it with MGMA and FPSC data to understand the value, inherent flaws, and limitations of benchmarking data. We hypothesized that the variability in calculated faculty productivity is due to the type of practice model and clinical effort allocation. Data from the Association of Pathology Chairs survey on 629 surgical pathologists and/or anatomic pathologists from 51 programs were analyzed. From review of service assignments, we were able to assign each pathologist to a specific practice model: general anatomic pathologists/surgical pathologists, 1 or more subspecialties, or a hybrid of the 2 models. There were statistically significant differences among academic ranks and practice types. When we analyzed our data using each organization’s methods, the median results for the anatomic pathologists/surgical pathologists general practice model compared to MGMA and FPSC results for anatomic and/or surgical pathology were quite close. Both MGMA and FPSC data exclude a significant proportion of academic pathologists with clinical duties. We used the more inclusive FPSC definition of clinical “full-time faculty” (0.60 clinical full-time equivalent and above. The correlation between clinical full-time equivalent effort allocation, annual days on service, and annual work relative value unit productivity was poor. This study demonstrates that effort allocations are variable across academic departments of pathology and do not correlate well with either work relative value unit effort or reported days on service. Although the Association of Pathology Chairs–reported median work relative

  8. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  9. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  10. Benchmarking Organisational Capability using The 20 Keys

    Directory of Open Access Journals (Sweden)

    Dino Petrarolo

    2012-01-01

    Full Text Available Organisations have over the years implemented many improvement initiatives, many of which were applied individually with no real, lasting improvement. Approaches such as quality control, team activities, setup reduction and many more seldom changed the fundamental constitution or capability of an organisation. Leading companies in the world have come to realise that an integrated approach is required which focuses on improving more than one factor at the same time - by recognising the importance of synergy between different improvement efforts and the need for commitment at all levels of the company to achieve total system-wide improvement.

The 20 Keys approach offers a way to look at the strength of organisations and to systematically improve it, one step at a time, by focusing on 20 different but interrelated aspects. One feature of the approach is the benchmarking system, which forms the main focus of this paper. The benchmarking system is introduced as an important part of the 20 Keys philosophy in measuring organisational strength. Benchmarking results from selected South African companies are provided, as well as one company's results achieved through the adoption of the 20 Keys philosophy.

  11. A Systems Genetic Approach to Identify Low Dose Radiation-Induced Lymphoma Susceptibility/DOE2013FinalReport

    Energy Technology Data Exchange (ETDEWEB)

    Balmain, Allan [University of California, San Francisco; Song, Ihn Young [University of California, San Francisco

    2013-05-15

    The ultimate goal of this project is to identify the combinations of genetic variants that confer an individual's susceptibility to the effects of low dose (0.1 Gy) gamma-radiation, in particular with regard to tumor development. In contrast to the known effects of high dose radiation in cancer induction, the responses to low dose radiation (defined as 0.1 Gy or less) are much less well understood, and have been proposed to involve a protective anti-tumor effect in some in vivo scientific models. These conflicting results confound attempts to develop predictive models of the risk of exposure to low dose radiation, particularly when combined with the strong effects of inherited genetic variants on both radiation effects and cancer susceptibility. We have used a Systems Genetics approach in mice that combines genetic background analysis with responses to low and high dose radiation, in order to develop insights that will allow us to reconcile these disparate observations. Using this comprehensive approach we have analyzed normal tissue gene expression (in this case the skin and thymus), together with the changes that take place in this gene expression architecture a) in response to low or high- dose radiation and b) during tumor development. Additionally, we have demonstrated that using our expression analysis approach in our genetically heterogeneous/defined radiation-induced tumor mouse models can uniquely identify genes and pathways relevant to human T-ALL, and uncover interactions between common genetic variants of genes which may lead to tumor susceptibility.

  12. Shielding benchmark test

    International Nuclear Information System (INIS)

    Kawai, Masayoshi

    1984-01-01

Iron data in JENDL-2 have been tested by analyzing shielding benchmark experiments on neutron transmission through an iron block, performed at KFK using a Cf-252 neutron source and at ORNL using a collimated neutron beam from a reactor. The analyses are made with the shielding analysis code system RADHEAT-V4 developed at JAERI. The calculated results are compared with the measured data. For the KFK experiments, the C/E values are about 1.1. For the ORNL experiments, the calculated values agree with the measured data within an accuracy of 33% for the off-center geometry. The d-t neutron transmission measurements through a carbon sphere made at LLNL are also analyzed preliminarily using the revised JENDL data for fusion neutronics calculations. (author)

  13. A novel approach for estimating ingested dose associated with paracetamol overdose.

    Science.gov (United States)

    Zurlinden, Todd J; Heard, Kennon; Reisfeld, Brad

    2016-04-01

    In cases of paracetamol (acetaminophen, APAP) overdose, an accurate estimate of tissue-specific paracetamol pharmacokinetics (PK) and ingested dose can offer health care providers important information for the individualized treatment and follow-up of affected patients. Here a novel methodology is presented to make such estimates using a standard serum paracetamol measurement and a computational framework. The core component of the computational framework was a physiologically-based pharmacokinetic (PBPK) model developed and evaluated using an extensive set of human PK data. Bayesian inference was used for parameter and dose estimation, allowing the incorporation of inter-study variability, and facilitating the calculation of uncertainty in model outputs. Simulations of paracetamol time course concentrations in the blood were in close agreement with experimental data under a wide range of dosing conditions. Also, predictions of administered dose showed good agreement with a large collection of clinical and emergency setting PK data over a broad dose range. In addition to dose estimation, the platform was applied for the determination of optimal blood sampling times for dose reconstruction and quantitation of the potential role of paracetamol conjugate measurement on dose estimation. Current therapies for paracetamol overdose rely on a generic methodology involving the use of a clinical nomogram. By using the computational framework developed in this study, serum sample data, and the individual patient's anthropometric and physiological information, personalized serum and liver pharmacokinetic profiles and dose estimate could be generated to help inform an individualized overdose treatment and follow-up plan. © 2015 The British Pharmacological Society.
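
    The dose-reconstruction idea can be illustrated with a much simpler surrogate than the study's PBPK model: a one-compartment model with first-order absorption and a grid-based Bayesian update from a single serum measurement. All pharmacokinetic parameters, the measurement error, and the dose grid below are invented for illustration.

    ```python
    import numpy as np

    def concentration_1c(dose_mg, t_hr, vd_L=42.0, ke_per_hr=0.3, ka_per_hr=1.2):
        """Toy one-compartment model with first-order absorption for plasma
        paracetamol concentration (mg/L); parameters are illustrative, not
        the study's PBPK values."""
        f = ka_per_hr / (ka_per_hr - ke_per_hr)
        return (dose_mg / vd_L) * f * (np.exp(-ke_per_hr * t_hr) - np.exp(-ka_per_hr * t_hr))

    def posterior_dose(measured_mg_per_L, t_hr, dose_grid_mg, sigma=10.0):
        """Grid-based Bayesian dose reconstruction: flat prior over the dose
        grid, Gaussian measurement error on the observed concentration."""
        pred = concentration_1c(dose_grid_mg, t_hr)
        like = np.exp(-0.5 * ((measured_mg_per_L - pred) / sigma) ** 2)
        return like / like.sum()

    doses = np.linspace(1000, 40000, 400)   # candidate ingested doses (mg)
    post = posterior_dose(measured_mg_per_L=150.0, t_hr=4.0, dose_grid_mg=doses)
    print("posterior-mean dose estimate (mg):", float(np.sum(doses * post)))
    ```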

  14. Benchmarking foreign electronics technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bostian, C.W.; Hodges, D.A.; Leachman, R.C.; Sheridan, T.B.; Tsang, W.T.; White, R.M.

    1994-12-01

This report has been drafted in response to a request from the Japanese Technology Evaluation Center's (JTEC) Panel on Benchmarking Select Technologies. Since April 1991, the Competitive Semiconductor Manufacturing (CSM) Program at the University of California at Berkeley has been engaged in a detailed study of quality, productivity, and competitiveness in semiconductor manufacturing worldwide. The program is a joint activity of the College of Engineering, the Haas School of Business, and the Berkeley Roundtable on the International Economy, under sponsorship of the Alfred P. Sloan Foundation, and with the cooperation of semiconductor producers from Asia, Europe and the United States. Professors David A. Hodges and Robert C. Leachman are the project's Co-Directors. The present report for JTEC is primarily based on data and analysis drawn from that continuing program. The CSM program is being conducted by faculty, graduate students and research staff from UC Berkeley's Schools of Engineering and Business, and Department of Economics. Many of the participating firms are represented on the program's Industry Advisory Board. The Board played an important role in defining the research agenda. A pilot study was conducted in 1991 with the cooperation of three semiconductor plants. The research plan and survey documents were thereby refined. The main phase of the CSM benchmarking study began in mid-1992 and will continue at least through 1997. Reports are presented on the manufacture of integrated circuits; data storage; wireless technology; human-machine interfaces; and optoelectronics. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  15. SSI and structural benchmarks

    International Nuclear Information System (INIS)

    Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.

    1987-01-01

    This paper presents the latest results of the ongoing program entitled, Standard Problems for Structural Computer Codes, currently being worked on at BNL for the USNRC, Office of Nuclear Regulatory Research. During FY 1986, efforts were focussed on three tasks, namely, (1) an investigation of ground water effects on the response of Category I structures, (2) the Soil-Structure Interaction Workshop and (3) studies on structural benchmarks associated with Category I structures. The objective of the studies on ground water effects is to verify the applicability and the limitations of the SSI methods currently used by the industry in performing seismic evaluations of nuclear plants which are located at sites with high water tables. In a previous study by BNL (NUREG/CR-4588), it has been concluded that the pore water can influence significantly the soil-structure interaction process. This result, however, is based on the assumption of fully saturated soil profiles. Consequently, the work was further extended to include cases associated with variable water table depths. In this paper, results related to cut-off depths beyond which the pore water effects can be ignored in seismic calculations, are addressed. Comprehensive numerical data are given for soil configurations typical to those encountered in nuclear plant sites. These data were generated by using a modified version of the SLAM code which is capable of handling problems related to the dynamic response of saturated soils. Further, the paper presents some key aspects of the Soil-Structure Interaction Workshop (NUREG/CP-0054) which was held in Bethesda, MD on June 1, 1986. Finally, recent efforts related to the task on the structural benchmarks are described

  16. Multiscale benchmarking of drug delivery vectors.

    Science.gov (United States)

    Summers, Huw D; Ware, Matthew J; Majithia, Ravish; Meissner, Kenith E; Godin, Biana; Rees, Paul

    2016-10-01

    Cross-system comparisons of drug delivery vectors are essential to ensure optimal design. An in-vitro experimental protocol is presented that separates the role of the delivery vector from that of its cargo in determining the cell response, thus allowing quantitative comparison of different systems. The technique is validated through benchmarking of the dose-response of human fibroblast cells exposed to the cationic molecule, polyethylene imine (PEI); delivered as a free molecule and as a cargo on the surface of CdSe nanoparticles and Silica microparticles. The exposure metrics are converted to a delivered dose with the transport properties of the different scale systems characterized by a delivery time, τ. The benchmarking highlights an agglomeration of the free PEI molecules into micron sized clusters and identifies the metric determining cell death as the total number of PEI molecules presented to cells, determined by the delivery vector dose and the surface density of the cargo. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Toward a unified approach to dose-response modeling in ecotoxicology.

    Science.gov (United States)

    Ritz, Christian

    2010-01-01

    This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
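
    As an illustration of one member of such a framework, the four-parameter log-logistic function can be written in a few lines; this is the common parameterisation used, for example, in R's drc package, and the parameter values below are purely illustrative.

    ```python
    import numpy as np

    def log_logistic_4p(dose, b, c, d, e):
        """Four-parameter log-logistic dose-response: c and d are the lower and
        upper asymptotes, e is the dose giving a half-maximal response (ED50),
        and b controls the steepness."""
        dose = np.asarray(dose, dtype=float)
        return c + (d - c) / (1.0 + np.exp(b * (np.log(dose) - np.log(e))))

    # Example: survival fraction dropping from 1 to 0 with ED50 = 5 dose units
    print(log_logistic_4p([0.5, 2, 5, 10, 50], b=2.0, c=0.0, d=1.0, e=5.0))
    ```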

  18. Concept and approaches used in assessing individual and collective doses from releases of radioactive effluents

    International Nuclear Information System (INIS)

    1988-06-01

To provide guidance on the application of the principles for limiting radioactive releases contained in Safety Series 77, the Agency is in the process of preparing a number of safety guides. The first is the present document, which deals with the principal aspects of the methods for the assessment of individual and collective doses. It aims to give general guidance to those responsible for establishing programmes for the determination of individual doses as well as collective doses in connection with licensing a site for a nuclear installation. The document is concerned with the principles applied for calculating individual and collective doses from routine releases of radionuclides to the atmosphere and hydrosphere, but not releases directly to the geosphere, as in waste management. These areas will be covered by other Agency publications. 75 refs, figs and tabs

  19. Benchmarking and energy management schemes in SMEs

    Energy Technology Data Exchange (ETDEWEB)

    Huenges Wajer, Boudewijn [SenterNovem (Netherlands); Helgerud, Hans Even [New Energy Performance AS (Norway); Lackner, Petra [Austrian Energy Agency (Austria)

    2007-07-01

Many companies are reluctant to focus on energy management or to invest in energy efficiency measures. Nevertheless, there are many good examples proving that the right approach to implementing energy efficiency can very well be combined with the business priorities of most companies. SMEs in particular can benefit from a facilitated European approach because they normally lack the resources and time to invest in energy efficiency. In the EU-supported pilot project BESS, 60 SMEs from 11 European countries in the food and drink industries successfully tested a package of interactive instruments which offers such a facilitated approach. A number of pilot companies show a profit increase of 3% up to 10%. The package includes a user-friendly and web-based e-learning scheme for implementing energy management as well as a benchmarking module for company-specific comparison of energy performance indicators. Moreover, it has several practical and tested tools to support the cycle of continuous improvement of energy efficiency in the company, such as checklists, sector-specific measure lists, and templates for auditing and energy conservation plans. An important feature, and also a key trigger for companies, is the possibility for SMEs to anonymously benchmark their energy situation against others of the same sector. SMEs can participate in a unique web-based benchmarking system to interactively benchmark in a way which fully guarantees the confidentiality and safety of company data. Furthermore, the available data can contribute to a bottom-up approach to support the objectives of (national) monitoring and targeting, and thereby also contribute to the EU Energy Efficiency and Energy Services Directive. A follow-up project to expand the number of participating SMEs from various sectors is currently being developed.

  20. Review for session K - benchmarks

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1980-01-01

    Eight of the papers to be considered in Session K are directly concerned, at least in part, with the Pool Critical Assembly (P.C.A.) benchmark at Oak Ridge. The remaining seven papers in this session, the subject of this review, are concerned with a variety of topics related to the general theme of Benchmarks and will be considered individually

  1. Internal Benchmarking for Institutional Effectiveness

    Science.gov (United States)

    Ronco, Sharron L.

    2012-01-01

    Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multicampuses or a…

  2. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of the Denton (1971) method and the growth

  3. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p...

  4. Benchmark for Strategic Performance Improvement.

    Science.gov (United States)

    Gohlke, Annette

    1997-01-01

    Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)

  5. Benchmarking: A Process for Improvement.

    Science.gov (United States)

    Peischl, Thomas M.

    One problem with the outcome-based measures used in higher education is that they measure quantity but not quality. Benchmarking, or the use of some external standard of quality to measure tasks, processes, and outputs, is partially solving that difficulty. Benchmarking allows for the establishment of a systematic process to indicate if outputs…

  6. Benchmark job – Watch out!

    CERN Multimedia

    Staff Association

    2017-01-01

    On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...

  7. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract from these files information from the structured elements in the DICOM metadata relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
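
    The original work used a Matlab program; the following Python/pydicom sketch only illustrates the general idea of reading structured metadata instead of applying optical character recognition to a dose screen. The file path and the chosen attribute keywords are placeholders, and real dose-report objects may instead require walking a structured-report content sequence.

    ```python
    import pydicom

    def extract_ct_dose_info(path):
        """Read a DICOM file and return a few exposure-related attributes if
        present; this is a sketch, not the study's Matlab implementation."""
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        keywords = ["KVP", "XRayTubeCurrent", "Exposure", "ExposureTime", "CTDIvol"]
        return {kw: ds.get(kw) for kw in keywords}

    # Usage (the path is a placeholder):
    # print(extract_ct_dose_info("ct_slice_0001.dcm"))
    ```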

  8. A Technical Approach to Expedited Processing of NTPR Radiation Dose Assessments

    Science.gov (United States)

    2011-10-01

all organs except thyroid, liver, gall bladder, and bile duct. These four organs are associated with low screening doses that are easily exceeded when... liver/gallbladder/bile duct (25 occurrences) and thyroid (20 occurrences). This large number of occurrences is due, in part, to cancers of the liver ...to the gall bladder, bile duct, and acute lymphocytic leukemia do not satisfy the well-below criterion when the initial doses (neutron and gamma

  9. Low energy isomers of (H2O)25 from a hierarchical method based on Monte Carlo temperature basin paving and molecular tailoring approaches benchmarked by MP2 calculations

    International Nuclear Information System (INIS)

    Sahu, Nityananda; Gadre, Shridhar R.; Rakshit, Avijit; Bandyopadhyay, Pradipta; Miliordos, Evangelos; Xantheas, Sotiris S.

    2014-01-01

    We report new global minimum candidate structures for the (H 2 O) 25 cluster that are lower in energy than the ones reported previously and correspond to hydrogen bonded networks with 42 hydrogen bonds and an interior, fully coordinated water molecule. These were obtained as a result of a hierarchical approach based on initial Monte Carlo Temperature Basin Paving sampling of the cluster's Potential Energy Surface with the Effective Fragment Potential, subsequent geometry optimization using the Molecular Tailoring Approach with the fragments treated at the second order Møller-Plesset (MP2) perturbation (MTA-MP2) and final refinement of the entire cluster at the MP2 level of theory. The MTA-MP2 optimized cluster geometries, constructed from the fragments, were found to be within 2 O) 25 cluster. In addition, the grafting of the MTA-MP2 energies yields electronic energies that are within <0.3 kcal/mol from the MP2 energies of the entire cluster while preserving their energy rank order. Finally, the MTA-MP2 approach was found to reproduce the MP2 harmonic vibrational frequencies, constructed from the fragments, quite accurately when compared to the MP2 ones of the entire cluster in both the HOH bending and the OH stretching regions of the spectra

  10. Service quality benchmarking via a novel approach based on fuzzy ELECTRE III and IPA: an empirical case involving the Italian public healthcare context.

    Science.gov (United States)

    La Fata, Concetta Manuela; Lupo, Toni; Piazza, Tommaso

    2017-11-21

A novel fuzzy-based approach combining ELECTRE III with Importance-Performance Analysis (IPA) is proposed in the present work to comparatively evaluate service quality in the public healthcare context. Specifically, ELECTRE III is first used to compare the service performance of the examined hospitals in a non-compensatory manner. Afterwards, IPA is employed to support service quality management in pointing out improvement needs and their priorities. The proposed approach also incorporates features of Fuzzy Set Theory so as to address the possible uncertainty, subjectivity and vagueness of the involved experts in evaluating service quality. The model is applied to five major Sicilian public hospitals, and the strengths and criticalities of the delivered service are finally highlighted and discussed. Although several approaches combining multi-criteria methods have already been proposed in the literature to evaluate service performance in the healthcare field, to the best of the authors' knowledge the present work represents the first attempt at comparing the service performance of alternatives in a non-compensatory manner in the investigated context.

  11. Human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-08-01

The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organised around two study cases: (1) analysis of routine functional Test and Maintenance (TPM) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report summarises the contributions received from the participants and analyses these contributions on a comparative basis. The aim of this analysis was to compare the procedures, modelling techniques and quantification methods used, to obtain insight into the causes and magnitude of the variability observed in the results, to try to identify preferred human reliability assessment approaches, and to get an understanding of the current state of the art in the field, identifying the limitations that are still inherent to the different approaches

  12. Advancing Dose-Response Assessment Methods for Environmental Regulatory Impact Analysis: A Bayesian Belief Network Approach Applied to Inorganic Arsenic.

    Science.gov (United States)

    Zabinski, Joseph W; Garcia-Vargas, Gonzalo; Rubio-Andrade, Marisela; Fry, Rebecca C; Gibson, Jacqueline MacDonald

    2016-05-10

    Dose-response functions used in regulatory risk assessment are based on studies of whole organisms and fail to incorporate genetic and metabolomic data. Bayesian belief networks (BBNs) could provide a powerful framework for incorporating such data, but no prior research has examined this possibility. To address this gap, we develop a BBN-based model predicting birthweight at gestational age from arsenic exposure via drinking water and maternal metabolic indicators using a cohort of 200 pregnant women from an arsenic-endemic region of Mexico. We compare BBN predictions to those of prevailing slope-factor and reference-dose approaches. The BBN outperforms prevailing approaches in balancing false-positive and false-negative rates. Whereas the slope-factor approach had 2% sensitivity and 99% specificity and the reference-dose approach had 100% sensitivity and 0% specificity, the BBN's sensitivity and specificity were 71% and 30%, respectively. BBNs offer a promising opportunity to advance health risk assessment by incorporating modern genetic and metabolomic data.
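
    The core BBN idea, chaining conditional probabilities from exposure through an intermediate biological variable to the outcome, can be shown with a tiny hand-rolled example. All probabilities below are invented and bear no relation to the study's estimates.

    ```python
    def p_low_birthweight(p_exposed, p_marker_given_exp, p_lbw_given_marker):
        """Marginal probability of low birthweight in a three-node chain:
        exposure -> maternal metabolic marker -> low birthweight.
        All conditional probabilities are illustrative placeholders."""
        total = 0.0
        for exposed, p_e in ((True, p_exposed), (False, 1 - p_exposed)):
            p_m = p_marker_given_exp[exposed]
            for marker, p_m_val in ((True, p_m), (False, 1 - p_m)):
                total += p_e * p_m_val * p_lbw_given_marker[marker]
        return total

    print(p_low_birthweight(
        p_exposed=0.3,
        p_marker_given_exp={True: 0.6, False: 0.2},
        p_lbw_given_marker={True: 0.25, False: 0.08},
    ))
    ```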

  13. Model based energy benchmarking for glass furnace

    International Nuclear Information System (INIS)

    Sardeshpande, Vishal; Gaitonde, U.N.; Banerjee, Rangan

    2007-01-01

    Energy benchmarking of processes is important for setting energy efficiency targets and planning energy management strategies. Most approaches used for energy benchmarking are based on statistical methods that compare against a sample of existing plants. This paper presents a model-based approach for benchmarking of energy-intensive industrial processes and illustrates this approach for industrial glass furnaces. A simulation model for a glass furnace is developed using mass and energy balances, heat loss equations for the different zones and empirical equations based on operating practices. The model is checked with field data from end-fired industrial glass furnaces in India. The simulation model enables calculation of the energy performance of a given furnace design. The model results show the potential for improvement and the impact of different operating and design choices on specific energy consumption. A case study for a 100 TPD end-fired furnace is presented. An achievable minimum energy consumption of about 3830 kJ/kg is estimated for this furnace. The useful heat carried by the glass is about 53% of the heat supplied by the fuel. Actual furnaces operating at these production scales have a potential for reduction in energy consumption of about 20-25%.
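
    The kind of energy bookkeeping such a model-based benchmark rests on can be sketched as follows; only the 3830 kJ/kg achievable minimum, the 53% useful-heat share and the 20-25% saving potential come from the abstract, while the assumed 28% excess consumption of the "actual" furnace is a placeholder.

```python
# Back-of-the-envelope energy accounting for a 100 TPD glass furnace.
pull_rate_tpd = 100                  # glass pull, tonnes per day
sec_min = 3830.0                     # achievable minimum specific energy, kJ/kg (abstract)
useful_heat_fraction = 0.53          # share of fuel heat carried by the glass (abstract)

sec_actual = sec_min / 0.78          # assume the real furnace consumes ~28% more than the minimum

glass_flow_kg_s = pull_rate_tpd * 1000.0 / 86400.0
fuel_power_kw = sec_actual * glass_flow_kg_s          # kJ/kg * kg/s = kW
useful_kw = fuel_power_kw * useful_heat_fraction
losses_kw = fuel_power_kw - useful_kw
saving_potential = 1.0 - sec_min / sec_actual

print(f"actual specific energy consumption : {sec_actual:6.0f} kJ/kg")
print(f"fuel power input                   : {fuel_power_kw:6.0f} kW")
print(f"useful heat to glass               : {useful_kw:6.0f} kW")
print(f"wall, flue and other losses        : {losses_kw:6.0f} kW")
print(f"potential reduction in consumption : {saving_potential:.0%}")
```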

  14. How to Advance TPC Benchmarks with Dependability Aspects

    Science.gov (United States)

    Almeida, Raquel; Poess, Meikel; Nambiar, Raghunath; Patil, Indira; Vieira, Marco

    Transactional systems are the core of the information systems of most organizations. Although there is general acknowledgement that failures in these systems often entail significant impact both on the proceeds and reputation of companies, the benchmarks developed and managed by the Transaction Processing Performance Council (TPC) still maintain their focus on reporting bare performance. Each TPC benchmark has to pass a list of dependability-related tests (to verify ACID properties), but not all benchmarks require measuring their performances. While TPC-E measures the recovery time of some system failures, TPC-H and TPC-C only require functional correctness of such recovery. Consequently, systems used in TPC benchmarks are tuned mostly for performance. In this paper we argue that nowadays systems should be tuned for a more comprehensive suite of dependability tests, and that a dependability metric should be part of TPC benchmark publications. The paper discusses WHY and HOW this can be achieved. Two approaches are introduced and discussed: augmenting each TPC benchmark in a customized way, by extending each specification individually; and pursuing a more unified approach, defining a generic specification that could be adjoined to any TPC benchmark.

  15. A probabilistic approach to quantify the uncertainties in internal dose assessment using response surface and neural network

    International Nuclear Information System (INIS)

    Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.

    1996-01-01

    A probabilistic approach is formulated to assess the internal radiation exposure following the intake of radioisotopes. This probabilistic approach consists of four steps: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. The approach has been applied to Pu-induced internal dose assessment, and a multi-compartment dosimetric model is used for internal transport. In this approach, surrogate models of the original system are constructed using response surfaces and neural networks, and the results of these surrogate models are compared with those of the original model. Each surrogate model approximates the original model well. The uncertainty and sensitivity analysis of the model parameters is carried out in this process. Dominant contributors to each organ dose are identified, and the results show that this approach could serve as a good tool for assessing internal radiation exposure.
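
    A compact sketch of the surrogate idea follows: an "expensive" dose model is replaced by a fitted quadratic response surface, and parameter uncertainty is then propagated through the cheap surrogate by Monte Carlo sampling. The toy dose model and the input distributions are assumptions, not the multi-compartment biokinetic model of the paper.

```python
# Quadratic response-surface surrogate plus Monte Carlo uncertainty propagation.
import numpy as np

rng = np.random.default_rng(0)

def dose_model(intake, retention):
    """Stand-in for the expensive compartmental dose calculation."""
    return 0.8 * intake * retention + 0.05 * intake**2

# 1) Build the surrogate from a modest number of model runs
intake = rng.uniform(0.5, 2.0, 50)
retention = rng.uniform(0.1, 0.9, 50)
y = dose_model(intake, retention)
X = np.column_stack([np.ones_like(intake), intake, retention,
                     intake**2, retention**2, intake * retention])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def surrogate(intake, retention):
    X = np.column_stack([np.ones_like(intake), intake, retention,
                         intake**2, retention**2, intake * retention])
    return X @ coef

# 2) Propagate input uncertainty through the cheap surrogate
n = 100_000
i_mc = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # assumed intake distribution
r_mc = rng.beta(2, 3, size=n)                       # assumed retention distribution
dose_mc = surrogate(i_mc, r_mc)
print(f"mean dose  : {dose_mc.mean():.3f}")
print(f"95th pctile: {np.percentile(dose_mc, 95):.3f}")
```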

  16. A benchmark of co-flow and cyclic deposition/etch approaches for the selective epitaxial growth of tensile-strained Si:P

    Science.gov (United States)

    Hartmann, J. M.; Veillerot, M.; Prévitali, B.

    2017-10-01

    We have compared co-flow and cyclic deposition/etch (CDE) processes for the selective epitaxial growth of Si:P layers. High growth rates, relatively low resistivities and significant amounts of tensile strain (up to 10 nm min⁻¹, 0.55 mOhm cm and a strain equivalent to 1.06% of substitutional C in Si:C layers) were obtained at 700 °C and 760 Torr with a co-flow approach and a SiH2Cl2 + PH3 + HCl chemistry. This approach was successfully used to thicken the source and drain regions of n-type fin-shaped Field Effect Transistors. Meanwhile, the (Si2H6 + PH3/HCl + GeH4) CDE process, evaluated at 600 °C and 80 Torr, yielded even lower resistivities (typically 0.4 mOhm cm), at the cost, however, of the tensile strain, which was lost due to (i) the incorporation of Ge atoms (typically 1.5%) into the lattice during the selective etch steps and (ii) a reduction by a factor of two of the P atomic concentration in CDE layers compared to layers grown in a single step (5 × 10²⁰ cm⁻³ compared to 10²¹ cm⁻³).

  17. Benchmarking school nursing practice: the North West Regional Benchmarking Group

    OpenAIRE

    Littler, Nadine; Mullen, Margaret; Beckett, Helen; Freshney, Alice; Pinder, Lynn

    2016-01-01

    It is essential that the quality of care is reviewed regularly through robust processes such as benchmarking to ensure all outcomes and resources are evidence-based so that children and young people’s needs are met effectively. This article provides an example of the use of benchmarking in school nursing practice. Benchmarking has been defined as a process for finding, adapting and applying best practices (Camp, 1994). This concept was first adopted in the 1970s ‘from industry where it was us...

  18. A Benchmark for Banks’ Strategy in Online Presence – An Innovative Approach Based on Elements of Search Engine Optimization (SEO and Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Camelia Elena CIOLAC

    2011-06-01

    This paper aims to offer a new decision tool to assist banks in evaluating the efficiency of their Internet presence and in planning IT investments towards gaining better Internet popularity. The methodology goes beyond simple website interface analysis and uses web crawling as a source for collecting website performance data and the web technologies and servers employed. The paper complements this technical perspective with a proposed scorecard used to assess the efforts of banks in Internet presence, reflecting the banks’ commitment to the Internet as a distribution channel. An innovative approach based on machine learning techniques, the K-Nearest Neighbor algorithm, is proposed by the author to estimate the Internet popularity that a bank is likely to achieve based on its size and its efforts in Internet presence.
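
    A minimal sketch of the K-Nearest Neighbor step is given below, assuming scikit-learn is available; the features (bank size and an online-presence effort score), the class labels and the tiny training set are invented for illustration.

```python
# K-Nearest Neighbour classification of "Internet popularity" from two features.
from sklearn.neighbors import KNeighborsClassifier

# [total assets (bn EUR), online-presence effort score 0-100] -- invented data
X_train = [[12, 35], [85, 70], [40, 55], [150, 90], [8, 20], [60, 80]]
y_train = ["low", "high", "medium", "high", "low", "high"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)

new_bank = [[50, 65]]  # hypothetical bank to be positioned
print("Predicted Internet popularity:", model.predict(new_bank)[0])
```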

  19. CLINICAL AND PHARMACOLOGICAL APPROACHES TO OPTIMIZE THE DOSING REGIMEN OF ANTIBACTERIAL DRUGS IN PEDIATRICS

    Directory of Open Access Journals (Sweden)

    Natal’ya B. Lazareva

    2018-01-01

    The rational use of antibacterial drugs in children implies an adequate choice of the necessary medication, its dosing regimen, and the duration of treatment in order to achieve maximum efficacy and minimize toxic effects. Knowledge of the pharmacokinetic and pharmacodynamic profile of the antibacterial drug plays a crucial role in optimizing the dosing regimen. The strategy of individual choice of the dosing regimen, taking into account the principles of pharmacokinetics and pharmacodynamics, can be especially effective in patients with expectedly altered pharmacokinetic parameters and in infections caused by bacterial strains with low sensitivity to antibiotics. The review presents a contemporary view of the pharmacokinetic and pharmacodynamic profiles of the antibacterial drugs most commonly used in pediatrics and their relationship to the clinical efficacy of the administered therapy.

  20. Benchmarking Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jakic, I.

    2016-01-01

    One of the main tasks an owner has is to keep the business competitive on the market while delivering its product. Being the owner of a nuclear power plant bears the same (or an even more complex and stern) responsibility due to safety risks and costs. In the past, nuclear power plant managements could (partly) ignore profit, or it was simply expected and to some degree assured through the various regulatory processes governing electricity rate design. It is obvious now that, with deregulation, utility privatization and a competitive electricity market, the key measures of success used at nuclear power plants must include traditional metrics of a successful business (return on investment, earnings and revenue generation) as well as those of plant performance, safety and reliability. In order to analyze the business performance of a (specific) nuclear power plant, benchmarking, as a well-established concept and usual method, was used. The domain was conservatively designed, with a well-adjusted framework, but the results still have limited application due to many differences, gaps and uncertainties. (author).

  1. Virtual machine performance benchmarking.

    Science.gov (United States)

    Langer, Steve G; French, Todd

    2011-10-01

    The attractions of virtual computing are many: reduced costs, reduced resources and simplified maintenance. Any one of these would be compelling for a medical imaging professional attempting to support a complex practice on limited resources in an era of ever tightened reimbursement. In particular, the ability to run multiple operating systems optimized for different tasks (computational image processing on Linux versus office tasks on Microsoft operating systems) on a single physical machine is compelling. However, there are also potential drawbacks. High performance requirements need to be carefully considered if they are to be executed in an environment where the running software has to execute through multiple layers of device drivers before reaching the real disk or network interface. Our lab has attempted to gain insight into the impact of virtualization on performance by benchmarking the following metrics on both physical and virtual platforms: local memory and disk bandwidth, network bandwidth, and integer and floating point performance. The virtual performance metrics are compared to baseline performance on "bare metal." The results are complex, and indeed somewhat surprising.
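
    A rough sketch of the flavour of micro-benchmark compared on bare metal versus a virtual machine is shown below (memory-copy bandwidth and a simple floating-point pipeline); it is indicative only and is not the measurement suite used in the study.

```python
# Tiny micro-benchmarks: memory-copy bandwidth and floating-point throughput.
import time
import numpy as np

def best_time(fn, repeats=5):
    """Best-of-N wall-clock time for a callable."""
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn()
        times.append(time.perf_counter() - t0)
    return min(times)

buf = np.ones(8 * 1024 * 1024)                       # 64 MiB of float64
t_mem = best_time(lambda: buf.copy())
print(f"memory copy bandwidth  ~ {2 * buf.nbytes / t_mem / 1e9:.1f} GB/s")

a = np.random.rand(2_000_000)
t_fp = best_time(lambda: np.sqrt(a * a + 1.0))
print(f"floating-point pipeline ~ {3 * a.size / t_fp / 1e6:.0f} Mop/s (rough)")
```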

  2. AER benchmark specification sheet

    International Nuclear Information System (INIS)

    Aszodi, A.; Toth, S.

    2009-01-01

    In VVER-440/213 type reactors, the core outlet temperature field is monitored with in-core thermocouples, which are installed above 210 fuel assemblies. These measured temperatures are used in the determination of the fuel assembly powers and they have an important role in the reactor power limitation. For these reasons, correct interpretation of the thermocouple signals is an important question. In order to interpret the signals correctly, knowledge of the coolant mixing in the assembly heads is necessary. Computational Fluid Dynamics (CFD) codes and experiments can help to better understand these mixing processes and can provide information supporting a more adequate interpretation of the thermocouple signals. This benchmark deals with the 3D CFD modeling of the coolant mixing in the heads of profiled fuel assemblies with 12.2 mm rod pitch. Two assemblies of the 23rd cycle of the Paks NPP's Unit 3 are investigated. One of them has a symmetrical pin power profile and the other possesses an inclined profile. (authors)

  3. AER Benchmark Specification Sheet

    International Nuclear Information System (INIS)

    Aszodi, A.; Toth, S.

    2009-01-01

    In WWER-440/213 type reactors, the core outlet temperature field is monitored with in-core thermocouples, which are installed above 210 fuel assemblies. These measured temperatures are used in the determination of the fuel assembly powers and they have an important role in the reactor power limitation. For these reasons, correct interpretation of the thermocouple signals is an important question. In order to interpret the signals correctly, knowledge of the coolant mixing in the assembly heads is necessary. Computational fluid dynamics codes and experiments can help to better understand these mixing processes and can provide information supporting a more adequate interpretation of the thermocouple signals. This benchmark deals with the 3D computational fluid dynamics modeling of the coolant mixing in the heads of profiled fuel assemblies with 12.2 mm rod pitch. Two assemblies of the twenty-third cycle of the Paks NPP's Unit 3 are investigated. One of them has a symmetrical pin power profile and the other possesses an inclined profile. (Authors)

  4. Benchmarking biofuels; Biobrandstoffen benchmarken

    Energy Technology Data Exchange (ETDEWEB)

    Croezen, H.; Kampman, B.; Bergsma, G.

    2012-03-15

    A sustainability benchmark for transport biofuels has been developed and used to evaluate the various biofuels currently on the market. For comparison, electric vehicles, hydrogen vehicles and petrol/diesel vehicles were also included. A range of studies as well as growing insight are making it ever clearer that biomass-based transport fuels may have just as big a carbon footprint as fossil fuels like petrol or diesel, or even bigger. At the request of Greenpeace Netherlands, CE Delft has brought together current understanding on the sustainability of fossil fuels, biofuels and electric vehicles, with particular focus on the performance of the respective energy carriers on three sustainability criteria, with the first weighing the heaviest: (1) Greenhouse gas emissions; (2) Land use; and (3) Nutrient consumption [Dutch] Greenpeace Netherlands asked CE Delft to design a sustainability benchmark for transport biofuels and to score the various biofuels against it. For comparison, electric driving, hydrogen driving and driving on petrol or diesel were also included. Research and advancing insight increasingly show that biomass-based transport fuels sometimes cause just as many, or even more, greenhouse gas emissions than fossil fuels such as petrol and diesel. CE Delft has summarised for Greenpeace Netherlands the current insights into the sustainability of fossil fuels, biofuels and electric driving. The fuels were assessed against three sustainability criteria, with greenhouse gas emissions weighing most heavily: (1) Greenhouse gas emissions; (2) Land use; and (3) Nutrient consumption.

  5. Benchmarking in academic pharmacy departments.

    Science.gov (United States)

    Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann

    2010-10-11

    Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.

  6. Benchmarking Post-Hartree–Fock Methods To Describe the Nonlinear Optical Properties of Polymethines: An Investigation of the Accuracy of Algebraic Diagrammatic Construction (ADC) Approaches

    KAUST Repository

    Knippenberg, Stefan

    2016-10-07

    Third-order nonlinear optical (NLO) properties of polymethine dyes have been widely studied for applications such as all-optical switching. However, the limited accuracy of the current computational methodologies has prevented a comprehensive understanding of the nature of the lowest excited states and their influence on the molecular optical and NLO properties. Here, attention is paid to the lowest excited-state energies and their energetic ratio, as these characteristics impact the figure-of-merit for all-optical switching. For a series of model polymethines, we compare several algebraic diagrammatic construction (ADC) schemes for the polarization propagator with approximate second-order coupled cluster (CC2) theory, the widely used INDO/MRDCI approach and the symmetry-adapted cluster configuration interaction (SAC-CI) algorithm incorporating singles and doubles linked excitation operators (SAC-CI SD-R). We focus in particular on the ground-to-excited state transition dipole moments and the corresponding state dipole moments, since these quantities are found to be of utmost importance for an effective description of the third-order polarizability γ and two-photon absorption spectra. A sum-over-states expression has been used, which is found to converge quickly. While ADC(3/2) has been found to be the most appropriate method to calculate these properties, CC2 performs poorly.

  7. Equilibrium Partitioning Sediment Benchmarks (ESBs) for the ...

    Science.gov (United States)

    This document describes procedures to determine the concentrations of nonionic organic chemicals in sediment interstitial waters. In previous ESB documents, the general equilibrium partitioning (EqP) approach was chosen for the derivation of sediment benchmarks because it accounts for the varying bioavailability of chemicals in different sediments and allows for the incorporation of the appropriate biological effects concentration. This provides for the derivation of benchmarks that are causally linked to the specific chemical, applicable across sediments, and appropriately protective of benthic organisms.  This equilibrium partitioning sediment benchmark (ESB) document was prepared by scientists from the Atlantic Ecology Division, Mid-Continent Ecology Division, and Western Ecology Division, the Office of Water, and private consultants. The document describes procedures to determine the interstitial water concentrations of nonionic organic chemicals in contaminated sediments. Based on these concentrations, guidance is provided on the derivation of toxic units to assess whether the sediments are likely to cause adverse effects to benthic organisms. The equilibrium partitioning (EqP) approach was chosen because it is based on the concentrations of chemical(s) that are known to be harmful and bioavailable in the environment.  This document, and five others published over the last nine years, will be useful for the Program Offices, including Superfund, a

  8. A nested case-control approach to interactions between radiation dose and other factors as causes of cancer

    Energy Technology Data Exchange (ETDEWEB)

    Land, Charles E [Department of Epidemiology, Radiation Epidemiology Branch, US National Cancer Institute, Bethesda, MD (United States)

    1992-04-01

    Often a nested case-control study is the most practicable approach to estimating the interaction of two cancer risk factors in a large cohort. If one of the factors has already been evaluated for the entire cohort, however, more information is already available about its relationship to risk than could be obtained from a nested study. A modified case-control approach is proposed, in which information about the second, unknown factor is sought for cases and controls matched on the first factor. The approach requires, for interaction models other than the multiplicative, a nonstandard analytical approach incorporating cohort-based information about the first factor. The problem is discussed in the context of breast cancer risk in a defined cohort of female Japanese atomic bomb survivors, in relation to radiation dose and reproductive history. (author)

  9. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
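
    The effect being debated is easy to reproduce numerically: with invented per-query times, making a single query trivially fast moves the geometric mean far more than the arithmetic mean, as the sketch below shows.

```python
# Arithmetic versus geometric mean of per-query times (invented numbers).
from math import prod

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    return prod(xs) ** (1 / len(xs))

baseline = [20.0, 25.0, 30.0, 22.0, 28.0]   # seconds per query
tuned    = [20.0, 25.0, 30.0, 22.0, 0.1]    # one query made trivially fast

for name, xs in (("baseline", baseline), ("tuned", tuned)):
    print(f"{name:8s}  arithmetic={arithmetic_mean(xs):6.2f}  "
          f"geometric={geometric_mean(xs):6.2f}")
```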

  10. California commercial building energy benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Kinney, Satkartar; Piette, Mary Ann

    2003-07-01

    Building energy benchmarking is the comparison of whole-building energy use relative to a set of similar buildings. It provides a useful starting point for individual energy audits and for targeting buildings for energy-saving measures in multiple-site audits. Benchmarking is of interest and practical use to a number of groups. Energy service companies and performance contractors communicate energy savings potential with "typical" and "best-practice" benchmarks while control companies and utilities can provide direct tracking of energy use and combine data from multiple buildings. Benchmarking is also useful in the design stage of a new building or retrofit to determine if a design is relatively efficient. Energy managers and building owners have an ongoing interest in comparing energy performance to others. Large corporations, schools, and government agencies with numerous facilities also use benchmarking methods to compare their buildings to each other. The primary goal of Task 2.1.1 Web-based Benchmarking was the development of a web-based benchmarking tool, dubbed Cal-Arch, for benchmarking energy use in California commercial buildings. While there were several other benchmarking tools available to California consumers prior to the development of Cal-Arch, there were none that were based solely on California data. Most available benchmarking information, including the Energy Star performance rating, was developed using DOE's Commercial Building Energy Consumption Survey (CBECS), which does not provide state-level data. Each database and tool has advantages as well as limitations, such as the number of buildings and the coverage by type, climate regions and end uses. There is considerable commercial interest in benchmarking because it provides an inexpensive method of screening buildings for tune-ups and retrofits. However, private companies who collect and manage consumption data are concerned that the

  11. A Heterogeneous Medium Analytical Benchmark

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1999-01-01

    A benchmark, called benchmark BLUE, has been developed for one-group neutral particle (neutron or photon) transport in a one-dimensional sub-critical heterogeneous plane parallel medium with surface illumination. General anisotropic scattering is accommodated through the Green's Function Method (GFM). Numerical Fourier transform inversion is used to generate the required Green's functions, which are the kernels of coupled integral equations that give the exiting angular fluxes. The interior scalar flux is then obtained through quadrature. A compound iterative procedure for quadrature order and slab surface source convergence provides highly accurate, benchmark-quality results (4 to 5 places of accuracy).

  12. Is the fixed-dose combination of telmisartan and hydrochlorothiazide a good approach to treat hypertension?

    Directory of Open Access Journals (Sweden)

    Marc P Maillard

    2007-07-01

    Marc P Maillard, Michel Burnier; Service of Nephrology, Department of Internal Medicine, Lausanne University Hospital, Switzerland. Abstract: Blockade of the renin-angiotensin system with selective AT1 receptor antagonists is recognized as an effective means to lower blood pressure in hypertensive patients. Among the class of AT1 receptor antagonists, telmisartan offers the advantage of a very long half-life. This enables blood pressure control over 24 hours using once-daily administration. The combination of telmisartan with hydrochlorothiazide is a logical step because numerous previous studies have demonstrated that sodium depletion enhances the antihypertensive efficacy of drugs interfering with the activity of the renin-angiotensin system (RAS). In accordance with past experience using similar compounds blocking the RAS, several controlled studies have now demonstrated that the fixed-dose combination of telmisartan/hydrochlorothiazide is superior to either telmisartan or hydrochlorothiazide alone in lowering blood pressure. Of clinical interest also is the observation that the excellent clinical tolerance of the angiotensin II receptor antagonist is not affected by the association with the low-dose thiazide. Thus telmisartan/hydrochlorothiazide is an effective and well-tolerated antihypertensive combination. Finally, the development of fixed-dose combinations should improve drug adherence because of the one-pill-a-day regimen. Keywords: telmisartan, hydrochlorothiazide, fixed-dose combinations, antihypertensive agent, safety, compliance

  13. Clinical approaches involving thrombopoietin to shorten the period of thrombocytopenia after high-dose chemotherapy

    NARCIS (Netherlands)

    Tijssen, Marloes R.; van der Schoot, C. Ellen; Voermans, Carlijn; Zwaginga, Jaap Jan

    2006-01-01

    High-dose chemotherapy followed by a peripheral blood stem cell transplant is successfully used for a wide variety of malignancies. A major drawback, however, is the delay in platelet recovery. Several clinical strategies using thrombopoietin (Tpo) have been developed in an attempt to speed up

  14. Doing the math: A simple approach to topical timolol dosing for infantile hemangiomas.

    Science.gov (United States)

    Dalla Costa, Renata; Prindaville, Brea; Wiss, Karen

    2018-03-01

    Topical timolol maleate has recently gained popularity as a treatment for superficial infantile hemangiomas, but calculating a safe dose of timolol can be time consuming, which may limit the medication's use in fast-paced clinical environments. This report offers a simplified calculation of the maximum daily safe dosage as 1 drop of medication per kilogram of body weight. © 2018 Wiley Periodicals, Inc.

  15. IsoGeneGUI : Multiple approaches for dose-response analysis of microarray data using R

    NARCIS (Netherlands)

    Otava, Martin; Sengupta, Rudradev; Shkedy, Ziv; Lin, Dan; Pramana, Setia; Verbeke, Tobias; Haldermans, Philippe; Hothorn, Ludwig A.; Gerhard, Daniel; Kuiper, Rebecca M.; Klinglmueller, Florian; Kasim, Adetayo

    2017-01-01

    The analysis of transcriptomic experiments with ordered covariates, such as dose-response data, has become a central topic in bioinformatics, in particular in omics studies. Consequently, multiple R packages on CRAN and Bioconductor are designed to analyse microarray data from various perspectives

  16. BENCHMARKING – BETWEEN TRADITIONAL & MODERN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Mihaela Ungureanu

    2011-09-01

    The concept of benchmarking implies a continuous process of performance improvement by organizations seeking superiority over those perceived as market leaders and competitors. This superiority can always be questioned, its relativity originating in the rapidly evolving economic environment. The approach fosters innovation compared with traditional methods and is driven by managers who want to push limits and seek excellence. The end of the twentieth century was the period of broad adoption of benchmarking in various areas and of its transformation from a simple quantitative analysis tool into a source of information on the performance and quality of goods and services.

  17. Benchmarking and Learning in Public Healthcare

    DEFF Research Database (Denmark)

    Buckmaster, Natalie; Mouritsen, Jan

    2017-01-01

    This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated, coercive benchmarking applications. The present study analyses voluntary benchmarking in a public setting that is oriented towards learning. The study contributes by showing how benchmarking can be mobilised for learning and offers evidence of the effects of such benchmarking for performance outcomes. It concludes that benchmarking can enable learning in public settings, but that this requires actors to invest in ensuring that benchmark data are directed towards improvement.

  18. Low dose evaluation of the antiandrogen flutamide following a Mode of Action approach

    International Nuclear Information System (INIS)

    Sarrabay, A.; Hilmi, C.; Tinwell, H.; Schorsch, F.; Pallardy, M.; Bars, R.; Rouquié, D.

    2015-01-01

    ABSTRACT: The dose–response characterization of endocrine mediated toxicity is an on-going debate which is controversial when exploring the nature of the dose–response curve and the effect at the low-end of the curve. To contribute to this debate we have assessed the effects of a wide range of dose levels of the antiandrogen flutamide (FLU) on 7-week-old male Wistar rats. FLU was administered by oral gavage at doses of 0, 0.001, 0.01, 0.1, 1 and 10 mg/kg/day for 28 days. To evaluate the reproducibility, the study was performed 3 times. The molecular initiating event (MIE; AR antagonism), the key events (LH increase, Leydig cell proliferation and hyperplasia increases) and associated events involved in the mode of action (MOA) of FLU-induced testicular toxicity were characterized to address the dose–response concordance. Results showed no effects at low doses (< 0.1 mg/kg/day) for the different key events studied. The histopathological changes (Leydig cell hyperplasia) observed at 1 and 10 mg/kg/day were associated with an increase in steroidogenesis gene expression in the testis from 1 mg/kg/day, as well as an increase in testosterone blood level at 10 mg/kg/day. Each key event dose–response was in good concordance with the MOA of FLU on the testis. From the available results, only monotonic dose–response curves were observed for the MIE, the key events, the associated events and for effects observed in other sex-related tissues. All the results, so far, show that the reference endocrine disruptor FLU induces threshold effects in a standard 28-day toxicity study on adult male rats. - Highlights: • Dose–response characterization of endocrine mediated toxicity is an on-going debate. • A wide range of dose levels of flutamide was evaluated in young adult male rats. • Flutamide induces threshold effects, as shown using standard and molecular tools.

  19. Low dose evaluation of the antiandrogen flutamide following a Mode of Action approach

    Energy Technology Data Exchange (ETDEWEB)

    Sarrabay, A. [INSERM, Université Paris-Sud, Faculté de Pharmacie, Châtenay-Malabry (France); UniverSud, INSERM, UMR-996 “Inflammation, Chemokines and Immunopathology”, Châtenay-Malabry (France); Bayer SAS, 16, rue Jean Marie Leclair, 69009 Lyon (France); Hilmi, C.; Tinwell, H.; Schorsch, F. [Bayer SAS, 16, rue Jean Marie Leclair, 69009 Lyon (France); Pallardy, M. [INSERM, Université Paris-Sud, Faculté de Pharmacie, Châtenay-Malabry (France); UniverSud, INSERM, UMR-996 “Inflammation, Chemokines and Immunopathology”, Châtenay-Malabry (France); Bars, R. [Bayer SAS, 16, rue Jean Marie Leclair, 69009 Lyon (France); Rouquié, D., E-mail: david.rouquie@bayer.com [Bayer SAS, 16, rue Jean Marie Leclair, 69009 Lyon (France)

    2015-12-15

    ABSTRACT: The dose–response characterization of endocrine mediated toxicity is an on-going debate which is controversial when exploring the nature of the dose–response curve and the effect at the low-end of the curve. To contribute to this debate we have assessed the effects of a wide range of dose levels of the antiandrogen flutamide (FLU) on 7-week-old male Wistar rats. FLU was administered by oral gavage at doses of 0, 0.001, 0.01, 0.1, 1 and 10 mg/kg/day for 28 days. To evaluate the reproducibility, the study was performed 3 times. The molecular initiating event (MIE; AR antagonism), the key events (LH increase, Leydig cell proliferation and hyperplasia increases) and associated events involved in the mode of action (MOA) of FLU-induced testicular toxicity were characterized to address the dose–response concordance. Results showed no effects at low doses (< 0.1 mg/kg/day) for the different key events studied. The histopathological changes (Leydig cell hyperplasia) observed at 1 and 10 mg/kg/day were associated with an increase in steroidogenesis gene expression in the testis from 1 mg/kg/day, as well as an increase in testosterone blood level at 10 mg/kg/day. Each key event dose–response was in good concordance with the MOA of FLU on the testis. From the available results, only monotonic dose–response curves were observed for the MIE, the key events, the associated events and for effects observed in other sex-related tissues. All the results, so far, show that the reference endocrine disruptor FLU induces threshold effects in a standard 28-day toxicity study on adult male rats. - Highlights: • Dose–response characterization of endocrine mediated toxicity is an on-going debate. • A wide range of dose levels of flutamide was evaluated in young adult male rats. • Flutamide induces threshold effects, as shown using standard and molecular tools.

  20. Benchmarking to improve the quality of cystic fibrosis care.

    Science.gov (United States)

    Schechter, Michael S

    2012-11-01

    Benchmarking involves the ascertainment of healthcare programs with most favorable outcomes as a means to identify and spread effective strategies for delivery of care. The recent interest in the development of patient registries for patients with cystic fibrosis (CF) has been fueled in part by an interest in using them to facilitate benchmarking. This review summarizes reports of how benchmarking has been operationalized in attempts to improve CF care. Although certain goals of benchmarking can be accomplished with an exclusive focus on registry data analysis, benchmarking programs in Germany and the United States have supplemented these data analyses with exploratory interactions and discussions to better understand successful approaches to care and encourage their spread throughout the care network. Benchmarking allows the discovery and facilitates the spread of effective approaches to care. It provides a pragmatic alternative to traditional research methods such as randomized controlled trials, providing insights into methods that optimize delivery of care and allowing judgments about the relative effectiveness of different therapeutic approaches.

  1. Benchmarking: a method for continuous quality improvement in health.

    Science.gov (United States)

    Ettorchi-Tardy, Amina; Levif, Marie; Michel, Philippe

    2012-05-01

    Benchmarking, a management approach for implementing best practices at best cost, is a recent concept in the healthcare system. The objectives of this paper are to better understand the concept and its evolution in the healthcare sector, to propose an operational definition, and to describe some French and international experiences of benchmarking in the healthcare sector. To this end, we reviewed the literature on this approach's emergence in the industrial sector, its evolution, its fields of application and examples of how it has been used in the healthcare sector. Benchmarking is often thought to consist simply of comparing indicators and is not perceived in its entirety, that is, as a tool based on voluntary and active collaboration among several organizations to create a spirit of competition and to apply best practices. The key feature of benchmarking is its integration within a comprehensive and participatory policy of continuous quality improvement (CQI). Conditions for successful benchmarking focus essentially on careful preparation of the process, monitoring of the relevant indicators, staff involvement and inter-organizational visits. Compared to methods previously implemented in France (CQI and collaborative projects), benchmarking has specific features that set it apart as a healthcare innovation. This is especially true for healthcare or medical-social organizations, as the principle of inter-organizational visiting is not part of their culture. Thus, this approach will need to be assessed for feasibility and acceptability before it is more widely promoted.

  2. Thermal Performance Benchmarking: Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Xuhui [National Renewable Energy Laboratory (NREL), Golden, CO (United States). Transportation and Hydrogen Systems Center

    2017-10-19

    In FY16, the thermal performance of the 2014 Honda Accord Hybrid power electronics thermal management system was benchmarked. Both experiments and numerical simulation were utilized to thoroughly study the thermal resistances and temperature distribution in the power module. Experimental results obtained from the water-ethylene glycol tests provided the junction-to-liquid thermal resistance. The finite element analysis (FEA) and computational fluid dynamics (CFD) models were found to yield a good match with experimental results. Both experimental and modeling results demonstrate that the passive stack is the dominant thermal resistance for both the motor and power electronics systems. The 2014 Accord power electronics systems yield steady-state thermal resistance values around 42-50 mm²·K/W, depending on the flow rates. At a typical flow rate of 10 liters per minute, the thermal resistance of the Accord system was found to be about 44 percent lower than that of the 2012 Nissan LEAF system that was benchmarked in FY15. The main reason for the difference is that the Accord power module used a metalized-ceramic substrate and eliminated the thermal interface material layers. FEA models were developed to study the transient performance of the 2012 Nissan LEAF, the 2014 Accord, and two other systems that feature conventional power module designs. The simulation results indicate that the 2012 LEAF power module has the lowest thermal impedance at a time scale less than one second. This is probably due to moving low thermally conductive materials further away from the heat source and enhancing the heat spreading effect from the copper-molybdenum plate close to the insulated gate bipolar transistors. When approaching steady state, the Honda system shows lower thermal impedance. Measurement results of the thermal resistance of the 2015 BMW i3 power electronic system indicate that the i3 insulated gate bipolar transistor module has significantly lower junction
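
    The basic junction-to-liquid resistance bookkeeping behind such comparisons can be sketched as follows; the temperatures, dissipated power and die area are assumed values, and only the 42-50 mm²·K/W range comes from the report.

```python
# Junction-to-liquid thermal resistance from assumed operating conditions.
t_junction = 125.0        # degC, IGBT junction (assumed)
t_coolant = 65.0          # degC, water-ethylene glycol inlet (assumed)
heat_loss_w = 450.0       # W dissipated by the switch (assumed)
die_area_mm2 = 300.0      # active die area (assumed)

r_jl = (t_junction - t_coolant) / heat_loss_w           # K/W
r_jl_specific = r_jl * die_area_mm2                     # mm^2*K/W, comparable to the 42-50 range

print(f"junction-to-liquid resistance : {r_jl * 1000:.0f} mK/W")
print(f"area-specific resistance      : {r_jl_specific:.0f} mm^2*K/W")
```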

  3. TH-AB-201-10: Portal Dosimetry with Elekta iViewDose: Performance of the Simplified Commissioning Approach Versus Full Commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Kydonieos, M; Folgueras, A; Florescu, L; Cybulski, T; Marinos, N; Thompson, G; Sayeed, A [Elekta Limited, Crawley, West Sussex (United Kingdom); Rozendaal, R; Olaciregui-Ruiz, I [Netherlands Cancer Institute - Antoni van Leeuwenhoek, Amsterdam, Noord-Holland (Netherlands); Subiel, A; Patallo, I Silvestre [National Physical Laboratory, London (United Kingdom)

    2016-06-15

    Purpose: Elekta recently developed a solution for in-vivo EPID dosimetry (iViewDose, Elekta AB, Stockholm, Sweden) in conjunction with the Netherlands Cancer Institute (NKI). This uses a simplified commissioning approach via Template Commissioning Models (TCMs), consisting of a subset of linac-independent pre-defined parameters. This work compares the performance of iViewDose using a TCM commissioning approach with that corresponding to full commissioning. Additionally, the dose reconstruction based on the simplified commissioning approach is validated via independent dose measurements. Methods: Measurements were performed at the NKI on a VersaHD™ (Elekta AB, Stockholm, Sweden). Treatment plans were generated with Pinnacle 9.8 (Philips Medical Systems, Eindhoven, The Netherlands). A Farmer chamber dose measurement and two EPID images were used to create a linac-specific commissioning model based on a TCM. A complete set of commissioning measurements was collected and a full commissioning model was created. The performance of iViewDose based on the two commissioning approaches was compared via a series of set-to-work tests in a slab phantom. In these tests, iViewDose reconstructs and compares EPID to TPS dose for square fields, IMRT and VMAT plans via global gamma analysis and isocentre dose difference. A clinical VMAT plan was delivered to a homogeneous Octavius 4D phantom (PTW, Freiburg, Germany). Dose was measured with the Octavius 1500 array and VeriSoft software was used for 3D dose reconstruction. EPID images were acquired. TCM-based iViewDose and 3D Octavius dose distributions were compared against the TPS. Results: For both the TCM-based and the full commissioning approaches, the pass rate, mean γ and dose difference were >97%, <0.5 and <2.5%, respectively. Equivalent gamma analysis results were obtained for iViewDose (TCM approach) and Octavius for a VMAT plan. Conclusion: iViewDose produces similar results with the simplified and full commissioning
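
    For orientation, a minimal one-dimensional gamma-index calculation of the kind reported above (global gamma with dose-difference and distance-to-agreement criteria) might look like the sketch below; the profiles and tolerances are illustrative and do not reflect iViewDose internals.

```python
# 1-D global gamma index: dose difference normalized to the reference maximum,
# distance-to-agreement in mm; a point passes when gamma <= 1.
import numpy as np

def gamma_1d(ref_dose, eval_dose, spacing_mm, dd=0.03, dta_mm=3.0):
    x = np.arange(len(ref_dose)) * spacing_mm
    dose_norm = dd * ref_dose.max()
    gammas = np.empty_like(ref_dose, dtype=float)
    for i, (xr, dr) in enumerate(zip(x, ref_dose)):
        dist2 = ((x - xr) / dta_mm) ** 2
        dose2 = ((eval_dose - dr) / dose_norm) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

ref = np.exp(-np.linspace(-3, 3, 121) ** 2)        # "TPS" reference profile (synthetic)
meas = ref * 1.01 + 0.005                          # slightly offset "EPID" dose
g = gamma_1d(ref, meas, spacing_mm=1.0)
print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1):.1%}")
```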

  4. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

    Research on relative performance measures, transfer pricing, beyond budgeting initiatives, target costing, piece rate systems and value based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advantages. However, whereas extant research has primarily focused on the importance and effects of using external benchmarks, less attention has been directed towards the conditions under which the market mechanism performs within organizations. This paper aims to contribute to research by providing more insight into the conditions for the use of external benchmarking as an element in performance management in organizations. Our study explores a particular type of external...

  5. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

    Order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all, it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly, ‘sustainable transport’ evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly, policies are not directly comparable across space and context. For these reasons attempting to benchmark ‘sustainable transport policies’ against one another would be a highly complex task, which...

  6. A new approach to assess the doses to the population in the 30 km-zone after the Chernobyl accident

    International Nuclear Information System (INIS)

    Mueck, K.; Proehl, G.; Meckbach, R.; Likhtarev, I.; Kovgan, L.; Golikov, S.

    2000-01-01

    Very few data are available to assess the exposure of the population within the 30 km zone. While gamma dose rate measurements were performed in most villages within the exclusion zone, virtually no measurements of activity concentration in air or of the concentrations of the various radionuclides relevant to the inhalation exposure were carried out, due to lack of monitoring capacity. Also, practically no data on activity concentrations in foodstuffs are available to evaluate the exposure from ingestion of contaminated foodstuff in the various settlements. To reconstruct the exposure in the various settlements for epidemiological purposes, in particular with regard to the internal exposure, a new approach was developed. It is based on the fact that there was only dry deposition within the 30 km zone during the main fallout period and on the 137Cs deposition, which was extensively determined in the various settlements at a later stage after the accident; from the measured 137Cs per unit area, the time-integrated activity concentration of 137Cs in air was derived for each settlement. The ratios of the various radioisotopes to 137Cs (radionuclide vector) were determined in a two-level approach: in a first step, isotopic ratios within each isotope group were evaluated, and in a second step, the ratio of one guide isotope of each element group relative to 137Cs was determined as a function of distance and direction of the plume. Preliminary results indicate that the effective dose due to inhalation amounted to 8-13 times the external exposure, and the exposure caused by ingestion to about 2-2.5 times the external exposure, in a village close to the reactor site and evacuated early or exposed early in the passage of the plume. Under those circumstances, the total effective dose is more than 10-15 times greater than the external exposure. In villages at further distances from the site or predominantly exposed by a longer passage of the plume and evacuated at later
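
    The reconstruction chain sketched in the abstract (ground deposition → time-integrated air concentration via a dry deposition velocity → inhalation intake and dose) can be illustrated as follows; the deposition velocity, isotope ratio, breathing rate and dose coefficient below are placeholder values, not those derived in the study.

```python
# Deposition-based reconstruction of inhalation dose (all inputs are placeholders).
cs137_deposition = 2.0e6          # Bq/m2 measured in a settlement
v_dry = 2.0e-3                    # m/s assumed dry deposition velocity
iodine_to_cs_ratio = 10.0         # assumed I-131/Cs-137 ratio in the plume
breathing_rate = 2.6e-4           # m3/s (adult, ~22 m3/day)
dose_coeff_i131 = 7.4e-9          # Sv/Bq inhaled (placeholder coefficient)

# Time-integrated air concentrations (Bq*s/m3) from dry deposition only
cs137_air_integral = cs137_deposition / v_dry
i131_air_integral = cs137_air_integral * iodine_to_cs_ratio

intake_i131 = i131_air_integral * breathing_rate          # Bq inhaled
dose_i131 = intake_i131 * dose_coeff_i131                 # Sv

print(f"I-131 intake by inhalation : {intake_i131:.2e} Bq")
print(f"Inhalation dose from I-131 : {dose_i131 * 1000:.1f} mSv")
```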

  7. A model-based approach of scatter dose contributions and efficiency of apron shielding for radiation protection in CT.

    Science.gov (United States)

    Weber, N; Monnin, P; Elandoy, C; Ding, S

    2015-12-01

    Given the contribution of scattered radiations to patient dose in CT, apron shielding is often used for radiation protection. In this study the efficiency of apron was assessed with a model-based approach of the contributions of the four scatter sources in CT, i.e. external scattered radiations from the tube and table, internal scatter from the patient and backscatter from the shielding. For this purpose, CTDI phantoms filled with thermoluminescent dosimeters were scanned without apron, and then with an apron at 0, 2.5 and 5 cm from the primary field. Scatter from the tube was measured separately in air. The scatter contributions were separated and mathematically modelled. The protective efficiency of the apron was low, only 1.5% in scatter dose reduction on average. The apron at 0 cm from the beam lowered the dose by 7.5% at the phantom bottom but increased the dose by 2% at the top (backscatter) and did not affect the centre. When the apron was placed at 2.5 or 5 cm, the results were intermediate to the one obtained with the shielding at 0 cm and without shielding. The apron effectiveness is finally limited to the small fraction of external scattered radiation. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  8. A sub-sampled approach to extremely low-dose STEM

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, A. [OptimalSensing, Southlake, Texas 76092, USA; Duke University, ECE, Durham, North Carolina 27708, USA; Luzi, L. [Rice University, ECE, Houston, Texas 77005, USA; Yang, H. [Lawrence Berkeley National Laboratory, Berkeley, California 94720, USA; Kovarik, L. [Pacific NW National Laboratory, Richland, Washington 99354, USA; Mehdi, B. L. [Pacific NW National Laboratory, Richland, Washington 99354, USA; University of Liverpool, Materials Engineering, Liverpool L69 3GH, United Kingdom; Liyu, A. [Pacific NW National Laboratory, Richland, Washington 99354, USA; Gehm, M. E. [Duke University, ECE, Durham, North Carolina 27708, USA; Browning, N. D. [Pacific NW National Laboratory, Richland, Washington 99354, USA; University of Liverpool, Materials Engineering, Liverpool L69 3GH, United Kingdom

    2018-01-22

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻ Å⁻²) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).
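
    A toy illustration of random sub-sampling followed by a simple interpolation-based fill-in is given below; the actual work uses a far more sophisticated inpainting (dictionary-learning) reconstruction, so this only conveys the acquisition geometry.

```python
# Random sub-sampling of a synthetic image, filled in by scattered interpolation.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)

# Synthetic "full-dose" image standing in for a STEM frame
y, x = np.mgrid[0:128, 0:128]
full = np.sin(x / 9.0) * np.cos(y / 7.0)

# Acquire only 20% of the probe positions, chosen at random
mask = rng.random(full.shape) < 0.20
sampled_points = np.argwhere(mask)
sampled_values = full[mask]

# Fill in the unvisited pixels from the sampled ones
grid_points = np.argwhere(np.ones_like(full, dtype=bool))
recon = griddata(sampled_points, sampled_values, grid_points,
                 method="cubic", fill_value=0.0).reshape(full.shape)

rmse = np.sqrt(np.mean((recon - full) ** 2))
print(f"sampled fraction: {mask.mean():.0%},  reconstruction RMSE: {rmse:.3f}")
```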

  9. Modelling of Biota Dose Effects. Report of Working Group 6 Biota Dose Effects Modelling of EMRAS II Topical Heading Reference Approaches for Biota Dose Assessment. Environmental Modelling for RAdiation Safety (EMRAS II) Programme

    International Nuclear Information System (INIS)

    2014-07-01

    Environmental assessment models are used for evaluating the radiological impact of actual and potential releases of radionuclides to the environment. They are essential tools for use in the regulatory control of routine discharges to the environment and in planning the measures to be taken in the event of accidental releases. They are also used for predicting the impact of releases which may occur far into the future, for example, from underground radioactive waste repositories. It is important to verify, to the extent possible, the reliability of the predictions of such models by a comparison with measured values in the environment or with the predictions of other models. The IAEA has been organizing programmes on international model testing since the 1980s. These programmes have contributed to a general improvement in models, in the transfer of data and in the capabilities of modellers in Member States. IAEA publications on this subject over the past three decades demonstrate the comprehensive nature of the programmes and record the associated advances which have been made. From 2009 to 2011, the IAEA organized a project entitled Environmental Modelling for RAdiation Safety (EMRAS II), which concentrated on the improvement of environmental transfer models and the development of reference approaches to estimate the radiological impacts on humans, as well as on flora and fauna, arising from radionuclides in the environment. Different aspects were addressed by nine working groups covering three themes: reference approaches for human dose assessment, reference approaches for biota dose assessment and approaches for addressing emergency situations. This publication describes the work of the Biota Effects Modelling Working Group

  10. Modelling of Biota Dose Effects. Report of Working Group 6 Biota Dose Effects Modelling of EMRAS II Topical Heading Reference Approaches for Biota Dose Assessment. Environmental Modelling for RAdiation Safety (EMRAS II) Programme

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-07-15

    Environmental assessment models are used for evaluating the radiological impact of actual and potential releases of radionuclides to the environment. They are essential tools for use in the regulatory control of routine discharges to the environment and in planning the measures to be taken in the event of accidental releases. They are also used for predicting the impact of releases which may occur far into the future, for example, from underground radioactive waste repositories. It is important to verify, to the extent possible, the reliability of the predictions of such models by a comparison with measured values in the environment or with the predictions of other models. The IAEA has been organizing programmes on international model testing since the 1980s. These programmes have contributed to a general improvement in models, in the transfer of data and in the capabilities of modellers in Member States. IAEA publications on this subject over the past three decades demonstrate the comprehensive nature of the programmes and record the associated advances which have been made. From 2009 to 2011, the IAEA organized a project entitled Environmental Modelling for RAdiation Safety (EMRAS II), which concentrated on the improvement of environmental transfer models and the development of reference approaches to estimate the radiological impacts on humans, as well as on flora and fauna, arising from radionuclides in the environment. Different aspects were addressed by nine working groups covering three themes: reference approaches for human dose assessment, reference approaches for biota dose assessment and approaches for addressing emergency situations. This publication describes the work of the Biota Effects Modelling Working Group.

  11. Benchmarking: contexts and details matter.

    Science.gov (United States)

    Zheng, Siyuan

    2017-07-05

    Benchmarking is an essential step in the development of computational tools. We take this opportunity to pitch in our opinions on tool benchmarking, in light of two correspondence articles published in Genome Biology. Please see related Li et al. and Newman et al. correspondence articles: www.dx.doi.org/10.1186/s13059-017-1256-5 and www.dx.doi.org/10.1186/s13059-017-1257-4.

  12. Handbook of critical experiments benchmarks

    International Nuclear Information System (INIS)

    Durst, B.M.; Bierman, S.R.; Clayton, E.D.

    1978-03-01

    Data from critical experiments have been collected together for use as benchmarks in evaluating calculational techniques and nuclear data. These benchmarks have been selected from the numerous experiments performed on homogeneous plutonium systems. No attempt has been made to reproduce all of the data that exists. The primary objective in the collection of these data is to present representative experimental data defined in a concise, standardized format that can easily be translated into computer code input

  13. Analysis of Benchmark 2 results

    International Nuclear Information System (INIS)

    Bacha, F.; Lefievre, B.; Maillard, J.; Silva, J.

    1994-01-01

    The code GEANT315 has been compared to different codes in two benchmarks. We analyze its performance through our results, especially in the thick-target case. In spite of gaps in nucleus-nucleus interaction theories at intermediate energies, benchmarks allow possible improvements of the physical models used in our codes. Thereafter, a scheme for a radioactive waste burning system is studied. (authors). 4 refs., 7 figs., 1 tab

  14. Systems engineering approach for the reuse of metallic waste from NPP decommissioning and dose evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Hyung Woo; Kim, Chang Lak [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2017-03-15

    The oldest commercial reactor in South Korea, Kori-1 Nuclear Power Plant (NPP), will be shut down in 2017. Proper treatment for decommissioning wastes is one of the key factors to decommission a plant successfully. Particularly important is the recycling of clearance level or very low level radioactively contaminated metallic wastes, which contributes to waste minimization and the reduction of disposal volume. The aim of this study is to introduce a conceptual design of a recycle system and to evaluate the doses incurred through defined work flows. The various architecture diagrams were organized to define operational procedures and tasks. Potential exposure scenarios were selected in accordance with the recycle system, and the doses were evaluated with the RESRAD-RECYCLE computer code. By using this tool, the important scenarios and radionuclides as well as impacts of radionuclide characteristics and partitioning factors are analyzed. Moreover, dose analysis can be used to provide information on the necessary decontamination, radiation protection process, and allowable concentration limits for exposure scenarios.

  15. The Fukushima releases: an inverse modelling approach to assess the source term by using gamma dose rate observations

    Science.gov (United States)

    Saunier, Olivier; Mathieu, Anne; Didier, Damien; Tombette, Marilyne; Quélo, Denis; Winiarek, Victor; Bocquet, Marc

    2013-04-01

    The Chernobyl nuclear accident and, more recently, the Fukushima accident highlighted that the largest source of error in consequence assessment is the source term estimation, including the time evolution of the release rate and its distribution between radioisotopes. Inverse modelling methods have proved to be efficient for assessing the source term in accidental situations (Gudiksen, 1989; Krysta and Bocquet, 2007; Stohl et al., 2011; Winiarek et al., 2012). These methods combine environmental measurements and atmospheric dispersion models, and they have recently been applied to the Fukushima accident. Most existing approaches are designed to use air sampling measurements (Winiarek et al., 2012) and some also use deposition measurements (Stohl et al., 2012; Winiarek et al., 2013). During the Fukushima accident, such measurements were far less numerous and not as well distributed within Japan as the dose rate measurements. Gamma dose rate measurements, by contrast, were numerous, well distributed within Japan and available at high temporal frequency, so they documented the evolution of the contamination efficiently. However, dose rate data are not as easy to use as air sampling measurements and until now they were not used in inverse modelling approaches. Indeed, dose rate data result from all the gamma emitters present in the ground and in the atmosphere in the vicinity of the receptor; they do not allow one to determine the isotopic composition or to distinguish the plume contribution from wet deposition. The presented approach proposes a way to use dose rate measurements in an inverse modelling approach without the need for a priori information on emissions. The method proved to be efficient and reliable when applied to the Fukushima accident. The emissions of the 8 main isotopes Xe-133, Cs-134, Cs-136, Cs-137, Ba-137m, I-131, I-132 and Te-132 have been assessed. The Daiichi power plant events (such as ventings, explosions…) known to have caused atmospheric releases are well identified in
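
    As a rough illustration of the inverse modelling idea described above (not the authors' operational system), the sketch below estimates a time-resolved release rate from synthetic dose-rate observations by non-negative least squares; the source-receptor matrix H, which in practice would come from an atmospheric dispersion and dose model, is a random placeholder here.

```python
# Minimal sketch (not the authors' code): estimating a time-resolved release
# rate q (one value per time step) from gamma dose rate observations y, assuming
# a linear observation operator H that maps unit releases to dose rates at the
# detectors (in practice H comes from a dispersion + dose model).
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
n_obs, n_steps = 120, 24          # hypothetical numbers of observations / release steps
H = rng.random((n_obs, n_steps))  # placeholder source-receptor matrix
q_true = np.zeros(n_steps)
q_true[5:8] = [2.0, 5.0, 1.0]     # synthetic release pulses
y = H @ q_true + 0.05 * rng.standard_normal(n_obs)  # noisy dose-rate "observations"

# Non-negative least squares: minimise ||H q - y||^2 subject to q >= 0.
res = lsq_linear(H, y, bounds=(0.0, np.inf))
print("estimated release rates:", np.round(res.x, 2))
```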

  16. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  17. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  18. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  19. Benchmarking for On-Scalp MEG Sensors.

    Science.gov (United States)

    Xie, Minshu; Schneiderman, Justin F; Chukharkin, Maxim L; Kalabukhov, Alexei; Riaz, Bushra; Lundqvist, Daniel; Whitmarsh, Stephen; Hamalainen, Matti; Jousmaki, Veikko; Oostenveld, Robert; Winkler, Dag

    2017-06-01

    We present a benchmarking protocol for quantitatively comparing emerging on-scalp magnetoencephalography (MEG) sensor technologies to their counterparts in state-of-the-art MEG systems. As a means of validation, we compare a high-critical-temperature superconducting quantum interference device (high-Tc SQUID) with the low-Tc SQUIDs of an Elekta Neuromag TRIUX system in MEG recordings of auditory and somatosensory evoked fields (SEFs) on one human subject. We measure the expected signal gain for the auditory-evoked fields (deeper sources) and notice some unfamiliar features in the on-scalp sensor-based recordings of SEFs (shallower sources). The experimental results serve as a proof of principle for the benchmarking protocol. This approach is straightforward, general to various on-scalp MEG sensors, and convenient to use on human subjects. The unexpected features in the SEFs suggest on-scalp MEG sensors may reveal information about neuromagnetic sources that is otherwise difficult to extract from state-of-the-art MEG recordings. As the first systematically established on-scalp MEG benchmarking protocol, magnetic sensor developers can employ this method to prove the utility of their technology in MEG recordings. Further exploration of the SEFs with on-scalp MEG sensors may reveal unique information about their sources.

  20. Optimized dose distribution of a high dose rate vaginal cylinder

    International Nuclear Information System (INIS)

    Li Zuofeng; Liu, Chihray; Palta, Jatinder R.

    1998-01-01

    Purpose: To present a comparison of optimized dose distributions for a set of high-dose-rate (HDR) vaginal cylinders calculated by a commercial treatment-planning system with benchmark calculations using Monte-Carlo-calculated dosimetry data. Methods and Materials: Optimized dose distributions using both an isotropic and an anisotropic dose calculation model were obtained for a set of HDR vaginal cylinders. Mathematical optimization techniques available in the computer treatment-planning system were used to calculate dwell times and positions. These dose distributions were compared with benchmark calculations based on the TG-43 formalism and Monte-Carlo-calculated data. The same dwell times and positions were used for a quantitative comparison of the dose calculated with the three dose models. Results: The isotropic dose calculation model can result in discrepancies as high as 50%. The anisotropic dose calculation model compared better with the benchmark calculations. The differences were more significant at the apex of the vaginal cylinder, which is typically used as the prescription point. Conclusion: Dose calculation models available in a computer treatment-planning system must be evaluated carefully to ensure their correct application. It should also be noted that when the optimized dose distribution at a distance from the cylinder surface is calculated using an accurate dose calculation model, the vaginal mucosa dose becomes significantly higher and should therefore be carefully monitored
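
    The following sketch illustrates, in a highly simplified form, why an isotropic and an anisotropy-corrected point-source model diverge most along the source long axis (toward the cylinder apex); the dose-rate expression is only loosely TG-43-like and all numerical factors are placeholders rather than published consensus data.

```python
# Minimal sketch (illustrative only): dose rate at a point from a single HDR
# dwell position, comparing an isotropic point-source model with one that
# applies an angle-dependent anisotropy factor, in the spirit of the TG-43
# formalism. The numerical factors below are placeholders, not published data.
import math

def dose_rate(r_cm, theta_deg, sk=40000.0, lam=1.108, anisotropy=None):
    """Dose rate ~ Sk * Lambda * (1/r^2) * F(theta); radial dose function omitted."""
    geometry = 1.0 / r_cm**2
    f = 1.0 if anisotropy is None else anisotropy(theta_deg)
    return sk * lam * geometry * f

# Placeholder anisotropy: dose falls off toward the source long axis (theta -> 0),
# which is where the cylinder apex (the usual prescription point) lies.
aniso = lambda theta: 0.6 + 0.4 * math.sin(math.radians(theta))

for theta in (0, 45, 90):
    iso = dose_rate(2.0, theta)
    ani = dose_rate(2.0, theta, anisotropy=aniso)
    print(f"theta={theta:3d} deg  isotropic={iso:8.1f}  anisotropic={ani:8.1f}")
```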

  1. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator to represent the general behavior of the plant under normal and abnormal conditions, or the modeling of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training of vendor/utility personnel, etc. is how well they represent what has been known from industrial experience, large integral experiments and separate effects tests. Typically, simulation codes are benchmarked with some of these; the level of agreement necessary being dependent upon the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result, the capabilities are continually enhanced, errors are corrected, new situations are imposed on the code that are outside of the original design basis, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks are included in the source code and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients; large integral experiments; and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer
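
    A minimal sketch of the 'dynamic benchmarking' idea described above (not the MAAP4 implementation): stored benchmark cases are re-run on every release and compared against archived reference results within a tolerance.

```python
# Minimal sketch (not the MAAP4 implementation) of a "dynamic benchmark" harness:
# re-run a set of stored cases on every code release and flag any drift from
# archived reference results beyond a relative tolerance.
def run_case(case):                      # placeholder for the simulation code
    return {"peak_clad_temp_K": 1480.0 + 0.1 * case["power_MW"]}

REFERENCE = {                            # archived benchmark results (illustrative)
    "plant_transient_A": {"peak_clad_temp_K": 1500.0},
    "integral_experiment_B": {"peak_clad_temp_K": 1620.0},
}
CASES = {
    "plant_transient_A": {"power_MW": 200.0},
    "integral_experiment_B": {"power_MW": 1400.0},
}

def check_benchmarks(rel_tol=0.02):
    failures = []
    for name, case in CASES.items():
        result = run_case(case)
        for key, ref in REFERENCE[name].items():
            if abs(result[key] - ref) > rel_tol * abs(ref):
                failures.append((name, key, result[key], ref))
    return failures

print(check_benchmarks() or "all dynamic benchmarks within tolerance")
```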

  2. How to achieve and prove performance improvement - 15 years of experience in German wastewater benchmarking.

    Science.gov (United States)

    Bertzbach, F; Franz, T; Möller, K

    2012-01-01

    This paper shows the results of performance improvements achieved in benchmarking projects in the wastewater industry in Germany over the last 15 years. A large number of changes in operational practice, and also in achieved annual savings, can be shown, induced in particular by benchmarking at process level. Investigation of this question produces some general findings on how to include performance improvement in a benchmarking project and how to communicate its results. Thus, we elaborate on the concept of benchmarking at both utility and process level, which is still a necessary distinction for the integration of performance improvement into our benchmarking approach. To achieve performance improvement via benchmarking it should be made quite clear that this outcome depends, on one hand, on a well-conducted benchmarking programme and, on the other, on the individual situation within each participating utility.

  3. The basic approaches to evaluation of effects of long-term radiation exposure in a range of 'low' doses

    International Nuclear Information System (INIS)

    Takhauov, R.M.; Karpov, A.B.; Litvyakov, N.V.

    2010-01-01

    for evaluation of the genetic effects of radiation exposure. DNA bank donors are workers of the Siberian Group of Chemical Enterprises (SGCE), their descendants, and residents of the nearby territories. Given the value of the accumulated material, this DNA bank is one of the world's largest collections of biological material obtained from people exposed to long-term radiation in the 'low' dose range. The approaches currently used to evaluate traditional and newly proposed stochastic effects of long-term radiation exposure at 'low' doses can yield objective information of a fundamental character. On the basis of these data it may become possible to supplement radiation safety postulates and to develop a modern prophylactic strategy against the most important diseases for radiation-exposed populations.

  4. Dose and risk assessment approach for the Fernald CERCLA D ampersand D Project

    International Nuclear Information System (INIS)

    Throckmorton, J.D.; Clark, T.R.; Waligora, S.J. Jr.; Haaker, R.F.

    1994-01-01

    At the Fernald Environmental Management Project (FEMP), the uranium processing facilities used from 1952 through 1989 are near or beyond their intended design life. These conditions present an increasing probability of future releases of hazardous substances to the environment. To support a decision by the U.S. Department of Energy (DOE) and the Environmental Protection Agency (EPA) to remediate the buildings, a dose and risk assessment was performed to determine the extent of exposure that would be associated with the controlled decontamination and dismantlement (D&D) of the Fernald facilities. A conceptual risk assessment model was developed, with exposure mechanisms and associated pathways for each potential receptor. The three receptor groups were defined as: the remediation workers, other on-site workers (those not performing D&D), and off-site residents. For use in the conceptual model, an airborne source term was developed through process knowledge, other historical information and data, and air sample data from within the facilities. Individual and collective doses and risks were developed for each receptor and for each population group. The risk assessment demonstrated that all exposures resulting from the action would be within the acceptable DOE administrative control level of 2.0 rem per year for occupational workers and the acceptable EPA risk range of 10⁻⁶ to 10⁻⁴ for the general public.

  5. Penalized weighted least-squares approach for low-dose x-ray computed tomography

    Science.gov (United States)

    Wang, Jing; Li, Tianfang; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    The noise of a low-dose computed tomography (CT) sinogram follows approximately a Gaussian distribution with a nonlinear dependence between the sample mean and variance. The noise is statistically uncorrelated among detector bins at any view angle; however, the correlation coefficient matrix of the data signal indicates a strong signal correlation among neighboring views. Based on these observations, the Karhunen-Loeve (KL) transform can be used to de-correlate the signal among neighboring views. In each KL component, a penalized weighted least-squares (PWLS) objective function can be constructed and an optimal sinogram can be estimated by minimizing the objective function, followed by filtered backprojection (FBP) for CT image reconstruction. In this work, we compared the KL-PWLS method with an iterative image reconstruction algorithm, which uses Gauss-Seidel iterations to minimize the PWLS objective function in the image domain. We also compared KL-PWLS with an iterative sinogram smoothing algorithm, which uses the iterated conditional mode calculation to minimize the PWLS objective function in sinogram space, followed by FBP for image reconstruction. Phantom experiments show comparable performance of these three PWLS methods in suppressing noise-induced artifacts and preserving resolution in reconstructed images. Computer simulation concurs with the phantom experiments in terms of noise-resolution tradeoff and detectability in a low-contrast environment. The KL-PWLS noise reduction may have a computational advantage for low-dose CT imaging, especially for dynamic high-resolution studies.
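
    A minimal, one-dimensional sketch of a PWLS smoothing step of the kind described above (the full KL-PWLS method applies it per KL component across neighbouring views): the weights are inverse noise variances and the penalty is a quadratic roughness term; all data are synthetic.

```python
# Minimal sketch (simplified, 1-D): penalized weighted least-squares smoothing of
# a noisy sinogram signal s. Objective: (s - x)' W (s - x) + beta * ||D x||^2,
# with W = diag(1/variance) and D a first-difference roughness penalty.
import numpy as np

def pwls_smooth(s, var, beta=5.0):
    n = len(s)
    W = np.diag(1.0 / var)
    D = (np.eye(n) - np.eye(n, k=1))[:-1]     # first differences
    A = W + beta * D.T @ D                    # normal equations of the PWLS objective
    return np.linalg.solve(A, W @ s)

rng = np.random.default_rng(1)
true = np.sin(np.linspace(0, np.pi, 64))
var = 0.01 + 0.05 * true                      # noise variance depends on the signal mean
noisy = true + rng.standard_normal(64) * np.sqrt(var)
print("RMSE noisy:", round(float(np.sqrt(np.mean((noisy - true) ** 2))), 4))
print("RMSE PWLS :", round(float(np.sqrt(np.mean((pwls_smooth(noisy, var) - true) ** 2))), 4))
```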

  6. Iterative raw measurements restoration method with penalized weighted least squares approach for low-dose CT

    Science.gov (United States)

    Takahashi, Hisashi; Goto, Taiga; Hirokawa, Koichi; Miyazaki, Osamu

    2014-03-01

    Statistical iterative reconstruction and post-log data restoration algorithms for CT noise reduction have been widely studied, and these techniques have enabled us to reduce irradiation doses while maintaining image quality. In low-dose scanning, electronic noise becomes apparent and results in some non-positive signals in the raw measurements. The non-positive signal must be converted to a positive signal so that it can be log-transformed. Since conventional conversion methods do not consider the local variance on the sinogram, they have difficulty controlling the strength of the filtering. Thus, in this work, we propose a method to convert the non-positive signal to a positive signal mainly by controlling the local variance. The method is implemented in two separate steps. First, an iterative restoration algorithm based on penalized weighted least squares is used to mitigate the effect of electronic noise. The algorithm preserves the local mean and reduces the local variance induced by the electronic noise. Second, the raw measurements smoothed by the iterative algorithm are converted to a positive signal by a function which replaces the non-positive signal with its local mean. In phantom studies, we confirm that the proposed method properly preserves the local mean and reduces the variance induced by the electronic noise. Our technique results in dramatically reduced shading artifacts and also cooperates successfully with the post-log data filter to reduce streak artifacts.
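
    The second step described above, replacing non-positive raw measurements with a local mean so that the logarithm is defined, can be sketched as follows; the window size and the fallback for all-non-positive neighbourhoods are assumptions, not the authors' choices.

```python
# Minimal sketch of the second step described above: replace non-positive raw
# measurements with a local mean so that the log transform is defined everywhere.
import numpy as np

def positivize(raw, half_window=2):
    out = raw.astype(float).copy()
    n = len(raw)
    for i in np.where(raw <= 0)[0]:
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        neigh = raw[lo:hi]
        # mean of the positive neighbours; arbitrary fallback if none are positive
        out[i] = neigh[neigh > 0].mean() if np.any(neigh > 0) else 1.0
    return out

raw = np.array([120.0, 80.0, -3.0, 0.0, 95.0, 110.0])   # counts after offset correction
print(np.log(positivize(raw)))                            # log-transform now well defined
```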

  7. New approach to explain results of the low dose radiation on the Raphanus sativus

    Energy Technology Data Exchange (ETDEWEB)

    Kurisu, Y.; Yoshioka, K.; Yoshida, S.; Murata, I.; Takahashi, A. [Osaka University, Graduate School of Engineering, Department of Nuclear Engineering, Suita, Osaka (Japan)

    2002-01-01

    Recently, research on radiation hormesis in animals and plants has become abundant. The radiation hormesis effect is that subharmful doses of radiation may evoke a stimulatory response in an organism. We performed irradiation experiments with fusion (D-D and D-T) neutrons, thermal and fast neutrons, and cobalt-60 gamma rays on dry seeds of Raphanus sativus, and examined whether radiation hormesis effects appeared by measuring the germination rate, the lengths of the hypocotyl and root, and the total weight on the 7th day from starting cultivation. The radiation hormesis effects were evaluated using the relative effectiveness, which is the ratio of the mean of the measured endpoints in the irradiated group to that in the non-irradiated group. In Raphanus sativus, radiation hormesis effects in the measured endpoints appeared only in seed groups irradiated by fusion (D-T) neutrons. We have confirmed that the absorbed dose range where the effects are revealed is from 1 cGy to 10 Gy, and that the relative effectiveness there is from 1.05 to 1.25. In this research a model of the radiation hormesis effect on Raphanus sativus confirmed under irradiation with D-T neutrons is proposed. It is also apparent that radiation from radioactivated seeds influences the hormesis effect on Raphanus sativus. (author)
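
    The relative effectiveness used in this study is simply the ratio of the irradiated-group mean to the control-group mean of a measured endpoint; a minimal sketch with made-up numbers:

```python
# Minimal sketch of the "relative effectiveness" measure defined in the abstract:
# the ratio of the mean of a measured endpoint (e.g. hypocotyl length) in an
# irradiated group to the mean in the non-irradiated control group.
def relative_effectiveness(irradiated, control):
    return (sum(irradiated) / len(irradiated)) / (sum(control) / len(control))

control    = [4.8, 5.1, 5.0, 4.9]   # illustrative hypocotyl lengths (cm)
irradiated = [5.6, 5.9, 5.7, 5.8]
print(round(relative_effectiveness(irradiated, control), 2))  # > 1 suggests stimulation
```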

  8. Simulation of dose deposition in stereotactic synchrotron radiation therapy: a fast approach combining Monte Carlo and deterministic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Smekens, F; Freud, N; Letang, J M; Babot, D [CNDRI (Nondestructive Testing using Ionizing Radiations) Laboratory, INSA-Lyon, 69621 Villeurbanne Cedex (France); Adam, J-F; Elleaume, H; Esteve, F [INSERM U-836, Equipe 6 ' Rayonnement Synchrotron et Recherche Medicale' , Institut des Neurosciences de Grenoble (France); Ferrero, C; Bravin, A [European Synchrotron Radiation Facility, Grenoble (France)], E-mail: francois.smekens@insa-lyon.fr

    2009-08-07

    A hybrid approach, combining deterministic and Monte Carlo (MC) calculations, is proposed to compute the distribution of dose deposited during stereotactic synchrotron radiation therapy treatment. The proposed approach divides the computation into two parts: (i) the dose deposited by primary radiation (coming directly from the incident x-ray beam) is calculated in a deterministic way using ray casting techniques and energy-absorption coefficient tables and (ii) the dose deposited by secondary radiation (Rayleigh and Compton scattering, fluorescence) is computed using a hybrid algorithm combining MC and deterministic calculations. In the MC part, a small number of particle histories are simulated. Every time a scattering or fluorescence event takes place, a splitting mechanism is applied, so that multiple secondary photons are generated with a reduced weight. The secondary events are further processed in a deterministic way, using ray casting techniques. The whole simulation, carried out within the framework of the Monte Carlo code Geant4, is shown to converge towards the same results as the full MC simulation. The speed of convergence is found to depend notably on the splitting multiplicity, which can easily be optimized. To assess the performance of the proposed algorithm, we compare it to state-of-the-art MC simulations accelerated by the track length estimator (TLE) technique, considering a clinically realistic test case. The hybrid approach is found to be significantly faster than the MC/TLE method; the gain in speed in the test case was about a factor of 25 at constant precision. This method therefore appears to be suitable for treatment planning applications.
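
    A minimal sketch of the splitting mechanism described above (not the Geant4 implementation): each scattering event spawns several secondary photons whose statistical weight is reduced by the splitting multiplicity, and the secondaries are then scored by a deterministic placeholder.

```python
# Minimal sketch of the splitting mechanism described above: at each scattering
# event, generate `multiplicity` secondary photons, each with weight reduced by
# 1/multiplicity; the secondaries are then scored deterministically (placeholder here).
import random

def deterministic_dose(photon):                 # stand-in for the ray-casting step
    return photon["weight"] * photon["energy"] * 0.01

def simulate_history(energy=0.1, multiplicity=4, n_scatters=3):
    dose = 0.0
    primaries = [{"energy": energy, "weight": 1.0}]
    for _ in range(n_scatters):
        secondaries = []
        for p in primaries:
            for _ in range(multiplicity):       # splitting with reduced weight
                s = {"energy": p["energy"] * random.uniform(0.3, 0.9),
                     "weight": p["weight"] / multiplicity}
                dose += deterministic_dose(s)
                secondaries.append(s)
        primaries = secondaries
    return dose

random.seed(0)
print(sum(simulate_history() for _ in range(100)) / 100)   # mean dose per history
```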

  9. Identification and selection of benchmark sites on litholitic soils of ...

    African Journals Online (AJOL)

    An approach to identify benchmarks for different ecological situations in the grassland biome is described. The approach is illustrated by using information on vegetation change, role of habitat factors and the relative palatability differences between the species of the vegetation on shallow soils of the litholitic complexes in ...

  10. Regional Competitive Intelligence: Benchmarking and Policymaking

    OpenAIRE

    Huggins, Robert

    2010-01-01

    Benchmarking exercises have become increasingly popular within the sphere of regional policymaking in recent years. The aim of this paper is to analyse the concept of regional benchmarking and its links with regional policymaking processes. It develops a typology of regional benchmarking exercises and regional benchmarkers, and critically reviews the literature, both academic and policy oriented. It is argued that critics who suggest regional benchmarking is a flawed concept and technique fai...

  11. Benchmarking of human resources management

    Directory of Open Access Journals (Sweden)

    David M. Akinnusi

    2008-11-01

    This paper reviews the role of human resource management (HRM), which today plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.

  12. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  13. An ALARA approach to collective dose control at RAPS-3 and 4

    International Nuclear Information System (INIS)

    Khandelwal, Narendra; Dhakar, P.C.; Gupta, Ashok

    2016-01-01

    At Rajasthan Atomic Power Station-3 and 4 (RAPS 3 and 4), immense efforts were made to achieve a continual reduction in man-rem consumption in every subsequent year. This was done successfully by implementing design changes based on experience, reducing radiation levels by removing radioactive material from undesired locations, flushing active systems and equipment, shielding hot spots, minimizing D2O leaks, optimizing manpower, improving the work culture and maintaining strong management commitment. Radiation protection is a never-ending process and scope is always available to reduce doses at all levels. Routine radiation surveys sometimes do not provide an exact idea of the potential for significant collective exposure that can take place when ambient radiation levels are comparable to background radiation levels

  14. Dose related risk and effect assessment model (DREAM) -- A more realistic approach to risk assessment of offshore discharges

    International Nuclear Information System (INIS)

    Johnsen, S.; Furuholt, E.

    1995-01-01

    Risk assessment of discharges from offshore oil and gas production to the marine environment features determination of potential environmental concentration (PEC) levels and no observed effect concentration (NOEC) levels. The PEC values are normally based on dilution of chemical components from the actual discharge source in the recipient, while the NOEC values are determined by applying a safety factor to acute toxic effects from laboratory tests. The DREAM concept focuses on realistic exposure doses as a function of contact time and dilution, rather than fixed exposure concentrations of chemicals in long-term exposure regimes. In its present state, the DREAM model is based on a number of assumptions with respect to the link between real-life exposure doses and effects observed in laboratory tests. A research project has recently been initiated to develop the concept further, with special focus on chronic effects of different chemical compounds on the marine ecosystem. One of the questions that will be addressed is the link between exposure time, dose, concentration and effect. Validation of the safety factors applied for transforming acute toxicity data into NOEC values will also be included. The DREAM model has been used by Statoil for risk assessment of discharges from new and existing offshore oil and gas production fields, and has been found to give much more realistic results than conventional risk assessment tools. The presentation outlines the background for the DREAM approach, describes the model in its present state, discusses further developments and applications, and shows a number of examples of the performance of DREAM.
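
    The conventional PEC/NOEC screening that DREAM is contrasted with reduces to a simple risk quotient; a minimal sketch with placeholder values (the safety factor and concentrations are illustrative only):

```python
# Minimal sketch of the conventional PEC/NOEC screening the abstract contrasts
# with DREAM: a risk quotient PEC/NOEC, where NOEC is obtained by applying a
# safety factor to an acute toxicity endpoint. All values are placeholders.
def pec(discharge_conc_mg_l, dilution_factor):
    return discharge_conc_mg_l / dilution_factor

def noec(acute_lc50_mg_l, safety_factor=100.0):
    return acute_lc50_mg_l / safety_factor

def risk_quotient(discharge_conc, dilution, lc50):
    return pec(discharge_conc, dilution) / noec(lc50)

rq = risk_quotient(discharge_conc=50.0, dilution=10000.0, lc50=2.0)
print(f"PEC/NOEC = {rq:.2f} -> {'potential concern' if rq >= 1 else 'below screening level'}")
```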

  15. Two computational approaches for Monte Carlo based shutdown dose rate calculation with applications to the JET fusion machine

    Energy Technology Data Exchange (ETDEWEB)

    Petrizzi, L.; Batistoni, P.; Migliori, S. [Associazione EURATOM ENEA sulla Fusione, Frascati (Roma) (Italy); Chen, Y.; Fischer, U.; Pereslavtsev, P. [Association FZK-EURATOM Forschungszentrum Karlsruhe (Germany); Loughlin, M. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, Oxfordshire, OX (United Kingdom); Secco, A. [Nice Srl Via Serra 33 Camerano Casasco AT (Italy)

    2003-07-01

    In deuterium-deuterium (D-D) and deuterium-tritium (D-T) fusion plasmas neutrons are produced causing activation of JET machine components. For safe operation and maintenance it is important to be able to predict the induced activation and the resulting shut down dose rates. This requires a suitable system of codes which is capable of simulating both the neutron induced material activation during operation and the decay gamma radiation transport after shut-down in the proper 3-D geometry. Two methodologies to calculate the dose rate in fusion devices have been developed recently and applied to fusion machines, both using the MCNP Monte Carlo code. FZK has developed a more classical approach, the rigorous 2-step (R2S) system in which MCNP is coupled to the FISPACT inventory code with an automated routing. ENEA, in collaboration with the ITER Team, has developed an alternative approach, the direct 1 step method (D1S). Neutron and decay gamma transport are handled in one single MCNP run, using an ad hoc cross section library. The intention was to tightly couple the neutron induced production of a radio-isotope and the emission of its decay gammas for an accurate spatial distribution and a reliable calculated statistical error. The two methods have been used by the two Associations to calculate the dose rate in five positions of JET machine, two inside the vacuum chamber and three outside, at cooling times between 1 second and 1 year after shutdown. The same MCNP model and irradiation conditions have been assumed. The exercise has been proposed and financed in the frame of the Fusion Technological Program of the JET machine. The scope is to supply the designers with the most reliable tool and data to calculate the dose rate on fusion machines. Results showed that there is a good agreement: the differences range between 5-35%. The next step to be considered in 2003 will be an exercise in which the comparison will be done with dose-rate data from JET taken during and
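
    A highly simplified sketch of the rigorous two-step (R2S) idea: a neutron-transport result feeds an activation/decay calculation, and the resulting decay-gamma source feeds a second, gamma-transport step. The sketch below collapses this to a single nuclide with the textbook activation formula and placeholder numbers; it is not the MCNP/FISPACT coupling itself.

```python
# Highly simplified sketch of the rigorous two-step (R2S) idea: step 1 uses a
# neutron-transport flux to build an activation inventory; step 2 uses the decay
# gamma source from that inventory in a gamma-transport/dose calculation.
# Single nuclide, textbook activation formula, placeholder numbers throughout.
import math

def activity_bq(flux, sigma_cm2, n_atoms, half_life_s, t_irr_s, t_cool_s):
    lam = math.log(2) / half_life_s
    production = flux * sigma_cm2 * n_atoms          # production rate (1/s)
    return production * (1 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_cool_s)

def dose_rate_sv_h(activity, gammas_per_decay, dose_per_gamma_sv):
    return activity * gammas_per_decay * dose_per_gamma_sv * 3600.0

a = activity_bq(flux=1e10, sigma_cm2=1e-24, n_atoms=1e23,
                half_life_s=5.27 * 3.15e7,           # Co-60-like half-life (s)
                t_irr_s=3.6e6, t_cool_s=86400.0)
print(f"activity ~ {a:.3e} Bq, dose rate ~ {dose_rate_sv_h(a, 2.0, 1e-16):.3e} Sv/h")
```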

  16. Two computational approaches for Monte Carlo based shutdown dose rate calculation with applications to the JET fusion machine

    International Nuclear Information System (INIS)

    Petrizzi, L.; Batistoni, P.; Migliori, S.; Chen, Y.; Fischer, U.; Pereslavtsev, P.; Loughlin, M.; Secco, A.

    2003-01-01

    In deuterium-deuterium (D-D) and deuterium-tritium (D-T) fusion plasmas neutrons are produced causing activation of JET machine components. For safe operation and maintenance it is important to be able to predict the induced activation and the resulting shut down dose rates. This requires a suitable system of codes which is capable of simulating both the neutron induced material activation during operation and the decay gamma radiation transport after shut-down in the proper 3-D geometry. Two methodologies to calculate the dose rate in fusion devices have been developed recently and applied to fusion machines, both using the MCNP Monte Carlo code. FZK has developed a more classical approach, the rigorous 2-step (R2S) system in which MCNP is coupled to the FISPACT inventory code with an automated routing. ENEA, in collaboration with the ITER Team, has developed an alternative approach, the direct 1 step method (D1S). Neutron and decay gamma transport are handled in one single MCNP run, using an ad hoc cross section library. The intention was to tightly couple the neutron induced production of a radio-isotope and the emission of its decay gammas for an accurate spatial distribution and a reliable calculated statistical error. The two methods have been used by the two Associations to calculate the dose rate in five positions of JET machine, two inside the vacuum chamber and three outside, at cooling times between 1 second and 1 year after shutdown. The same MCNP model and irradiation conditions have been assumed. The exercise has been proposed and financed in the frame of the Fusion Technological Program of the JET machine. The scope is to supply the designers with the most reliable tool and data to calculate the dose rate on fusion machines. Results showed that there is a good agreement: the differences range between 5-35%. The next step to be considered in 2003 will be an exercise in which the comparison will be done with dose-rate data from JET taken during and

  17. The Key Events Dose-Response Framework: A Cross-Disciplinary Mode-of-Action Based Approach to Examining Dose-Response and Thresholds

    Science.gov (United States)

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents: food allergens, nutrients, pathogenic microorganisms, and ...

  18. International Criticality Safety Benchmark Evaluation Project (ICSBEP) - ICSBEP 2015 Handbook

    International Nuclear Information System (INIS)

    Bess, John D.

    2015-01-01

    The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the United States Department of Energy (DOE). The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) became an official activity of the Nuclear Energy Agency (NEA) in 1995. This handbook contains criticality safety benchmark specifications that have been derived from experiments performed at various critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculation techniques used to establish minimum subcritical margins for operations with fissile material and to determine criticality alarm requirements and placement. Many of the specifications are also useful for nuclear data testing. Example calculations are presented; however, these calculations do not constitute a validation of the codes or cross-section data. The evaluated criticality safety benchmark data are given in nine volumes. These volumes span approximately 69000 pages and contain 567 evaluations with benchmark specifications for 4874 critical, near-critical or subcritical configurations, 31 criticality alarm placement/shielding configurations with multiple dose points for each, and 207 configurations that have been categorised as fundamental physics measurements that are relevant to criticality safety applications. New to the handbook are benchmark specifications for neutron activation foil and thermoluminescent dosimeter measurements performed at the SILENE critical assembly in Valduc, France as part of a joint venture in 2010 between the US DOE and the French Alternative Energies and Atomic Energy Commission (CEA). A photograph of this experiment is shown on the front cover. Experiments that are found unacceptable for use as criticality safety benchmark experiments are discussed in these

  19. International benchmarking of electricity transmission by regulators: A contrast between theory and practice?

    International Nuclear Information System (INIS)

    Haney, Aoife Brophy; Pollitt, Michael G.

    2013-01-01

    Benchmarking of electricity networks has a key role in sharing the benefits of efficiency improvements with consumers and ensuring regulated companies earn a fair return on their investments. This paper analyses and contrasts the theory and practice of international benchmarking of electricity transmission by regulators. We examine the literature relevant to electricity transmission benchmarking and discuss the results of a survey of 25 national electricity regulators. While new panel data techniques aimed at dealing with unobserved heterogeneity and the validity of the comparator group look intellectually promising, our survey suggests that they are in their infancy for regulatory purposes. In electricity transmission, relative to electricity distribution, choosing variables is particularly difficult, because of the large number of potential variables to choose from. Failure to apply benchmarking appropriately may negatively affect investors’ willingness to invest in the future. While few of our surveyed regulators acknowledge that regulatory risk is currently an issue in transmission benchmarking, many more concede it might be. In the meantime new regulatory approaches – such as those based on tendering, negotiated settlements, a wider range of outputs or longer term grid planning – are emerging and will necessarily involve a reduced role for benchmarking. -- Highlights: •We discuss how to benchmark electricity transmission. •We report survey results from 25 national energy regulators. •Electricity transmission benchmarking is more challenging than benchmarking distribution. •Many regulators concede benchmarking may raise capital costs. •Many regulators are considering new regulatory approaches

  20. [Benchmarking and other functions of ROM: back to basics].

    Science.gov (United States)

    Barendregt, M

    2015-01-01

    Since 2011, outcome data in Dutch mental health care have been collected on a national scale. This has led to confusion about the position of benchmarking in the system known as routine outcome monitoring (rom). To provide insight into the various objectives and uses of aggregated outcome data. A qualitative review was performed and the findings were analysed. Benchmarking is a strategy for finding best practices and for improving efficacy, and it belongs to the domain of quality management. Benchmarking involves comparing outcome data by means of instrumentation and is relatively tolerant with regard to the validity of the data. Although benchmarking is a function of rom, it must be differentiated from other functions of rom. Clinical management, public accountability, research, payment for performance and information for patients are all functions of rom which require different ways of data feedback and which make different demands on the validity of the underlying data. Benchmarking is often wrongly regarded as simply a synonym for 'comparing institutions'. It is, however, a method which involves many more factors: it can be used to improve quality, takes a more flexible approach to the validity of outcome data, and is less concerned than other rom functions with funding and the amount of information given to patients. Benchmarking can make good use of currently available outcome data.

  1. 3-D neutron transport benchmarks

    International Nuclear Information System (INIS)

    Takeda, T.; Ikeda, H.

    1991-03-01

    A set of 3-D neutron transport benchmark problems proposed by the Osaka University to NEACRP in 1988 has been calculated by many participants and the corresponding results are summarized in this report. The results for k_eff, control rod worth and region-averaged fluxes for the four proposed core models, calculated using various 3-D transport codes, are compared and discussed. The calculational methods used were: Monte Carlo, Discrete Ordinates (Sn), Spherical Harmonics (Pn), Nodal Transport and others. The solutions of the four core models are quite useful as benchmarks for checking the validity of 3-D neutron transport codes

  2. Strategic behaviour under regulatory benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Jamasb, T. [Cambridge Univ. (United Kingdom). Dept. of Applied Economics; Nillesen, P. [NUON NV (Netherlands); Pollitt, M. [Cambridge Univ. (United Kingdom). Judge Inst. of Management

    2004-09-01

    In order to improve the efficiency of electricity distribution networks, some regulators have adopted incentive regulation schemes that rely on performance benchmarking. Although regulation benchmarking can influence the "regulation game", the subject has received limited attention. This paper discusses how strategic behaviour can result in inefficient behaviour by firms. We then use the Data Envelopment Analysis (DEA) method with US utility data to examine implications of illustrative cases of strategic behaviour reported by regulators. The results show that gaming can have significant effects on the measured performance and profitability of firms. (author)
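
    A minimal sketch of an input-oriented, constant-returns-to-scale DEA efficiency score of the kind referred to above, solved as a linear programme; the utility data are made up.

```python
# Minimal sketch of an input-oriented, constant-returns-to-scale DEA efficiency
# score. Each row is one firm with inputs (opex, network length) and one output
# (energy delivered); all numbers are made up for illustration.
import numpy as np
from scipy.optimize import linprog

X = np.array([[100.0, 50.0], [120.0, 40.0], [ 90.0, 70.0], [150.0, 60.0]])  # inputs
Y = np.array([[200.0],       [210.0],       [190.0],       [250.0]])        # outputs

def dea_efficiency(o):
    n, m = X.shape            # firms, inputs
    s = Y.shape[1]            # outputs
    # decision variables: [theta, lambda_1..lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    # input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o], X.T]
    # output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

for j in range(len(X)):
    print(f"firm {j}: efficiency = {dea_efficiency(j):.3f}")
```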

  3. Atomic Energy Research benchmark activity

    International Nuclear Information System (INIS)

    Makai, M.

    1998-01-01

    The test problems utilized in the validation and verification of computer programs in Atomic Energy Research are collected together. This is the first step towards issuing a volume in which tests for VVER are collected, along with reference solutions and a number of solutions. The benchmarks do not include the ZR-6 experiments, because these have been published, along with a number of comparisons, in the final reports of TIC. The present collection focuses on operational and mathematical benchmarks, which cover almost the entire range of reactor calculation. (Author)

  4. Pre-evaluation of fusion shielding benchmark experiment

    International Nuclear Information System (INIS)

    Hayashi, K.; Handa, H.; Konno, C.

    1994-01-01

    Shielding benchmark experiments are very useful for testing design codes and nuclear data for fusion devices. There are many types of benchmark experiments that should be done for fusion shielding problems, but time and budget are limited. Therefore it will be important to select and determine effective experimental configurations by pre-calculation before the experiment. The authors performed three types of pre-evaluation to determine the experimental assembly configurations of the shielding benchmark experiments planned at FNS, JAERI. (1) Void Effect Experiment - The purpose of this experiment is to measure the local increase of dose and nuclear heating behind small void(s) in shield material. The dimensions of the voids and their arrangement were decided as follows. Dose and nuclear heating were calculated both with and without void(s). The minimum size of the void was determined so that the ratio of these two results would be larger than the error of the measurement system. (2) Auxiliary Shield Experiment - The purpose of this experiment is to measure the shielding properties of B4C, Pb and W, and the dose around a superconducting magnet (SCM). The thicknesses of B4C, Pb and W and their arrangement, including multilayer configurations, were determined. (3) SCM Nuclear Heating Experiment - The purpose of this experiment is to measure nuclear heating and dose distribution in SCM material. Because it is difficult to use liquid helium as part of the SCM mock-up material, material compositions for the SCM mock-up were surveyed to obtain a nuclear heating property similar to that of the real SCM composition.

  5. Impact of quantitative feedback and benchmark selection on radiation use by cardiologists performing cardiac angiography

    International Nuclear Information System (INIS)

    Smith, I. R.; Cameron, J.; Brighouse, R. D.; Ryan, C. M.; Foster, K. A.; Rivers, J. T.

    2013-01-01

    Audit of and feedback on both group and individual data, provided immediately after the point of care and compared with realistic benchmarks of excellence, have been demonstrated to drive change. This study sought to evaluate the impact of immediate benchmarked quantitative case-based performance feedback on the clinical practice of cardiologists practicing at a private hospital in Brisbane, Australia. The participating cardiologists were assigned to one of two groups: Group 1 received patient and procedural details for review and Group 2 received Group 1 data plus detailed radiation data relating to the procedures and comparative benchmarks. In Group 2, Linear-by-Linear Association analysis suggests a link between the change in radiation use and the initial radiation dose category (p = 0.014), with only those initially 'challenged' by the benchmarks showing improvement. Those not 'challenged' by the benchmarks deteriorated in performance, with those starting well below the benchmarks showing the greatest increase in radiation use. Conversely, those blinded to their radiation use (Group 1) showed a general improvement in radiation use throughout the study, with those performing initially close to the benchmarks showing the greatest improvement. This study shows that use of non-challenging benchmarks in case-based radiation risk feedback does not promote a reduction in radiation use; indeed, it may contribute to increased doses. Paradoxically, cardiologists who are aware of performance monitoring but blinded to individual case data appear to maintain, if not reduce, their radiation use. (authors)

  6. Relative implications of protective responses versus damage induction at low dose and low-dose-rate exposures, using the microdose approach

    Energy Technology Data Exchange (ETDEWEB)

    Feinendegen, L.E

    2003-07-01

    In reviewing tissue effects of low-dose radiation (1) absorbed dose to tissue is replaced by the sum of energy deposited with track events in cell-equivalent tissue micromasses, i.e. with microdose hits, in the number of exposed micromasses and (2) induced cell damage and adaptive protection are related to microdose hits in exposed micromasses for a given radiation quality. DNA damage increases with the number of microdose hits. They also can induce adaptive protection, mainly against endogenous DNA damage. This protection involves cellular defenses, DNA repair and damage removal. With increasing numbers of low linear energy transfer (LET) microdose hits in exposed micromasses, adaptive protection first tends to outweigh damage and then (above 200 mGy) fails and largely disappears. These experimental data predict that cancer risk coefficients derived by epidemiology at high-dose irradiation decline at low doses and dose rates when adaptive protection outdoes DNA damage. The dose-risk function should include both linear and non-linear terms at low doses. (author)

  7. Relative implications of protective responses versus damage induction at low dose and low-dose-rate exposures, using the microdose approach

    International Nuclear Information System (INIS)

    Feinendegen, L.E.

    2003-01-01

    In reviewing tissue effects of low-dose radiation (1) absorbed dose to tissue is replaced by the sum of energy deposited with track events in cell-equivalent tissue micromasses, i.e. with microdose hits, in the number of exposed micromasses and (2) induced cell damage and adaptive protection are related to microdose hits in exposed micromasses for a given radiation quality. DNA damage increases with the number of microdose hits. They also can induce adaptive protection, mainly against endogenous DNA damage. This protection involves cellular defenses, DNA repair and damage removal. With increasing numbers of low linear energy transfer (LET) microdose hits in exposed micromasses, adaptive protection first tends to outweigh damage and then (above 200 mGy) fails and largely disappears. These experimental data predict that cancer risk coefficients derived by epidemiology at high-dose irradiation decline at low doses and dose rates when adaptive protection outdoes DNA damage. The dose-risk function should include both linear and non-linear terms at low doses. (author)

  8. The role of dose limitation and optimization in intervention. Approaches to the remediation of contaminated sites in Germany

    International Nuclear Information System (INIS)

    Goldammer, W.; Helming, M.; Kuehnel, G.; Landfermann, H.-H.

    2000-01-01

    The clean-up of contaminated sites requires appropriate and efficient methodologies for decision-making about the priorities and extent of remedial measures, aiming at the two, usually conflicting, goals of protecting people and the environment and saving money and other resources. Finding the cost-effective balance between these two primary objectives is often complicated by several factors. Sensible decision-making in this situation requires the use of appropriate methodologies and tools which assist in identifying and implementing the optimal solution. The paper discusses an approach developed in Germany to achieve environmentally sound and cost-effective solutions. A basic requirement within the German approach is the limitation of individual doses in order to limit inequity between people exposed. An Action Level of 1 mSv per annum is used in this sense for the identification of sites that require further investigation and, potentially, remediation. On the basis of this individual dose related criterion, secondary reference levels for directly measurable quantities such as activity concentrations have been derived, facilitating the practical application of the Action Level Concept. Decisions on remedial action, in particular for complex sites, are based on justification and optimization analyses. These take into consideration a variety of different contaminants and risks to humans and the environment arising on various exposure pathways. The optimization analyses, carried out to identify optimal remediation options, address radiological risks as well as short and long term costs within a cost-benefit analysis framework. Other relevant factors of influence, e.g. chemical risks or ecological damage, are incorporated as well. Comprehensive methodologies utilizing probabilistic methods have been developed to assess site conditions and possible remediation options on this basis. The approaches developed are applied within the German uranium mine rehabilitation program.
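
    A minimal sketch of how a directly measurable reference level can be derived from the 1 mSv per annum Action Level through a single exposure pathway; the dose coefficient and intake rate are placeholders, not the values used in the German programme.

```python
# Minimal sketch of deriving a measurable reference level from a 1 mSv/a action
# level via a single ingestion pathway. The dose coefficient and intake rate are
# placeholders, not regulatory values.
ACTION_LEVEL_SV = 1e-3            # 1 mSv per year
dose_coeff_sv_per_bq = 2.8e-7     # illustrative ingestion dose coefficient (Sv/Bq)
water_intake_l_per_year = 730.0   # illustrative consumption rate (L/a)

# Concentration (Bq/L) that would just exhaust the action level on this pathway:
reference_level_bq_per_l = ACTION_LEVEL_SV / (dose_coeff_sv_per_bq * water_intake_l_per_year)
print(f"derived reference level ~ {reference_level_bq_per_l:.1f} Bq/L")
```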

  9. The role of dose limitation and optimization in intervention. Approaches to the remediation of contaminated sites in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Goldammer, W. [Brenk Systemplanung GmbH, Aachen (Germany); Helming, M.; Kuehnel, G.; Landfermann, H.-H. [Federal Ministry for the Environment, Nature Conservation and Nuclear Safety, Bonn (Germany)

    2000-05-01

    The clean-up of contaminated sites requires appropriate and efficient methodologies for decision-making about the priorities and extent of remedial measures, aiming at the two, usually conflicting, goals of protecting people and the environment and saving money and other resources. Finding the cost-effective balance between these two primary objectives is often complicated by several factors. Sensible decision-making in this situation requires the use of appropriate methodologies and tools which assist in identifying and implementing the optimal solution. The paper discusses an approach developed in Germany to achieve environmentally sound and cost-effective solutions. A basic requirement within the German approach is the limitation of individual doses in order to limit inequity between people exposed. An Action Level of 1 mSv per annum is used in this sense for the identification of sites that require further investigation and, potentially, remediation. On the basis of this individual dose related criterion, secondary reference levels for directly measurable quantities such as activity concentrations have been derived, facilitating the practical application of the Action Level Concept. Decisions on remedial action, in particular for complex sites, are based on justification and optimization analyses. These take into consideration a variety of different contaminants and risks to humans and the environment arising on various exposure pathways. The optimization analyses, carried out to identify optimal remediation options, address radiological risks as well as short and long term costs within a cost-benefit analysis framework. Other relevant factors of influence, e.g. chemical risks or ecological damage, are incorporated as well. Comprehensive methodologies utilizing probabilistic methods have been developed to assess site conditions and possible remediation options on this basis. The approaches developed are applied within the German uranium mine rehabilitation program.

  10. Using chemical benchmarking to determine the persistence of chemicals in a Swedish lake.

    Science.gov (United States)

    Zou, Hongyan; Radke, Michael; Kierkegaard, Amelie; MacLeod, Matthew; McLachlan, Michael S

    2015-02-03

    It is challenging to measure the persistence of chemicals under field conditions. In this work, two approaches for measuring persistence in the field were compared: the chemical mass balance approach and a novel chemical benchmarking approach. Ten pharmaceuticals, an X-ray contrast agent, and an artificial sweetener were studied in a Swedish lake. Acesulfame K was selected as a benchmark to quantify persistence using the chemical benchmarking approach. The 95% confidence intervals of the half-life for transformation in the lake system ranged up to 780-5700 days for carbamazepine. Persistence estimates from the benchmarking approach agreed well with those from the mass balance approach (1-21% difference), indicating that chemical benchmarking can be a valid and useful method to measure the persistence of chemicals under field conditions. Compared to the mass balance approach, the benchmarking approach partially or completely eliminates the need to quantify the mass flow of chemicals, so it is particularly advantageous when quantification of the mass flow of chemicals is difficult. Furthermore, the benchmarking approach allows ready comparison and ranking of the persistence of different chemicals.
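
    A minimal sketch of the benchmarking calculation (assuming first-order loss, a fully persistent benchmark chemical and a known hydraulic residence time; all concentrations are made up): the benchmark corrects for dilution, so only the relative attenuation of the test chemical reflects transformation.

```python
# Minimal sketch of the chemical benchmarking idea (assumptions: first-order loss,
# a fully persistent benchmark, a known hydraulic residence time; values made up).
import math

def half_life_days(test_in, test_out, bench_in, bench_out, residence_time_days):
    # dilution-corrected fraction of the test chemical remaining at the outlet
    fraction_remaining = (test_out / bench_out) / (test_in / bench_in)
    k = -math.log(fraction_remaining) / residence_time_days   # first-order rate constant
    return math.log(2) / k

# Hypothetical concentrations (ng/L) at the lake inlet and outlet:
print(round(half_life_days(test_in=100.0, test_out=30.0,
                           bench_in=500.0, bench_out=400.0,
                           residence_time_days=60.0), 1))
```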

  11. Benchmarked Library Websites Comparative Study

    KAUST Repository

    Ramli, Rindra M.; Tyhurst, Janis

    2015-01-01

    This presentation provides an analysis of services provided by the benchmarked library websites. The exploratory study includes comparison of these websites against a list of criterion and presents a list of services that are most commonly deployed by the selected websites. In addition to that, the investigators proposed a list of services that could be provided via the KAUST library website.

  12. Prismatic Core Coupled Transient Benchmark

    International Nuclear Information System (INIS)

    Ortensi, J.; Pope, M.A.; Strydom, G.; Sen, R.S.; DeHart, M.D.; Gougar, H.D.; Ellis, C.; Baxter, A.; Seker, V.; Downar, T.J.; Vierow, K.; Ivanov, K.

    2011-01-01

    The Prismatic Modular Reactor (PMR) is one of the High Temperature Reactor (HTR) design concepts that have existed for some time. Several prismatic units have operated in the world (DRAGON, Fort St. Vrain, Peach Bottom) and one unit is still in operation (HTTR). The deterministic neutronics and thermal-fluids transient analysis tools and methods currently available for the design and analysis of PMRs have lagged behind the state of the art compared to LWR reactor technologies. This has motivated the development of more accurate and efficient tools for the design and safety evaluations of the PMR. In addition to the work invested in new methods, it is essential to develop appropriate benchmarks to verify and validate the new methods in computer codes. The purpose of this benchmark is to establish a well-defined problem, based on a common given set of data, to compare methods and tools in core simulation and thermal hydraulics analysis with a specific focus on transient events. The benchmark-working group is currently seeking OECD/NEA sponsorship. This benchmark is being pursued and is heavily based on the success of the PBMR-400 exercise.

  13. Disruption of thyroid hormone functions by low dose exposure of tributyltin: an in vitro and in vivo approach.

    Science.gov (United States)

    Sharan, Shruti; Nikhil, Kumar; Roy, Partha

    2014-09-15

    Triorganotins, such as tributyltin chloride (TBTCl), are environmental contaminants that are commonly found in the antifouling paints used on ships and other vessels. The importance of TBTCl as an endocrine-disrupting chemical (EDC) in different animal models is well known; however, its adverse effects on the thyroid gland are less understood. Hence, in the present study, we aimed to evaluate the thyroid-disrupting effects of this chemical using both in vitro and in vivo approaches. We used HepG2 hepatocarcinoma cells for the in vitro studies, as they are a thyroid hormone receptor (TR)-positive and thyroid-responsive cell line. For the in vivo studies, Swiss albino male mice were exposed to three doses of TBTCl (0.5, 5 and 50 μg/kg/day) for 45 days. TBTCl showed a hypo-thyroidal effect in vivo. Low-dose TBTCl exposure markedly decreased the serum thyroid hormone levels via the down-regulation of the thyroid peroxidase (TPO) and thyroglobulin (Tg) genes by 40% and 25%, respectively, while augmenting the thyroid stimulating hormone (TSH) levels. Thyroid-stimulating hormone receptor (TSHR) expression was up-regulated in the thyroid glands of treated mice by 6.6-fold relative to vehicle-treated mice (p<0.05). In the transient transactivation assays, TBTCl suppressed T3-mediated transcriptional activity in a dose-dependent manner. In addition, TBTCl was found to decrease the expression of TR. The present study thus indicates that low concentrations of TBTCl suppress TR transcription by disrupting the physiological concentrations of T3/T4, followed by the recruitment of NCoR to TR, providing a novel insight into the thyroid hormone-disrupting effects of this chemical. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. DoReMi workshop on multidisciplinary approaches to evaluating cancer risks associated with low-dose internal contamination

    International Nuclear Information System (INIS)

    Laurier, D.; Guseva Canu, I.; Bertho, J.M.; Blanchardon, E.; Rage, E.; Baatout, S.; Bouffler, S.; Cardis, E.; Gomolka, M.; Kreuzer, M.; Hall, J.; Kesminiene, A.

    2012-01-01

    A workshop dedicated to cancer risks associated with low-dose internal contamination was organised in March 2011, in Paris, in the framework of the DoReMi (Low Dose Research towards Multidisciplinary Integration) European Network of Excellence. The aim was to identify the best epidemiological studies that provide an opportunity to develop a multidisciplinary approach to improve the evaluation of the cancer risk associated with internal contamination. This workshop provided an opportunity for in-depth discussions between researchers working in different fields including (but not limited to) epidemiology, dosimetry, biology and toxicology. Discussions confirmed the importance of research on the health effects of internal contamination. Several existing epidemiological studies provide a real possibility to improve the quantification of cancer risk associated with internal emitters. Areas for future multidisciplinary collaborations were identified, that should allow feasibility studies to be carried out in the near future. The goal of this paper is to present an overview of the presentations and discussions that took place during this workshop. (authors)

  15. The ambient dose equivalent at flight altitudes: a fit to a large set of data using a Bayesian approach

    International Nuclear Information System (INIS)

    Wissmann, F; Reginatto, M; Moeller, T

    2010-01-01

    The problem of finding a simple, generally applicable description of worldwide measured ambient dose equivalent rates at aviation altitudes between 8 and 12 km is difficult to solve due to the large variety of functional forms and parametrisations that are possible. We present an approach that uses Bayesian statistics and Monte Carlo methods to fit mathematical models to a large set of data and to compare the different models. About 2500 data points measured in the periods 1997-1999 and 2003-2006 were used. Since the data cover wide ranges of barometric altitude, vertical cut-off rigidity and phases in the solar cycle 23, we developed functions which depend on these three variables. Whereas the dependence on the vertical cut-off rigidity is described by an exponential, the dependences on barometric altitude and solar activity may be approximated by linear functions in the ranges under consideration. Therefore, a simple Taylor expansion was used to define different models and to investigate the relevance of the different expansion coefficients. With the method presented here, it is possible to obtain probability distributions for each expansion coefficient and thus to extract reliable uncertainties even for the dose rate evaluated. The resulting function agrees well with new measurements made at fixed geographic positions and during long haul flights covering a wide range of latitudes.
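
    The functional form described here, an exponential dependence on vertical cut-off rigidity combined with linear (Taylor-expansion) terms in barometric altitude and solar activity, can be illustrated with a toy Bayesian fit. The sketch below is not the authors' code: it uses synthetic data and a simple Metropolis sampler, and all parameter values, units, and variable names are assumptions chosen for illustration of how posterior distributions for the expansion coefficients can be obtained.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, h, rc, w):
    """Simplified dose-rate model: linear (Taylor) terms in barometric altitude h
    and a solar-activity index w, exponential dependence on cut-off rigidity rc."""
    a0, a1, a2, b = theta
    return (a0 + a1 * (h - 10.0) + a2 * w) * np.exp(-b * rc)

# Synthetic "measurements" standing in for the flight data points
n = 300
h = rng.uniform(8.0, 12.0, n)        # altitude, km
rc = rng.uniform(0.0, 17.0, n)       # vertical cut-off rigidity, GV
w = rng.uniform(-1.0, 1.0, n)        # normalised solar-activity index
true_theta = np.array([5.0, 0.8, -0.6, 0.08])
sigma = 0.3                          # assumed measurement uncertainty
y = model(true_theta, h, rc, w) + rng.normal(0.0, sigma, n)

def log_posterior(theta):
    # Flat priors within broad bounds, Gaussian likelihood
    if not (0 < theta[0] < 20 and -5 < theta[1] < 5 and
            -5 < theta[2] < 5 and 0 < theta[3] < 1):
        return -np.inf
    resid = y - model(theta, h, rc, w)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Simple Metropolis sampler
theta = np.array([4.0, 0.5, -0.5, 0.05])
step = np.array([0.05, 0.02, 0.02, 0.002])
lp = log_posterior(theta)
samples = []
for _ in range(20000):
    prop = theta + step * rng.normal(size=4)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5000:])   # discard burn-in

print("posterior means:", samples.mean(axis=0))
print("posterior std devs:", samples.std(axis=0))
```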

  16. Information approach to the assessment of mechanisms and action consequences of ionizing radiation in low doses on a living organism

    International Nuclear Information System (INIS)

    Bulanova, K.Ya.; Kundas, S.P.; Lobanok, L.M.; Konoplya, E.F.

    2006-01-01

    In order to reveal the regularities of interaction of an organism with low-intensity ionizing radiation, cybernetic approaches are needed. Living organisms are self-regulating systems of the behavioural type. The complexity of the organization is determined by the hierarchy of the controlling system. Relations between systems are not of a physico-chemical nature; they are based on control, i.e. on information processes. In the information system, all weak influences (including ionizing radiation) are perceived in the form of a signal. Signal information from the natural radiation background is vitally important for organisms; whether as a cardioversion-type signal or as bioradiation, it is used for management initiation, i.e. self-regulation, self-development and so on. In the case of a superfluous surge of information from man-made exposures to ionizing radiation (up to 10 Gy), the information system loses its ability to solve information tasks quickly and begins to experience a state of tension. Brought to a very tense state, it can lose its balance and stability, i.e. die. The signal-information perception of radiation explains the effects of low doses, the non-linear dependence of the biological response on the dose received, the hormesis phenomenon, apoptosis, delayed consequences of irradiation, the bystander effect and other post-irradiation effects. (authors)

  17. Implementation of the New Approach for the Dose-Response Functions Development for the Case of Athens and Greece

    Science.gov (United States)

    Christodoulakis, J.; Tzanis, C. G.; Varotsos, C. A.; Kouremadas, G.

    2016-08-01

    Dose-response functions (DRFs) are used for estimating the corrosion and/or soiling levels of materials used in constructions and cultural monuments. To achieve this, DRFs rely on ground-based measurements of specific air pollution and climatic parameters such as nitrogen oxides, ozone and temperature. At the DRAGON 3 2015 Symposium we presented a new approach, a technique for using satellite-based data for the necessary parameters instead of ground-based measurements, thereby expanding (a) the usage of DRFs in cases/areas where in situ measurements are not available, and (b) the applicability of satellite-based data. In this work we present mapping results of deterioration levels (corrosion and soiling) for Athens, Greece, and also for the whole of Greece.

  18. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Wegner, P.; Wettig, T.

    2003-09-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC. (orig.)

  19. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Stueben, H.; Wegner, P.; Wettig, T.; Wittig, H.

    2004-01-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC.

  20. Increasing nursing students' understanding and accuracy with medical dose calculations: A collaborative approach.

    Science.gov (United States)

    Mackie, Jane E; Bruce, Catherine D

    2016-05-01

    Accurate calculation of medication dosages can be challenging for nursing students. Specific interventions related to the types of errors made by nursing students may improve the learning of this important skill. The objective of this study was to determine areas of challenge for students in performing medication dosage calculations in order to design interventions to improve this skill. Strengths and weaknesses in the teaching and learning of medication dosage calculations were assessed. These data were used to create online interventions, which were then evaluated for their impact on student ability to perform medication dosage calculations. The setting of the study is one university in Canada. The qualitative research participants were 8 nursing students from years 1-3 and 8 faculty members. Quantitative results are based on test data from the same second year clinical course during the academic years 2012 and 2013. Students and faculty participated in one-to-one interviews; responses were recorded and coded for themes. Tests were implemented and scored, then data were assessed to classify the types and number of errors. Students identified conceptual understanding deficits, anxiety, low self-efficacy, and numeracy skills as primary challenges in medication dosage calculations. Faculty identified long division as a particular content challenge, and a lack of online resources for students to practice calculations. Lessons and online resources designed as an intervention to target mathematical concepts and skills led to improved results and increases in overall pass rates for second year students on medication dosage calculation tests. This study suggests that with concerted effort and a multi-modal approach to supporting nursing students, their abilities to calculate dosages can be improved. The positive results in this study also point to the promise of cross-discipline collaborations between nursing and education. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Box modelling approach for evaluation of influence of ice transport of radionuclides for doses to man

    International Nuclear Information System (INIS)

    Iospje, M.

    2002-01-01

    Modelling of the ice transport of radionuclides, which is a unique pathway in the Arctic Ocean and adjacent sea areas, is limited by the need to describe the complete processes of incorporation of radioactivity into ice and ice sediment. Freezing/melting processes and the transport of 'clean' ice can be described with good accuracy over relatively short time scales at the present level of modelling, but a detailed description of sediment entrainment into ice based on the Reynolds equations, with attention to coagulation processes, is limited to low particle concentrations (grease ice cannot be described) and to time scales up to 5×10⁻² s (about 1×10⁻⁹ y), which is not practicable for large time scales and ice masses. Adding the incorporation of radioactivity into the ice, with a subsequent description of the transport and fate of the radionuclides, would further increase the complexity of the modelling. Therefore, it is necessary to develop an alternative approach for the purposes of radiological assessment, on the basis of box modelling, to describe the incorporation of radioactivity into ice and ice sediment, the transport of radioactivity by ice, and the incorporation of radioactivity into sea areas through melting processes. It is shown that the ice transport of radionuclides can be a significant factor for some scenarios and radionuclides. The influence of the ice transport increases with increasing Kd values of the radionuclides. It is necessary to note that the content and structure of the sediment load in ice vary within wide limits, and therefore sensitivity and uncertainty analysis can improve the possibility to represent model results satisfactorily. (LN)
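
    As a rough illustration of the box-modelling alternative, the sketch below tracks radionuclide activity in a local water box, a sea-ice box, and an "exported by ice drift" box using first-order transfer coefficients. The compartment structure, all rate constants, and the choice of Cs-137 decay are placeholder assumptions for illustration and do not reproduce the model described in this record.

```python
import numpy as np

def ice_transport_boxes(days=365 * 5, dt=1.0,
                        k_water_to_ice=2e-4,   # 1/day, uptake during freezing
                        k_ice_to_water=1e-3,   # 1/day, release on melting
                        k_export=5e-4,         # 1/day, ice drift out of the region
                        decay=np.log(2) / (30.2 * 365)):  # Cs-137 decay, 1/day
    """Toy two-box model plus an export box: activity in local sea water and in
    sea ice (including ice sediment), with first-order transfers, ice export to
    adjacent areas, and radioactive decay. All rate constants are placeholders."""
    a_water, a_ice, a_exported = 1.0, 0.0, 0.0   # relative activity units
    for _ in range(int(days / dt)):
        f_wi = k_water_to_ice * a_water * dt     # water -> ice during freezing
        f_iw = k_ice_to_water * a_ice * dt       # ice -> water during melting
        f_exp = k_export * a_ice * dt            # ice drift to adjacent areas
        a_water += f_iw - f_wi - decay * a_water * dt
        a_ice += f_wi - f_iw - f_exp - decay * a_ice * dt
        a_exported += f_exp - decay * a_exported * dt
    return a_water, a_ice, a_exported

print(ice_transport_boxes())
```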

  2. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Service...

  3. Benchmarking clinical photography services in the NHS.

    Science.gov (United States)

    Arbon, Giles

    2015-01-01

    Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.

  4. Toxicological benchmarks for screening potential contaminants of concern for effects on aquatic biota: 1994 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W. II [Oak Ridge National Lab., TN (United States); Mabrey, J.B. [University of West Florida, Pensacola, FL (United States)

    1994-07-01

    This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. The alternative benchmarks are based on different conceptual approaches to estimating concentrations causing significant effects. For the upper screening benchmark, there are the acute National Ambient Water Quality Criteria (NAWQC) and the Secondary Acute Values (SAV). The SAV concentrations are values estimated with 80% confidence not to exceed the unknown acute NAWQC for those chemicals with no NAWQC. The alternative chronic benchmarks are the chronic NAWQC, the Secondary Chronic Value (SCV), the lowest chronic values for fish and daphnids from chronic toxicity tests, the estimated EC20 for a sensitive species, and the concentration estimated to cause a 20% reduction in the recruit abundance of largemouth bass. It is recommended that ambient chemical concentrations be compared to all of these benchmarks. If NAWQC are exceeded, the chemicals must be contaminants of concern because the NAWQC are applicable or relevant and appropriate requirements (ARARs). If NAWQC are not exceeded, but other benchmarks are, contaminants should be selected on the basis of the number of benchmarks exceeded and the conservatism of the particular benchmark values, as discussed in the text. To the extent that toxicity data are available, this report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility.

  5. SU-F-P-56: On a New Approach to Reconstruct the Patient Dose From Phantom Measurements

    International Nuclear Information System (INIS)

    Bangtsson, E; Vries, W de

    2016-01-01

    Purpose: The development of complex radiation treatment schemes emphasizes the need for advanced QA analysis methods to ensure patient safety. One such tool is the Delta4 DVH Anatomy software, where the patient dose is reconstructed from phantom measurements. Deviations in the measured dose are transferred to the patient anatomy and their clinical impact is evaluated in situ. Results from the original algorithm revealed weaknesses that may introduce artefacts in the reconstructed dose. These can lead to false negatives or obscure the effects of minor dose deviations from delivery failures. Here, we will present results from a new patient dose reconstruction algorithm. Methods: The main steps of the new algorithm are: (1) the dose delivered to a phantom is measured in a number of detector positions. (2) The measured dose is compared to an internally calculated dose distribution evaluated in said positions. The so-obtained dose difference is (3) used to calculate an energy fluence difference. This entity is (4) used as input to a patient dose correction calculation routine. Finally, the patient dose is reconstructed by adding said patient dose correction to the planned patient dose. The internal dose calculation in steps (2) and (4) is based on the Pencil Beam algorithm. Results: The new patient dose reconstruction algorithm has been tested on a number of patients, and the standard metrics dose deviation (DDev), distance-to-agreement (DTA) and Gamma index are improved when compared to the original algorithm. In one case the Gamma index (3%/3 mm) increases from 72.9% to 96.6%. Conclusion: The patient dose reconstruction algorithm is improved. This leads to a reduction in non-physical artefacts in the reconstructed patient dose. As a consequence, the possibility to detect deviations in the dose that is delivered to the patient is improved. An increase in Gamma index for the PTV can be seen. The corresponding author is an employee of ScandiDos.
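
    The four reconstruction steps can be pictured with a toy one-dimensional sketch. The code below is not the Delta4 DVH Anatomy implementation or its Pencil Beam engine; the dose-per-fluence factor, the kernel, and the dose profiles are placeholder assumptions that only mirror the flow of steps (1)-(4).

```python
import numpy as np

def reconstruct_patient_dose(d_meas, d_calc_at_detectors,
                             dose_per_fluence, pencil_kernel, d_planned):
    """Toy 1-D illustration of the four reconstruction steps:
    (1) d_meas: dose measured at the detector positions,
    (2) dose difference against the internally calculated dose,
    (3) conversion of the dose difference to an energy-fluence difference,
    (4) a patient dose correction obtained with a pencil-beam-like kernel,
        added to the planned patient dose."""
    dose_diff = d_meas - d_calc_at_detectors                   # step (2)
    fluence_diff = dose_diff / dose_per_fluence                # step (3)
    dose_correction = np.convolve(fluence_diff, pencil_kernel,
                                  mode="same")                 # step (4)
    return d_planned + dose_correction

# Illustrative inputs (not real detector or plan data)
x = np.linspace(-5, 5, 101)
d_planned = np.exp(-x**2 / 8.0)          # planned patient dose profile
d_calc = d_planned.copy()                # internal recalculation at "detectors"
d_meas = d_calc * 1.03                   # 3% delivery deviation
kernel = np.exp(-np.linspace(-1, 1, 11)**2 / 0.2)
kernel /= kernel.sum()
d_recon = reconstruct_patient_dose(d_meas, d_calc, 1.0, kernel, d_planned)
print(float(d_recon.max()))
```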

  6. SU-F-P-56: On a New Approach to Reconstruct the Patient Dose From Phantom Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Bangtsson, E [ScandiDos, Uppsala (Sweden); Vries, W de [University Medical Center Utrecht, Utrecht (Netherlands)

    2016-06-15

    Purpose: The development of complex radiation treatment schemes emphasizes the need for advanced QA analysis methods to ensure patient safety. One such tool is the Delta4 DVH Anatomy software, where the patient dose is reconstructed from phantom measurements. Deviations in the measured dose are transferred to the patient anatomy and their clinical impact is evaluated in situ. Results from the original algorithm revealed weaknesses that may introduce artefacts in the reconstructed dose. These can lead to false negatives or obscure the effects of minor dose deviations from delivery failures. Here, we will present results from a new patient dose reconstruction algorithm. Methods: The main steps of the new algorithm are: (1) the dose delivered to a phantom is measured in a number of detector positions. (2) The measured dose is compared to an internally calculated dose distribution evaluated in said positions. The so-obtained dose difference is (3) used to calculate an energy fluence difference. This entity is (4) used as input to a patient dose correction calculation routine. Finally, the patient dose is reconstructed by adding said patient dose correction to the planned patient dose. The internal dose calculation in steps (2) and (4) is based on the Pencil Beam algorithm. Results: The new patient dose reconstruction algorithm has been tested on a number of patients, and the standard metrics dose deviation (DDev), distance-to-agreement (DTA) and Gamma index are improved when compared to the original algorithm. In one case the Gamma index (3%/3 mm) increases from 72.9% to 96.6%. Conclusion: The patient dose reconstruction algorithm is improved. This leads to a reduction in non-physical artefacts in the reconstructed patient dose. As a consequence, the possibility to detect deviations in the dose that is delivered to the patient is improved. An increase in Gamma index for the PTV can be seen. The corresponding author is an employee of ScandiDos.

  7. Journal Benchmarking for Strategic Publication Management and for Improving Journal Positioning in the World Ranking Systems

    Science.gov (United States)

    Moskovkin, Vladimir M.; Bocharova, Emilia A.; Balashova, Oksana V.

    2014-01-01

    Purpose: The purpose of this paper is to introduce and develop the methodology of journal benchmarking. Design/Methodology/Approach: The journal benchmarking method is understood to be an analytic procedure of continuous monitoring and comparing of the advance of specific journal(s) against that of competing journals in the same subject area,…

  8. Academic library benchmarking in The Netherlands: a comparative study

    NARCIS (Netherlands)

    Voorbij, H.

    2009-01-01

    Purpose - This paper aims to describe some of the unique features of the Dutch academic library benchmarking system. Design/methodology/approach - The Dutch system is compared with similar projects in the USA, the UK and Germany. Findings - One of the most distinguishing features of the Dutch system

  9. WWER-1000 Burnup Credit Benchmark (CB5)

    International Nuclear Information System (INIS)

    Manolova, M.A.

    2002-01-01

    In the paper, the specification of the first phase (depletion calculations) of the WWER-1000 Burnup Credit Benchmark is given. The second phase, criticality calculations for the WWER-1000 fuel pin cell, will be given after the evaluation of the results obtained in the first phase. The proposed benchmark is a continuation of the WWER benchmark activities in this field. (Author)

  10. Benchmarking and Learning in Public Healthcare

    DEFF Research Database (Denmark)

    Buckmaster, Natalie; Mouritsen, Jan

    2017-01-01

    This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking...

  11. Using Benchmarking To Strengthen the Assessment of Persistence.

    Science.gov (United States)

    McLachlan, Michael S; Zou, Hongyan; Gouin, Todd

    2017-01-03

    Chemical persistence is a key property for assessing chemical risk and chemical hazard. Current methods for evaluating persistence are based on laboratory tests. The relationship between the laboratory based estimates and persistence in the environment is often unclear, in which case the current methods for evaluating persistence can be questioned. Chemical benchmarking opens new possibilities to measure persistence in the field. In this paper we explore how the benchmarking approach can be applied in both the laboratory and the field to deepen our understanding of chemical persistence in the environment and create a firmer scientific basis for laboratory to field extrapolation of persistence test results.

  12. Geothermal Heat Pump Benchmarking Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1997-01-17

    A benchmarking study was conducted on behalf of the Department of Energy to determine the critical factors in successful utility geothermal heat pump programs. A successful program is one that has achieved significant market penetration. Successfully marketing geothermal heat pumps has presented some major challenges to the utility industry. However, select utilities have developed programs that generate significant GHP sales. This benchmarking study concludes that there are three factors critical to the success of utility GHP marketing programs: (1) top management marketing commitment; (2) an understanding of the fundamentals of marketing and business development; and (3) an aggressive competitive posture. To generate significant GHP sales, competitive market forces must be used. However, because utilities have functioned only in a regulated arena, these companies and their leaders are unschooled in competitive business practices. Therefore, a lack of experience coupled with an intrinsically non-competitive culture yields an industry environment that impedes the generation of significant GHP sales in many, but not all, utilities.

  13. The development of code benchmarks

    International Nuclear Information System (INIS)

    Glass, R.E.

    1986-01-01

    Sandia National Laboratories has undertaken a code benchmarking effort to define a series of cask-like problems having both numerical solutions and experimental data. The development of the benchmarks includes: (1) model problem definition, (2) code intercomparison, and (3) experimental verification. The first two steps are complete and a series of experiments is planned. The experiments will examine the elastic/plastic behavior of cylinders for both the end and side impacts resulting from a nine-meter drop. The cylinders will be made from stainless steel and aluminum to give a range of plastic deformations. This paper presents the results of analyses simulating the models' behavior using materials properties for stainless steel and aluminum.

  14. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Closed-loop neuromorphic benchmarks

    CSIR Research Space (South Africa)

    Stewart, TC

    2015-11-01

    Full Text Available. Terrence C. Stewart, Travis DeWolf, Chris Eliasmith (Centre for Theoretical Neuroscience, University of Waterloo, Waterloo, ON, Canada) and Ashley Kleinhans (Mobile Intelligent Autonomous Systems group, Council for Scientific and Industrial Research, Pretoria, South Africa). Submitted to Frontiers in Neuroscience. Correspondence: Terrence C. Stewart.

  16. Investible benchmarks & hedge fund liquidity

    OpenAIRE

    Freed, Marc S; McMillan, Ben

    2011-01-01

    A lack of commonly accepted benchmarks for hedge fund performance has permitted hedge fund managers to attribute to skill returns that may actually accrue from market risk factors and illiquidity. Recent innovations in hedge fund replication permit us to estimate the extent of this misattribution. Using an option-based model, we find evidence that the value of liquidity options that investors implicitly grant managers when they invest may account for part or even all of hedge fund returns. ...

  17. Anticoagulation Endpoints with Clinical Implementation of Warfarin Pharmacogenetic Dosing in a Real-World Setting – A Proposal for a New Pharmacogenetic Dosing Approach

    Science.gov (United States)

    Arwood, Meghan J.; Deng, Jiexin; Drozda, Katarzyna; Pugach, Oksana; Nutescu, Edith A.; Schmidt, Stephan; Duarte, Julio D.; Cavallari, Larisa H.

    2016-01-01

    Achieving therapeutic anticoagulation efficiently with warfarin is important to reduce thrombotic and bleeding risks and is influenced by genotype. Utilizing data from a diverse population of 257 patients who received VKORC1 and CYP2C9 genotype-guided warfarin dosing, we aimed to examine genotype-associated differences in anticoagulation endpoints and derive a novel pharmacogenetic nomogram to more optimally dose warfarin. We observed significant differences across patients with 0, 1, or ≥2 reduced-function VKORC1 or CYP2C9 alleles, respectively, in time to achieve therapeutic international normalized ratio (INR) (7.8±5.8, 7.2±4.7, and 5.4±4.6 days, P=0.0004) and mean percentage of time in therapeutic range in the first 28 days (22.2, 27.8, and 32.2%, P=0.0127) with use of existing pharmacogenetic algorithms. These data suggest that more aggressive dosing is necessary for patients with 0 to 1 VKORC1/CYP2C9 variants to more efficiently achieve therapeutic anticoagulation. Herein, we provide a novel kinetic/pharmacodynamic-derived dosing nomogram optimized for a heterogeneous patient population. PMID:28032893

  18. REVISED STREAM CODE AND WASP5 BENCHMARK

    International Nuclear Information System (INIS)

    Chen, K

    2005-01-01

    STREAM is an emergency response code that predicts downstream pollutant concentrations for releases from the SRS area to the Savannah River. The STREAM code uses an algebraic equation to approximate the solution of the one-dimensional advective transport differential equation. This approach generates spurious oscillations in the concentration profile when modeling long-duration releases. To improve the capability of the STREAM code to model long-term releases, its calculation module was replaced by the WASP5 code. WASP5 is a US EPA water quality analysis program that simulates one-dimensional pollutant transport through surface water. Test cases were performed to compare the revised version of STREAM with the existing version. For continuous releases, results predicted by the revised STREAM code agree with physical expectations. The WASP5 code was benchmarked against the US EPA 1990 and 1991 dye tracer studies, in which the transport of the dye was measured from its release at the New Savannah Bluff Lock and Dam downstream to Savannah. The peak concentrations predicted by WASP5 agreed with the measurements within ±20.0%. The transport times of the dye concentration peak predicted by WASP5 agreed with the measurements within ±3.6%. These benchmarking results demonstrate that STREAM should be capable of accurately modeling releases from SRS outfalls.
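
    The underlying problem both codes address is one-dimensional advective transport of a pollutant. The sketch below is a generic first-order upwind finite-difference illustration of that equation, with optional first-order decay; it is not the STREAM or WASP5 numerics, and the reach length, velocity, and time step are placeholder assumptions.

```python
import numpy as np

def advect_decay(c0, velocity, dx, dt, n_steps, decay=0.0):
    """Explicit first-order upwind scheme for one-dimensional advective
    transport with optional first-order decay:
        dC/dt + u * dC/dx = -lambda * C
    Stable for Courant number u*dt/dx <= 1."""
    c = c0.copy()
    courant = velocity * dt / dx
    assert 0.0 < courant <= 1.0, "reduce dt or increase dx"
    for _ in range(n_steps):
        c[1:] = c[1:] - courant * (c[1:] - c[:-1]) - decay * dt * c[1:]
        c[0] = 0.0  # no further release at the upstream boundary
    return c

# Illustrative pulse release propagating downstream (units arbitrary)
x = np.arange(0.0, 100_000.0, 500.0)         # 100 km reach, 500 m cells
c0 = np.where(x < 2_000.0, 1.0, 0.0)         # initial slug near the outfall
c = advect_decay(c0, velocity=0.5, dx=500.0, dt=600.0, n_steps=288)
print(float(c.max()), int(np.argmax(c)))     # peak value and its cell index
```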

  19. Towards benchmarking an in-stream water quality model

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available A method of model evaluation is presented which utilises a comparison with a benchmark model. The proposed benchmarking concept is one that can be applied to many hydrological models but, in this instance, is implemented in the context of an in-stream water quality model. The benchmark model is defined in such a way that it is easily implemented within the framework of the test model, i.e. the approach relies on two applications of the same model code rather than the application of two separate model codes. This is illustrated using two case studies from the UK, the Rivers Aire and Ouse, with the objective of simulating a water quality classification, general quality assessment (GQA), which is based on dissolved oxygen, biochemical oxygen demand and ammonium. Comparisons between the benchmark and test models are made based on GQA, as well as a step-wise assessment against the components required in its derivation. The benchmarking process yields a great deal of important information about the performance of the test model and raises issues about a priori definition of the assessment criteria.

  20. Developing a benchmark for emotional analysis of music.

    Science.gov (United States)

    Aljanaki, Anna; Yang, Yi-Hsuan; Soleymani, Mohammad

    2017-01-01

    The music emotion recognition (MER) field has expanded rapidly in the last decade. Many new methods and new audio features have been developed to improve the performance of MER algorithms. However, it is very difficult to compare the performance of the new methods because of the diversity of data representations and the scarcity of publicly available data. In this paper, we address these problems by creating a data set and a benchmark for MER. The data set that we release, the MediaEval Database for Emotional Analysis in Music (DEAM), is the largest available data set of dynamic annotations (valence and arousal annotations for 1,802 songs and song excerpts licensed under Creative Commons, with 2 Hz time resolution). Using DEAM, we organized the 'Emotion in Music' task at the MediaEval Multimedia Evaluation Campaign from 2013 to 2015. The benchmark attracted, in total, 21 active teams to participate in the challenge. We analyze the results of the benchmark: the winning algorithms and feature sets. We also describe the design of the benchmark, the evaluation procedures, and the data cleaning and transformations that we suggest. The results from the benchmark suggest that recurrent neural network based approaches combined with large feature sets work best for dynamic MER.

  1. Building America Research Benchmark Definition, Updated December 15, 2006

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.

    2007-01-01

    To track progress toward aggressive multi-year whole-house energy savings goals of 40-70% and onsite power production of up to 30%, DOE's Residential Buildings Program and NREL developed the Building America Research Benchmark in consultation with the Building America industry teams. The Benchmark is generally consistent with mid-1990s standard practice, as reflected in the Home Energy Rating System (HERS) Technical Guidelines (RESNET 2002), with additional definitions that allow the analyst to evaluate all residential end-uses, an extension of the traditional HERS rating approach that focuses on space conditioning and hot water. Unlike the reference homes used for HERS, EnergyStar, and most energy codes, the Benchmark represents typical construction at a fixed point in time so it can be used as the basis for Building America's multi-year energy savings goals without the complication of chasing a 'moving target'.

  2. Building America Research Benchmark Definition: Updated December 20, 2007

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.

    2008-01-01

    To track progress toward aggressive multi-year whole-house energy savings goals of 40-70% and onsite power production of up to 30%, DOE's Residential Buildings Program and NREL developed the Building America Research Benchmark in consultation with the Building America industry teams. The Benchmark is generally consistent with mid-1990s standard practice, as reflected in the Home Energy Rating System (HERS) Technical Guidelines (RESNET 2002), with additional definitions that allow the analyst to evaluate all residential end-uses, an extension of the traditional HERS rating approach that focuses on space conditioning and hot water. Unlike the reference homes used for HERS, EnergyStar, and most energy codes, the Benchmark represents typical construction at a fixed point in time so it can be used as the basis for Building America's multi-year energy savings goals without the complication of chasing a 'moving target'.

  3. Building America Research Benchmark Definition: Updated August 15, 2007

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.

    2007-09-01

    To track progress toward aggressive multi-year whole-house energy savings goals of 40-70% and onsite power production of up to 30%, DOE's Residential Buildings Program and NREL developed the Building America Research Benchmark in consultation with the Building America industry teams. The Benchmark is generally consistent with mid-1990s standard practice, as reflected in the Home Energy Rating System (HERS) Technical Guidelines (RESNET 2002), with additional definitions that allow the analyst to evaluate all residential end-uses, an extension of the traditional HERS rating approach that focuses on space conditioning and hot water. Unlike the reference homes used for HERS, EnergyStar, and most energy codes, the Benchmark represents typical construction at a fixed point in time so it can be used as the basis for Building America's multi-year energy savings goals without the complication of chasing a 'moving target'.

  4. Benchmarking motion planning algorithms for bin-picking applications

    DEFF Research Database (Denmark)

    Iversen, Thomas Fridolin; Ellekilde, Lars-Peter

    2017-01-01

    Purpose For robot motion planning there exists a large number of different algorithms, each appropriate for a certain domain, and the right choice of planner depends on the specific use case. The purpose of this paper is to consider the application of bin picking and benchmark a set of motion planning algorithms to identify which are most suited in the given context. Design/methodology/approach The paper presents a selection of motion planning algorithms and defines benchmarks based on three different bin-picking scenarios. The evaluation is done based on a fixed set of tasks, which are planned and executed on a real and a simulated robot. Findings The benchmarking shows a clear difference between the planners and generally indicates that algorithms integrating optimization, despite longer planning time, perform better due to a faster execution. Originality/value The originality of this work lies...

  5. Local implementation of the Essence of Care benchmarks.

    Science.gov (United States)

    Jones, Sue

    To understand clinical practice benchmarking from the perspective of nurses working in a large acute NHS trust and to determine whether the nurses perceived that their commitment to Essence of Care led to improvements in care, the factors that influenced their role in the process and the organisational factors that influenced benchmarking. An ethnographic case study approach was adopted. Six themes emerged from the data. Two organisational issues emerged: leadership and the values and/or culture of the organisation. The findings suggested that the leadership ability of the Essence of Care link nurses and the value placed on this work by the organisation were key to the success of benchmarking. A model for successful implementation of the Essence of Care is proposed based on the findings of this study, which lends itself to testing by other organisations.

  6. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  7. HS06 benchmark for an ARM server

    International Nuclear Information System (INIS)

    Kluth, Stefan

    2014-01-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  8. Gamma ray benchmark on the spent fuel shipping cask TN 12

    International Nuclear Information System (INIS)

    Blum, P.; Cagnon, R.; Cladel, C.; Ermont, G.; Nimal, J.C.

    1983-05-01

    The purpose of this benchmark is to compare measurements and calculations of gamma-ray dose rates around a shipping cask loaded with 12 spent fuel elements of FESSENHEIM PWR type. The benchmark provides a means to verify gamma-ray sources and gamma-ray transport calculation methods in shipping cask configurations. The comparison between measurements and calculations shows good agreement except near the fuel element top, where the discrepancy reaches a factor of 2.

  9. ICSBEP-2007, International Criticality Safety Benchmark Experiment Handbook

    International Nuclear Information System (INIS)

    Blair Briggs, J.

    2007-01-01

    1 - Description: The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the United States Department of Energy. The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) is now an official activity of the Organisation for Economic Co-operation and Development - Nuclear Energy Agency (OECD-NEA). This handbook contains criticality safety benchmark specifications that have been derived from experiments that were performed at various nuclear critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculational techniques used to establish minimum subcritical margins for operations with fissile material. The example calculations presented do not constitute a validation of the codes or cross section data. The work of the ICSBEP is documented as an International Handbook of Evaluated Criticality Safety Benchmark Experiments. Currently, the handbook spans over 42,000 pages and contains 464 evaluations representing 4,092 critical, near-critical, or subcritical configurations, 21 criticality alarm placement/shielding configurations with multiple dose points for each, and 46 configurations that have been categorized as fundamental physics measurements that are relevant to criticality safety applications. The handbook is intended for use by criticality safety analysts to perform necessary validations of their calculational techniques and is expected to be a valuable tool for decades to come. The ICSBEP Handbook is available on DVD. You may request a DVD by completing the DVD Request Form on the internet. Access to the Handbook on the Internet requires a password. You may request a password by completing the Password Request Form. The Web address is: http://icsbep.inel.gov/handbook.shtml 2 - Method of solution: Experiments that are found

  10. An approach to calculating absorbed doses to organs of high radiation sensitivity in diagnostic radioisotope examinations in vivo

    International Nuclear Information System (INIS)

    Staniszewska, M.A.; Jankowski, J.

    1984-01-01

    A method is presented for calculating doses from internal exposures, covering both source organs and target organs. Variations in the absorbed doses depending on the sex and age of the patients investigated with the use of radionuclides are discussed. Definitions of the effective and collective dose equivalents are also given. 8 refs., 1 tab. (author)

  11. Argonne Code Center: Benchmark problem book.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1977-06-01

    This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. This is the second supplement of the original benchmark book, which was first published in February, 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, which was published in December, 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement, Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February, 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below, followed by the contributors to the earlier editions of the benchmark book.

  12. Benchmarks

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map (DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  13. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Nowell, Lisa H., E-mail: lhnowell@usgs.gov [U.S. Geological Survey, California Water Science Center, Placer Hall, 6000 J Street, Sacramento, CA 95819 (United States); Norman, Julia E., E-mail: jnorman@usgs.gov [U.S. Geological Survey, Oregon Water Science Center, 2130 SW 5th Avenue, Portland, OR 97201 (United States); Ingersoll, Christopher G., E-mail: cingersoll@usgs.gov [U.S. Geological Survey, Columbia Environmental Research Center, 4200 New Haven Road, Columbia, MO 65021 (United States); Moran, Patrick W., E-mail: pwmoran@usgs.gov [U.S. Geological Survey, Washington Water Science Center, 934 Broadway, Suite 300, Tacoma, WA 98402 (United States)

    2016-04-15

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical/chemical characteristics
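
    The screening step described above, computing concentration-to-benchmark quotients for each detected pesticide, summing them as a mixture indicator, and checking for single-benchmark exceedances, can be sketched in a few lines. The compound names, concentrations, and benchmark values below are placeholders, not values from the cited study.

```python
def benchmark_screen(detections, benchmarks):
    """Compute concentration/benchmark quotients for all detected pesticides,
    their sum (an indicator of potential mixture toxicity), and whether any
    single benchmark is exceeded. Concentrations and benchmarks share units."""
    quotients = {cmp: conc / benchmarks[cmp]
                 for cmp, conc in detections.items() if cmp in benchmarks}
    return quotients, sum(quotients.values()), any(q > 1.0 for q in quotients.values())

# Hypothetical sediment sample and Threshold Effect Benchmarks (placeholders)
sample = {"bifenthrin": 4.0, "chlorpyrifos": 1.5, "fipronil": 0.8}
teb = {"bifenthrin": 10.0, "chlorpyrifos": 5.0, "fipronil": 4.0}
quotients, quotient_sum, any_exceedance = benchmark_screen(sample, teb)
print(quotients, quotient_sum, any_exceedance)
```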

  14. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides

    International Nuclear Information System (INIS)

    Nowell, Lisa H.; Norman, Julia E.; Ingersoll, Christopher G.; Moran, Patrick W.

    2016-01-01

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical/chemical characteristics

  15. International handbook of evaluated criticality safety benchmark experiments

    International Nuclear Information System (INIS)

    2010-01-01

    The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the United States Department of Energy. The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) became an official activity of the Organization for Economic Cooperation and Development - Nuclear Energy Agency (OECD-NEA) in 1995. This handbook contains criticality safety benchmark specifications that have been derived from experiments performed at various nuclear critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculational techniques used to establish minimum subcritical margins for operations with fissile material and to determine criticality alarm requirement and placement. Many of the specifications are also useful for nuclear data testing. Example calculations are presented; however, these calculations do not constitute a validation of the codes or cross section data. The evaluated criticality safety benchmark data are given in nine volumes. These volumes span over 55,000 pages and contain 516 evaluations with benchmark specifications for 4,405 critical, near critical, or subcritical configurations, 24 criticality alarm placement / shielding configurations with multiple dose points for each, and 200 configurations that have been categorized as fundamental physics measurements that are relevant to criticality safety applications. Experiments that are found unacceptable for use as criticality safety benchmark experiments are discussed in these evaluations; however, benchmark specifications are not derived for such experiments (in some cases models are provided in an appendix). Approximately 770 experimental configurations are categorized as unacceptable for use as criticality safety benchmark experiments. Additional evaluations are in progress and will be

  16. Refined hazard characterization of 3-MCPD using benchmark dose modeling

    NARCIS (Netherlands)

    Rietjens, I.M.C.M.; Scholz, G.; Berg, van den I.; Schilter, B.; Slob, W.

    2012-01-01

    3-Monochloropropane-1,2-diol (3-MCPD) esters represent a newly identified class of food-borne process contaminants of possible health concern. Due to hydrolysis, 3-MCPD esters constitute a potentially significant source of free 3-MCPD exposure, and their preliminary risk assessment was based on
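
    Benchmark dose modeling of the kind named in this record generally involves fitting a dose-response model and solving for the dose that produces a pre-specified benchmark response. The sketch below is a generic illustration using a one-hit-style quantal model, hypothetical data, and a 10% extra-risk benchmark response; it is not the model, endpoint, or data used in the cited assessment.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical quantal dose-response data (dose, number affected, group size)
doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0])
affected = np.array([1, 2, 5, 12, 28])
n_animals = np.array([50, 50, 50, 50, 50])

def neg_log_lik(params):
    """Binomial negative log-likelihood for p(d) = p0 + (1-p0)*(1-exp(-b*d))."""
    p0, b = params
    if not (0.0 < p0 < 1.0 and b > 0.0):
        return np.inf
    p = p0 + (1.0 - p0) * (1.0 - np.exp(-b * doses))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(affected * np.log(p) + (n_animals - affected) * np.log(1.0 - p))

fit = minimize(neg_log_lik, x0=np.array([0.05, 0.05]), method="Nelder-Mead")
p0_hat, b_hat = fit.x
bmr = 0.10                           # benchmark response: 10% extra risk
bmd = -np.log(1.0 - bmr) / b_hat     # dose giving 10% extra risk over background
print(f"BMD10 = {bmd:.2f} (same units as dose)")
```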

  17. A benchmarking program to reduce red blood cell outdating: implementation, evaluation, and a conceptual framework.

    Science.gov (United States)

    Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M

    2015-07-01

    Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.

  18. Using the fuzzy linear regression method to benchmark the energy efficiency of commercial buildings

    International Nuclear Information System (INIS)

    Chung, William

    2012-01-01

    Highlights: ► Fuzzy linear regression method is used for developing benchmarking systems. ► The systems can be used to benchmark energy efficiency of commercial buildings. ► The resulting benchmarking model can be used by public users. ► The resulting benchmarking model can capture the fuzzy nature of input–output data. -- Abstract: Benchmarking systems from a sample of reference buildings need to be developed to conduct benchmarking processes for the energy efficiency of commercial buildings. However, not all benchmarking systems can be adopted by public users (i.e., other non-reference building owners) because of the different methods in developing such systems. An approach for benchmarking the energy efficiency of commercial buildings using statistical regression analysis to normalize other factors, such as management performance, was developed in a previous work. However, the field data given by experts can be regarded as a distribution of possibility. Thus, the previous work may not be adequate to handle such fuzzy input–output data. Consequently, a number of fuzzy structures cannot be fully captured by statistical regression analysis. This present paper proposes the use of fuzzy linear regression analysis to develop a benchmarking process, the resulting model of which can be used by public users. An illustrative example is given as well.
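
    Fuzzy linear regression can be set up as a linear programme. The sketch below follows one common formulation (Tanaka-style possibilistic regression with symmetric triangular fuzzy coefficients), which may differ from the benchmarking model developed in the cited paper; the floor-area and management-score inputs and all numbers are placeholders, not survey data.

```python
import numpy as np
from scipy.optimize import linprog

def tanaka_fuzzy_regression(X, y, h=0.5):
    """Possibilistic (Tanaka-style) fuzzy linear regression by linear
    programming: find centre coefficients a and non-negative spreads c of
    symmetric triangular fuzzy coefficients such that every observation is
    covered at level h, while the total spread is minimised."""
    X = np.column_stack([np.ones(len(X)), X])      # add intercept column
    n, p = X.shape
    absX = np.abs(X)
    # Decision variables: z = [a (p, free), c (p, >= 0)]
    cost = np.concatenate([np.zeros(p), absX.sum(axis=0)])
    # Coverage constraints written as A_ub @ z <= b_ub:
    #   X a + (1-h)|X| c >= y  ->  -X a - (1-h)|X| c <= -y
    #   X a - (1-h)|X| c <= y
    A_ub = np.vstack([np.hstack([-X, -(1 - h) * absX]),
                      np.hstack([X, -(1 - h) * absX])])
    b_ub = np.concatenate([-y, y])
    bounds = [(None, None)] * p + [(0, None)] * p
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p], res.x[p:]                    # centres, spreads

# Hypothetical data: energy-use intensity vs. floor area and a management score
X = np.array([[2.0, 3.0], [3.5, 2.0], [5.0, 4.0], [6.0, 3.5], [8.0, 5.0]])
y = np.array([120.0, 150.0, 170.0, 200.0, 230.0])
centres, spreads = tanaka_fuzzy_regression(X, y)
print("centre coefficients:", centres)
print("spread coefficients:", spreads)
```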

  19. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides

    Science.gov (United States)

    Nowell, Lisa H.; Norman, Julia E.; Ingersoll, Christopher G.; Moran, Patrick W.

    2016-01-01

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical

  20. Supply network configuration—A benchmarking problem

    Science.gov (United States)

    Brandenburg, Marcus

    2018-03-01

    Managing supply networks is a highly relevant task that strongly influences the competitiveness of firms from various industries. Designing supply networks is a strategic process that considerably affects the structure of the whole network. In contrast, supply networks for new products are configured without major adaptations of the existing structure, but the network has to be configured before the new product is actually launched in the marketplace. Due to dynamics and uncertainties, the resulting planning problem is highly complex. However, formal models and solution approaches that support supply network configuration decisions for new products are scant. The paper at hand aims at stimulating related model-based research. To formulate mathematical models and solution procedures, a benchmarking problem is introduced which is derived from a case study of a cosmetics manufacturer. Tasks, objectives, and constraints of the problem are described in great detail and numerical values and ranges of all problem parameters are given. In addition, several directions for future research are suggested.

  1. Benchmarking the Cost per Person of Mass Treatment for Selected Neglected Tropical Diseases: An Approach Based on Literature Review and Meta-regression with Web-Based Software Application.

    Directory of Open Access Journals (Sweden)

    Christopher Fitzpatrick

    2016-12-01

    Full Text Available Advocacy around mass treatment for the elimination of selected Neglected Tropical Diseases (NTDs) has typically put the cost per person treated at less than US$ 0.50. Whilst useful for advocacy, the focus on a single number misrepresents the complexity of delivering "free" donated medicines to about a billion people across the world. We perform a literature review and meta-regression of the cost per person per round of mass treatment against NTDs. We develop a web-based software application (https://healthy.shinyapps.io/benchmark/) to calculate setting-specific unit costs against which programme budgets and expenditures or results-based pay-outs can be benchmarked. We reviewed costing studies of mass treatment for the control, elimination or eradication of lymphatic filariasis, schistosomiasis, soil-transmitted helminthiasis, onchocerciasis, trachoma and yaws. These are the main 6 NTDs for which mass treatment is recommended. We extracted financial and economic unit costs, adjusted to a standard definition and base year. We regressed unit costs on the number of people treated and other explanatory variables. Regression results were used to "predict" country-specific unit cost benchmarks. We reviewed 56 costing studies and included in the meta-regression 34 studies from 23 countries and 91 sites. Unit costs were found to be very sensitive to economies of scale, and the decision of whether or not to use local volunteers. Financial unit costs are expected to be less than 2015 US$ 0.50 in most countries for programmes that treat 100 thousand people or more. However, for smaller programmes, including those in the "last mile", or those that cannot rely on local volunteers, both economic and financial unit costs are expected to be higher. The available evidence confirms that mass treatment offers a low cost public health intervention on the path towards universal health coverage. However, more costing studies focussed on elimination are needed. Unit cost
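
    The study's actual meta-regression specification is not given in the abstract; as a rough sketch under assumed covariates (log of people treated for economies of scale, plus a volunteer indicator), a setting-specific unit-cost benchmark could be predicted like this, with invented data.

```python
# Rough sketch of a unit-cost meta-regression; the data are invented and the
# covariates (log people treated, use of local volunteers) are assumptions
# inspired by the abstract, not the study's actual specification.
import numpy as np

# Hypothetical sites: people treated per round, volunteer use (1/0), unit cost (US$)
treated    = np.array([20e3, 50e3, 100e3, 250e3, 500e3, 1e6])
volunteers = np.array([0, 0, 1, 1, 1, 1])
unit_cost  = np.array([1.20, 0.90, 0.55, 0.45, 0.35, 0.30])

X = np.column_stack([np.ones(len(treated)), np.log(treated), volunteers])
beta, *_ = np.linalg.lstsq(X, np.log(unit_cost), rcond=None)

def predicted_benchmark(n_treated, uses_volunteers):
    """Setting-specific unit-cost benchmark implied by the fitted regression."""
    xb = beta[0] + beta[1] * np.log(n_treated) + beta[2] * uses_volunteers
    return np.exp(xb)

print(f"benchmark for 100k treated with volunteers: ${predicted_benchmark(1e5, 1):.2f} per person")
print(f"benchmark for 10k treated, no volunteers:   ${predicted_benchmark(1e4, 0):.2f} per person")
```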

  2. A Generalized Approach to Model the Spectra and Radiation Dose Rate of Solar Particle Events on the Surface of Mars

    Science.gov (United States)

    Guo, Jingnan; Zeitlin, Cary; Wimmer-Schweingruber, Robert F.; McDole, Thoren; Kühl, Patrick; Appel, Jan C.; Matthiä, Daniel; Krauss, Johannes; Köhler, Jan

    2018-01-01

    For future human missions to Mars, it is important to study the surface radiation environment during extreme and elevated conditions. In the long term, it is mainly galactic cosmic rays (GCRs) modulated by solar activity that contribute to the radiation on the surface of Mars, but intense solar energetic particle (SEP) events may induce acute health effects. Such events may enhance the radiation level significantly and should be detected as promptly as possible to prevent severe damage to humans and equipment. However, the energetic particle environment on the Martian surface is significantly different from that in deep space due to the influence of the Martian atmosphere. Depending on the intensity and shape of the original solar particle spectra, as well as particle types, the surface spectra may induce entirely different radiation effects. In order to give immediate and accurate alerts while avoiding unnecessary ones, it is important to model and understand well the atmospheric effect on the incoming SEPs, including both protons and helium ions. In this paper, we have developed a generalized approach to quickly model the surface response of any given incoming proton/helium ion spectra and have applied it to a set of large historical solar events, thus providing insights into the possible variety of surface radiation environments that may be induced during SEP events. Based on the statistical study of more than 30 significant solar events, we have obtained an empirical model for estimating the surface dose rate directly from the intensities of a power-law SEP spectrum.

  3. NASA Software Engineering Benchmarking Effort

    Science.gov (United States)

    Godfrey, Sally; Rarick, Heather

    2012-01-01

    Benchmarking was very interesting and provided a wealth of information (1) We did see potential solutions to some of our "top 10" issues (2) We have an assessment of where NASA stands with relation to other aerospace/defense groups We formed new contacts and potential collaborations (1) Several organizations sent us examples of their templates, processes (2) Many of the organizations were interested in future collaboration: sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/ partners (1) Desires to participate in our training; provide feedback on procedures (2) Welcomed opportunity to provide feedback on working with NASA

  4. NEACRP thermal fission product benchmark

    International Nuclear Information System (INIS)

    Halsall, M.J.; Taubman, C.J.

    1989-09-01

    The objective of the thermal fission product benchmark was to compare the range of fission product data in use at the present time. A simple homogeneous problem was set with 200 atoms H/1 atom U235, to be burnt up to 1000 days and then decay for 1000 days. The problem was repeated with 200 atoms H/1 atom Pu239, 20 atoms H/1 atom U235 and 20 atoms H/1 atom Pu239. There were ten participants and the submissions received are detailed in this report. (author)

  5. Benchmark neutron porosity log calculations

    International Nuclear Information System (INIS)

    Little, R.C.; Michael, M.; Verghese, K.; Gardner, R.P.

    1989-01-01

    Calculations have been made for a benchmark neutron porosity log problem with the general purpose Monte Carlo code MCNP and the specific purpose Monte Carlo code McDNL. For accuracy and timing comparison purposes the CRAY XMP and MicroVax II computers have been used with these codes. The CRAY has been used for an analog version of the MCNP code while the MicroVax II has been used for the optimized variance reduction versions of both codes. Results indicate that the two codes give the same results within calculated standard deviations. Comparisons are given and discussed for accuracy (precision) and computation times for the two codes

  6. Benchmarking routine psychological services: a discussion of challenges and methods.

    Science.gov (United States)

    Delgadillo, Jaime; McMillan, Dean; Leach, Chris; Lucock, Mike; Gilbody, Simon; Wood, Nick

    2014-01-01

    Policy developments in recent years have led to important changes in the level of access to evidence-based psychological treatments. Several methods have been used to investigate the effectiveness of these treatments in routine care, with different approaches to outcome definition and data analysis. To present a review of challenges and methods for the evaluation of evidence-based treatments delivered in routine mental healthcare. This is followed by a case example of a benchmarking method applied in primary care. High, average and poor performance benchmarks were calculated through a meta-analysis of published data from services working under the Improving Access to Psychological Therapies (IAPT) Programme in England. Pre-post treatment effect sizes (ES) and confidence intervals were estimated to illustrate a benchmarking method enabling services to evaluate routine clinical outcomes. High, average and poor performance ES for routine IAPT services were estimated to be 0.91, 0.73 and 0.46 for depression (using PHQ-9) and 1.02, 0.78 and 0.52 for anxiety (using GAD-7). Data from one specific IAPT service exemplify how to evaluate and contextualize routine clinical performance against these benchmarks. The main contribution of this report is to summarize key recommendations for the selection of an adequate set of psychometric measures, the operational definition of outcomes, and the statistical evaluation of clinical performance. A benchmarking method is also presented, which may enable a robust evaluation of clinical performance against national benchmarks. Some limitations concerned significant heterogeneity among data sources, and wide variations in ES and data completeness.
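
    The published benchmarks quoted above (for example 0.91/0.73/0.46 for PHQ-9 depression) can be used to contextualise a service's own pre-post effect size. The sketch below uses invented score data and a common normal-approximation confidence interval, not necessarily the exact formulas used in the paper.

```python
# Minimal sketch: pre-post effect size for one service, compared with the
# published IAPT depression benchmarks (0.91 high, 0.73 average, 0.46 poor).
import numpy as np

rng = np.random.default_rng(0)
pre  = rng.normal(16.0, 5.5, size=200)                       # hypothetical PHQ-9 scores at intake
post = np.clip(pre - rng.normal(6.0, 5.0, size=200), 0, 27)  # hypothetical scores after treatment

n = len(pre)
es = (pre.mean() - post.mean()) / pre.std(ddof=1)   # uncontrolled pre-post effect size

# Approximate 95% CI for a repeated-measures effect size (normal approximation)
r = np.corrcoef(pre, post)[0, 1]
se = np.sqrt(2 * (1 - r) / n + es**2 / (2 * n))
ci = (es - 1.96 * se, es + 1.96 * se)

benchmarks = {"high": 0.91, "average": 0.73, "poor": 0.46}
print(f"service ES = {es:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
for label, bm in benchmarks.items():
    print(f"  vs {label} benchmark {bm:.2f}: {'above' if es >= bm else 'below'}")
```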

  7. Reconstructing Organophosphorus Pesticide Doses Using the Reversed Dosimetry Approach in a Simple Physiologically-Based Pharmacokinetic Model

    Directory of Open Access Journals (Sweden)

    Chensheng Lu

    2012-01-01

    Full Text Available We illustrated the development of a simple pharmacokinetic (SPK) model aiming to estimate the absorbed chlorpyrifos doses using urinary biomarker data, 3,5,6-trichlorpyridinol, as the model input. The effectiveness of the SPK model in the pesticide risk assessment was evaluated by comparing dose estimates using different urinary composite data. The dose estimates resulting from the first morning voids appeared to be lower than but not significantly different from those using before bedtime, lunch or dinner voids. We found a similar trend for dose estimates using three different urinary composite data. However, the dose estimates using the SPK model for individual children were significantly higher than those from the conventional physiologically based pharmacokinetic (PBPK) modeling using aggregate environmental measurements of chlorpyrifos as the model inputs. The use of urinary data in the SPK model intuitively provided a plausible alternative to the conventional PBPK model in reconstructing the absorbed chlorpyrifos dose.
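
    The SPK model itself is not specified in the abstract; purely to illustrate the reverse-dosimetry idea, the sketch below back-calculates an absorbed chlorpyrifos dose from daily urinary TCPy excretion under a steady-state mass-balance assumption. The molecular weights are standard values, but the excretion fraction and the example inputs are placeholders.

```python
# Illustrative reverse dosimetry under a steady-state mass-balance assumption.
# Parameter values (excretion fraction, body weight, urine volume) are placeholders.
MW_CPF  = 350.6   # g/mol, chlorpyrifos
MW_TCPY = 198.4   # g/mol, 3,5,6-trichloro-2-pyridinol (TCPy)

def absorbed_dose_ug_per_kg_day(tcpy_ug_per_L, urine_L_per_day,
                                body_weight_kg, f_urinary_excretion=0.7):
    """Back-calculate absorbed chlorpyrifos dose from urinary TCPy.

    Assumes steady state: daily TCPy excreted (in moles) equals
    f_urinary_excretion times the daily absorbed chlorpyrifos (in moles).
    """
    tcpy_ug_day = tcpy_ug_per_L * urine_L_per_day
    cpf_ug_day = tcpy_ug_day * (MW_CPF / MW_TCPY) / f_urinary_excretion
    return cpf_ug_day / body_weight_kg

# Hypothetical child: 8 ug/L TCPy, 0.6 L urine/day, 25 kg body weight
print(f"{absorbed_dose_ug_per_kg_day(8.0, 0.6, 25.0):.2f} ug/kg/day")
```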

  8. Global radiation damage at 300 and 260 K with dose rates approaching 1 MGy s⁻¹

    Energy Technology Data Exchange (ETDEWEB)

    Warkentin, Matthew; Badeau, Ryan; Hopkins, Jesse B. [Cornell University, Ithaca, NY 14853 (United States); Mulichak, Anne M.; Keefe, Lisa J. [Argonne National Laboratory, Argonne, IL 60439 (United States); Thorne, Robert E., E-mail: ret6@cornell.edu [Cornell University, Ithaca, NY 14853 (United States)

    2012-02-01

    Approximately half of global radiation damage to thaumatin crystals can be outrun at 260 K if data are collected in less than 1 s. Global radiation damage to 19 thaumatin crystals has been measured using dose rates from 3 to 680 kGy s⁻¹. At room temperature damage per unit dose appears to be roughly independent of dose rate, suggesting that the timescales for important damage processes are less than ∼1 s. However, at T = 260 K approximately half of the global damage manifested at dose rates of ∼10 kGy s⁻¹ can be outrun by collecting data at 680 kGy s⁻¹. Appreciable sample-to-sample variability in global radiation sensitivity at fixed dose rate is observed. This variability cannot be accounted for by errors in dose calculation, crystal slippage or the size of the data sets in the assay.

  9. Biostatistical approaches for modeling U-shaped dose-response curves and study design considerations in assessing the biological effects of low doses

    International Nuclear Information System (INIS)

    Downs, T.

    1992-01-01

    The demonstration of hormetic effects is rendered difficult for a number of reasons: The spontaneous rate must be large enough for a difference to be detectable. In contrast with detrimental effects, there is a limited range of doses over which beneficial effects are likely to be found. Publication bias hampers publication of low-dose beneficial effects and discourages research in the area. Some scientists actually believe that hormetic effects are contrary to reason. All these factors contribute to lessen the chances of detecting hormetic effects through synthesis of the scientific literature. The extra statistical power obtained from mathematical modeling is not available for hormetic studies when appropriate models are not available. Even a simple statistical device such as a test for linear trend does not work well for U-shaped data. The first part of this two-part chapter deals with the probabilities of determining qualitatively what kinds of health effects may result from exposures to substances, and the second part with characterizing quantitative relationships between such health effects and exposures. The health effects may be beneficial in some situations, and detrimental in others.

  10. New approach to the approximation of «dose – effect» dependence during the human somatic cells irradiation

    Directory of Open Access Journals (Sweden)

    V. F. Chekhun

    2013-09-01

    Full Text Available New results are reported on the approximation of the experimental cytogenetic "dose–effect" dependence using a spline regression model, which improves the biological dosimetry of human radiation exposure. This is achieved by reducing the error in the determination of the absorbed dose compared with the traditional use of linear and linear-quadratic models, and it makes it possible to predict the behaviour of the dose-effect curve on the plateau.
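
    The specific spline-regression formulation is not given in the abstract; the sketch below, using synthetic calibration data, illustrates the general idea of fitting a smoothing spline to a cytogenetic dose-effect curve and inverting it numerically to estimate an absorbed dose from an observed aberration yield.

```python
# Illustrative only: spline fit to a synthetic dose-effect calibration curve and
# numerical inversion to recover dose from an observed yield.
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.optimize import brentq

# Synthetic calibration data: dose (Gy) vs. dicentric yield per cell
dose   = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
yield_ = np.array([0.001, 0.03, 0.09, 0.30, 0.58, 0.85, 1.05])

spline = UnivariateSpline(dose, yield_, k=3, s=1e-4)   # cubic smoothing spline

def estimate_dose(observed_yield, d_min=0.0, d_max=5.0):
    """Invert the fitted dose-effect curve for a measured aberration yield."""
    return brentq(lambda d: float(spline(d)) - observed_yield, d_min, d_max)

obs = 0.42
print(f"observed yield {obs} per cell -> estimated dose {estimate_dose(obs):.2f} Gy")
```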

  11. MCR2S unstructured mesh capabilities for use in shutdown dose rate analysis

    International Nuclear Information System (INIS)

    Eade, T.; Stonell, D.; Turner, A.

    2015-01-01

    Highlights: • Advancements in shutdown dose rate calculations will be needed as fusion moves from experimental reactors to full scale demonstration reactors in order to ensure the safety of personnel. • The MCR2S shutdown dose rate tool has been modified to allow shutdown dose rate calculations using an unstructured mesh. • The unstructured mesh capability of MCR2S was used on three shutdown dose rate models, a simple sphere, the ITER computational benchmark and the DEMO computational benchmark. • The results showed a reasonable agreement between an unstructured mesh approach and the CSG approach and highlighted the need to carefully choose the unstructured mesh resolution. - Abstract: As nuclear fusion progresses towards a sustainable energy source and the power of tokamak devices increases, a greater understanding of the radiation fields will be required. As well as on-load radiation fields, off-load or shutdown radiation fields are an important consideration for the safety and economic viability of a commercial fusion reactor. Previously, codes such as MCR2S have been written in order to predict the shutdown dose rates within, and in regions surrounding, a fusion reactor. MCR2S utilises a constructive solid geometry (CSG) model and a superimposed structured mesh to calculate 3-D maps of the shutdown dose rate. A new approach to MCR2S calculations is proposed and implemented using a single unstructured mesh to replace both the CSG model and the superimposed structured mesh. This new MCR2S approach has been demonstrated on three models of increasing complexity. These models were: a sphere, the ITER computational shutdown dose rate benchmark and the DEMO computational shutdown dose rate benchmark. In each case the results were compared to calculations performed using MCR2S with CSG geometry and a superimposed structured mesh. It was concluded that the results from the unstructured mesh implementation of MCR2S compared well to the CSG structured mesh

  12. Controllable dose

    International Nuclear Information System (INIS)

    Alvarez R, J.T.; Anaya M, R.A.

    2004-01-01

    With the purpose of resolving the controversy about the linear no-threshold hypothesis, on which the dose limitation systems of the ICRP 26 and 60 recommendations are founded, at the end of the last decade R. Clarke, president of the ICRP, proposed the concept of Controllable Dose: the dose, or sum of doses, that an individual receives from a particular source and that can reasonably be controlled by any means. This concept proposes a change in the philosophy of radiological protection, shifting its concern from a societal approach to an individual focus. This work presents an overview of the foundations, advantages and disadvantages that this proposal has raised in the international radiological protection community, with the purpose of familiarizing the Mexican radiological protection community with these new concepts. (Author)

  13. Reevaluation of the Jezebel Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-03-10

    Every nuclear engineering student is familiar with Jezebel, the homogeneous bare sphere of plutonium first assembled at Los Alamos in 1954-1955. The actual Jezebel assembly was neither homogeneous, nor bare, nor spherical; nor was it singular – there were hundreds of Jezebel configurations assembled. The Jezebel benchmark has been reevaluated for the International Criticality Safety Benchmark Evaluation Project (ICSBEP) Handbook. Logbooks, original drawings, mass accountability statements, internal reports, and published reports have been used to model four actual three-dimensional Jezebel assemblies with high fidelity. Because the documentation available today is often inconsistent, three major assumptions were made regarding plutonium part masses and dimensions. The first was that the assembly masses given in Los Alamos report LA-4208 (1969) were correct, and the second was that the original drawing dimension for the polar height of a certain major part was correct. The third assumption was that a change notice indicated on the original drawing was not actually implemented. This talk will describe these assumptions, the alternatives, and the implications. Since the publication of the 2013 ICSBEP Handbook, the actual masses of the major components have turned up. Our assumption regarding the assembly masses was proven correct, but we had the mass distribution incorrect. Work to incorporate the new information is ongoing, and this talk will describe the latest assessment.

  14. SCWEB, Scientific Workstation Evaluation Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Raffenetti, R C [Computing Services-Support Services Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, Illinois 60439 (United States)

    1988-06-16

    1 - Description of program or function: The SCWEB (Scientific Workstation Evaluation Benchmark) software includes 16 programs which are executed in a well-defined scenario to measure the following performance capabilities of a scientific workstation: implementation of FORTRAN77, processor speed, memory management, disk I/O, monitor (or display) output, scheduling of processing (multiprocessing), and scheduling of print tasks (spooling). 2 - Method of solution: The benchmark programs are: DK1, DK2, and DK3, which do Fourier series fitting based on spline techniques; JC1, which checks the FORTRAN function routines which produce numerical results; JD1 and JD2, which solve dense systems of linear equations in double- and single-precision, respectively; JD3 and JD4, which perform matrix multiplication in single- and double-precision, respectively; RB1, RB2, and RB3, which perform substantial amounts of I/O processing on files other than the input and output files; RR1, which does intense single-precision floating-point multiplication in a tight loop, RR2, which initializes a 512x512 integer matrix in a manner which skips around in the address space rather than initializing each consecutive memory cell in turn; RR3, which writes alternating text buffers to the output file; RR4, which evaluates the timer routines and demonstrates that they conform to the specification; and RR5, which determines whether the workstation is capable of executing a 4-megabyte program

  15. Pynamic: the Python Dynamic Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Lee, G L; Ahn, D H; de Supinksi, B R; Gyllenhaal, J C; Miller, P J

    2007-07-10

    Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide-range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.

  16. The Isprs Benchmark on Indoor Modelling

    Science.gov (United States)

    Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.

    2017-09-01

    Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.

  17. A tiered approach for integrating exposure and dosimetry with in vitro dose-response data in the modern risk assessment paradigm

    Science.gov (United States)

    High-throughput (HT) risk screening approaches apply in vitro dose-response data to estimate potential health risks that arise from exposure to chemicals. However, much uncertainty is inherent in relating bioactivities observed in an in vitro system to the perturbations of biolog...

  18. Observer-based FDI for Gain Fault Detection in Ship Propulsion Benchmark

    DEFF Research Database (Denmark)

    Lootsma, T.F.; Izadi-Zamanabadi, Roozbeh; Nijmeijer, H.

    2001-01-01

    A geometric approach for input-affine nonlinear systems is briefly described and then applied to a ship propulsion benchmark. The obtained results are used to design a diagnostic nonlinear observer for successful FDI of the diesel engine gain fault.

  20. Effect of using type A radiation for dose reconstruction in type B irradiated material: A microdosimetry approach

    International Nuclear Information System (INIS)

    Piters, T.M.; Chernov, V.

    2008-01-01

    A model is proposed to explain why, in previously γ-irradiated calcite, the yield after additive β irradiation tends toward the saturation yield of the β radiation even if that yield is lower than the yield after the γ irradiation. The proposed model, however, is not specific to calcite, and in fact all calculations are done for a fictitious material. In contrast to existing models, the proposed model considers the track nature of γ and β radiation and that these radiation types can be distinguished by the dose distribution inside their tracks. Because determining the dose distribution in the tracks for the different types of irradiation is quite complicated, the γ and β tracks are instead approximated by type A and type B tracks that have different but homogeneously distributed doses in their track volumes. The trapping of generated free charges in the track was calculated with a simple one-electron-one-hole trap model. To obtain the total dose response (the average concentration of occupied traps as a function of dose), the yield at one point was averaged over all possible configurations of track overlapping at that point. The slope of the initial part of the response curve (low-dose sensitivity) and the saturation yield were determined as functions of the track dose; both are observed to decrease with increasing track dose. Simulations of the response to sequential irradiation, first by type A radiation with a 64 Gy track dose and then by type B radiation with a 128 Gy track dose, show an effect similar to that observed in calcite, demonstrating that the track nature of radiation is a plausible cause of the observed effect

  1. Analysis of a molten salt reactor benchmark

    International Nuclear Information System (INIS)

    Ghosh, Biplab; Bajpai, Anil; Degweker, S.B.

    2013-01-01

    This paper discusses results of our studies of an IAEA molten salt reactor (MSR) benchmark. The benchmark, proposed by Japan, involves burnup calculations of a single lattice cell of a MSR for burning plutonium and other minor actinides. We have analyzed this cell with in-house developed burnup codes BURNTRAN and McBURN. This paper also presents a comparison of the results of our codes and those obtained by the proposers of the benchmark. (author)

  2. Benchmarking i eksternt regnskab og revision

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Kiertzner, Lars

    2001-01-01

    … on an ongoing basis in a benchmarking process. This chapter broadly examines the extent to which the benchmarking concept can reasonably be linked to external financial reporting and auditing. Section 7.1 deals with the external annual financial statements, while Section 7.2 addresses the audit area. The final section of the chapter summarises the considerations on benchmarking in relation to both areas.

  3. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  4. Aerodynamic Benchmarking of the Deepwind Design

    DEFF Research Database (Denmark)

    Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge

    2015-01-01

    The aerodynamic benchmarking for the DeepWind rotor is conducted comparing different rotor geometries and solutions and keeping the comparison as fair as possible. The objective for the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize the blade solicitation and the cost of energy. Different parameters are considered for the benchmarking study. The DeepWind blade is characterized by a shape similar to the Troposkien geometry but asymmetric between the top and bottom parts: this shape is considered as a fixed parameter in the benchmarking

  5. HPC Benchmark Suite NMx, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  6. High Energy Physics (HEP) benchmark program

    International Nuclear Information System (INIS)

    Yasu, Yoshiji; Ichii, Shingo; Yashiro, Shigeo; Hirayama, Hideo; Kokufuda, Akihiro; Suzuki, Eishin.

    1993-01-01

    High Energy Physics (HEP) benchmark programs are indispensable tools for selecting a suitable computer for an HEP application system. Industry-standard benchmark programs cannot be used for this particular kind of selection. The CERN and SSC benchmark suites are well-known HEP benchmark programs for this purpose. The CERN suite includes event reconstruction and event generator programs, while the SSC suite includes event generators. In this paper we found that the results from these two suites are not consistent, and that the result from the industry benchmark does not agree with either of them. We also compare benchmark results obtained with the EGS4 Monte Carlo simulation program against those from the two HEP benchmark suites and find that the EGS4 result is not consistent with either of them. The industry-standard SPECmark values on various computer systems are not consistent with the EGS4 results either. Because of these inconsistencies, we point out the necessity of standardizing HEP benchmark suites. An EGS4 benchmark suite should also be developed for users of applications such as medical science, nuclear power plants, nuclear physics and high energy physics. (author)

  7. Establishing benchmarks and metrics for utilization management.

    Science.gov (United States)

    Melanson, Stacy E F

    2014-01-01

    The changing environment of healthcare reimbursement is rapidly leading to a renewed appreciation of the importance of utilization management in the clinical laboratory. The process of benchmarking of laboratory operations is well established for comparing organizational performance to other hospitals (peers) and for trending data over time through internal benchmarks. However, there are relatively few resources available to assist organizations in benchmarking for laboratory utilization management. This article will review the topic of laboratory benchmarking with a focus on the available literature and services to assist in managing physician requests for laboratory testing. © 2013.

  8. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    Prior research documents positive effects of benchmarking information provision on performance and attributes this to social comparisons. However, the effects on professional recipients are unclear. Studies of professional control indicate that professional recipients often resist bureaucratic controls because of organizational-professional conflicts. We therefore analyze the association between bureaucratic benchmarking information provision and professional performance and suggest that the association is more positive if prior professional performance was low. We test our hypotheses based on archival, publicly disclosed, professional performance data for 191 German orthopedics departments, matched with survey data on bureaucratic benchmarking information given to chief orthopedists by the administration. We find a positive association between bureaucratic benchmarking information provision

  9. NRC-BNL Benchmark Program on Evaluation of Methods for Seismic Analysis of Coupled Systems

    International Nuclear Information System (INIS)

    Chokshi, N.; DeGrassi, G.; Xu, J.

    1999-01-01

    A NRC-BNL benchmark program for evaluation of state-of-the-art analysis methods and computer programs for seismic analysis of coupled structures with non-classical damping is described. The program includes a series of benchmarking problems designed to investigate various aspects of complexities, applications and limitations associated with methods for analysis of non-classically damped structures. Discussions are provided on the benchmarking process, benchmark structural models, and the evaluation approach, as well as benchmarking ground rules. It is expected that the findings and insights, as well as recommendations from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving licensing applications of these alternate methods to coupled systems

  10. Benchmarking Analysis of Institutional University Autonomy in Denmark, Lithuania, Romania, Scotland, and Sweden

    DEFF Research Database (Denmark)

    This book presents a benchmark, comparative analysis of institutional university autonomy in Denmark, Lithuania, Romania, Scotland and Sweden. These countries are partners in an EU TEMPUS funded project 'Enhancing University Autonomy in Moldova' (EUniAM). This benchmark analysis was conducted by the EUniAM Lead Task Force team that collected and analysed secondary and primary data in each of these countries and produced four benchmark reports that are part of this book. For each dimension and interface of institutional university autonomy, the members of the Lead Task Force team identified respective evaluation criteria and searched for similarities and differences in approaches to higher education sectors and respective autonomy regimes in these countries. The consolidated report that precedes the benchmark reports summarises the process and key findings from the four benchmark reports...

  11. COMPETITIVE BIDDING IN MEDICARE ADVANTAGE: EFFECT OF BENCHMARK CHANGES ON PLAN BIDS

    Science.gov (United States)

    Song, Zirui; Landrum, Mary Beth; Chernew, Michael E.

    2013-01-01

    Bidding has been proposed to replace or complement the administered prices that Medicare pays to hospitals and health plans. In 2006, the Medicare Advantage program implemented a competitive bidding system to determine plan payments. In perfectly competitive models, plans bid their costs and thus bids are insensitive to the benchmark. Under many other models of competition, bids respond to changes in the benchmark. We conceptualize the bidding system and use an instrumental variable approach to study the effect of benchmark changes on bids. We use 2006–2010 plan payment data from the Centers for Medicare and Medicaid Services, published county benchmarks, actual realized fee-for-service costs, and Medicare Advantage enrollment. We find that a $1 increase in the benchmark leads to about a $0.53 increase in bids, suggesting that plans in the Medicare Advantage market have meaningful market power. PMID:24308881

  12. Competitive bidding in Medicare Advantage: effect of benchmark changes on plan bids.

    Science.gov (United States)

    Song, Zirui; Landrum, Mary Beth; Chernew, Michael E

    2013-12-01

    Bidding has been proposed to replace or complement the administered prices that Medicare pays to hospitals and health plans. In 2006, the Medicare Advantage program implemented a competitive bidding system to determine plan payments. In perfectly competitive models, plans bid their costs and thus bids are insensitive to the benchmark. Under many other models of competition, bids respond to changes in the benchmark. We conceptualize the bidding system and use an instrumental variable approach to study the effect of benchmark changes on bids. We use 2006-2010 plan payment data from the Centers for Medicare and Medicaid Services, published county benchmarks, actual realized fee-for-service costs, and Medicare Advantage enrollment. We find that a $1 increase in the benchmark leads to about a $0.53 increase in bids, suggesting that plans in the Medicare Advantage market have meaningful market power. Copyright © 2013 Elsevier B.V. All rights reserved.
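
    The paper's instrumental-variable specification is only described at a high level here; the sketch below illustrates the mechanics of a generic two-stage least squares estimate of benchmark pass-through into bids, with invented data and an assumed instrument.

```python
# Generic 2SLS sketch (manual two-stage least squares) with invented data;
# the instrument used here is an assumption, not the one from the paper.
import numpy as np

rng = np.random.default_rng(1)
n = 500
instrument = rng.normal(size=n)                                     # e.g. a payment-formula shifter
benchmark = 800 + 40 * instrument + rng.normal(scale=20, size=n)    # county benchmark ($)
bid = 400 + 0.53 * benchmark + rng.normal(scale=30, size=n)         # plan bid ($)

def two_sls(y, x_endog, z):
    """2SLS with one endogenous regressor, one instrument, and a constant."""
    Z = np.column_stack([np.ones(len(z)), z])
    # First stage: project the endogenous benchmark on the instrument
    gamma, *_ = np.linalg.lstsq(Z, x_endog, rcond=None)
    x_hat = Z @ gamma
    # Second stage: regress bids on the fitted benchmark
    Xh = np.column_stack([np.ones(len(z)), x_hat])
    beta, *_ = np.linalg.lstsq(Xh, y, rcond=None)
    return beta  # [intercept, pass-through of benchmark into bids]

beta = two_sls(bid, benchmark, instrument)
print(f"estimated pass-through: a $1 benchmark increase raises bids by ${beta[1]:.2f}")
```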

  13. Benchmark Evaluation of Plutonium Nitrate Solution Arrays

    International Nuclear Information System (INIS)

    Marshall, M.A.; Bess, J.D.

    2011-01-01

    In October and November of 1981 thirteen approach-to-critical experiments were performed on a remote split table machine (RSTM) in the Critical Mass Laboratory of Pacific Northwest Laboratory (PNL) in Richland, Washington, using planar arrays of polyethylene bottles filled with plutonium (Pu) nitrate solution. Arrays of up to sixteen bottles were used to measure the critical number of bottles and critical array spacing with a tight fitting Plexiglas® reflector on all sides of the arrays except the top. Some experiments used Plexiglas shells fitted around each bottle to determine the effect of moderation on criticality. Each bottle contained approximately 2.4 L of Pu(NO3)4 solution with a Pu content of 105 g Pu/L and a free acid molarity H+ of 5.1. The plutonium was of low 240Pu (2.9 wt.%) content. These experiments were performed to fill a gap in experimental data regarding criticality limits for storing and handling arrays of Pu solution in reprocessing facilities. Of the thirteen approach-to-critical experiments, eleven resulted in extrapolations to critical configurations. Four of the approaches were extrapolated to the critical number of bottles; these were not evaluated further due to the large uncertainty associated with the modeling of a fraction of a bottle. The remaining seven approaches were extrapolated to critical array spacing of 3-4 and 4-4 arrays; these seven critical configurations were evaluated for inclusion as acceptable benchmark experiments in the International Criticality Safety Benchmark Evaluation Project (ICSBEP) Handbook. Detailed and simple models of these configurations were created and the associated bias of these simplifications was determined to range from 0.00116 to 0.00162 ± 0.00006 Δkeff. Monte Carlo analysis of all models was completed using MCNP5 with ENDF/B-VII.0 neutron cross section libraries. A thorough uncertainty analysis of all critical, geometric, and material parameters was performed using parameter

  14. Benchmarking homogenization algorithms for monthly data

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2012-01-01

    Full Text Available The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve
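
    As a small sketch of two of the performance metrics mentioned above, computed on hypothetical series with simplified formulas: the centered root-mean-square error against the true homogeneous series and the error in the fitted linear trend.

```python
# Sketch of two of the validation metrics mentioned above, on hypothetical series.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1901, 2001)
truth = 0.01 * (years - years[0]) + rng.normal(scale=0.3, size=years.size)  # true homogeneous series
homogenized = truth + rng.normal(scale=0.1, size=years.size)                # one contribution

def centered_rmse(x, ref):
    """RMSE after removing each series' own mean (anomalies)."""
    return np.sqrt(np.mean(((x - x.mean()) - (ref - ref.mean())) ** 2))

def trend_error(x, ref, t):
    """Difference in least-squares linear trends (units per year)."""
    slope = lambda s: np.polyfit(t, s, 1)[0]
    return slope(x) - slope(ref)

print(f"centered RMSE: {centered_rmse(homogenized, truth):.3f}")
print(f"trend error:   {trend_error(homogenized, truth, years):.4f} per year")
```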

  15. Benchmarking of nuclear economics tools

    International Nuclear Information System (INIS)

    Moore, Megan; Korinny, Andriy; Shropshire, David; Sadhankar, Ramesh

    2017-01-01

    Highlights: • INPRO and GIF economic tools exhibited good alignment in total capital cost estimation. • Subtle discrepancies in the cost result from differences in financing and the fuel cycle assumptions. • A common set of assumptions was found to reduce the discrepancies to 1% or less. • Opportunities for harmonisation of economic tools exist. - Abstract: Benchmarking of the economics methodologies developed by the Generation IV International Forum (GIF) and the International Atomic Energy Agency’s International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) was performed for three Generation IV nuclear energy systems. The Economic Modeling Working Group of GIF developed an Excel based spreadsheet package, G4ECONS (Generation 4 Excel-based Calculation Of Nuclear Systems), to calculate the total capital investment cost (TCIC) and the levelised unit energy cost (LUEC). G4ECONS is sufficiently generic in the sense that it can accept the types of projected input, performance and cost data that are expected to become available for Generation IV systems through various development phases and that it can model both open and closed fuel cycles. The Nuclear Energy System Assessment (NESA) Economic Support Tool (NEST) was developed to enable an economic analysis using the INPRO methodology to easily calculate outputs including the TCIC, LUEC and other financial figures of merit including internal rate of return, return on investment and net present value. NEST is also Excel based and can be used to evaluate nuclear reactor systems using the open fuel cycle, MOX (mixed oxide) fuel recycling and closed cycles. A Super Critical Water-cooled Reactor system with an open fuel cycle and two Fast Reactor systems, one with a break-even fuel cycle and another with a burner fuel cycle, were selected for the benchmarking exercise. Published data on capital and operating costs were used for economics analyses using G4ECONS and NEST tools. Both G4ECONS and
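
    Neither tool's internal formulas are reproduced in the abstract; as a generic illustration of the levelised unit energy cost (LUEC) figure both tools compute, the sketch below uses a textbook capital-recovery-factor formulation with placeholder inputs (it ignores, for example, interest during construction).

```python
# Generic LUEC sketch with a capital recovery factor; inputs are placeholders and
# the formula is a textbook simplification, not G4ECONS or NEST internals.
def luec_per_mwh(overnight_cost, capacity_mw, capacity_factor,
                 fixed_om_per_year, fuel_cost_per_mwh,
                 discount_rate=0.05, lifetime_years=60):
    """Levelised unit energy cost in $/MWh."""
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years /
           ((1 + discount_rate) ** lifetime_years - 1))       # capital recovery factor
    annual_mwh = capacity_mw * capacity_factor * 8760.0
    annual_capital = overnight_cost * crf
    return (annual_capital + fixed_om_per_year) / annual_mwh + fuel_cost_per_mwh

cost = luec_per_mwh(overnight_cost=6.0e9,      # $ total capital (placeholder)
                    capacity_mw=1200.0,
                    capacity_factor=0.90,
                    fixed_om_per_year=1.2e8,    # $/yr (placeholder)
                    fuel_cost_per_mwh=9.0)      # $/MWh (placeholder)
print(f"LUEC is roughly ${cost:.0f}/MWh under these assumptions")
```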

  16. FENDL neutronics benchmark: Specifications for the calculational neutronics and shielding benchmark

    International Nuclear Information System (INIS)

    Sawan, M.E.

    1994-12-01

    During the IAEA Advisory Group Meeting on ''Improved Evaluations and Integral Data Testing for FENDL'' held in Garching near Munich, Germany in the period 12-16 September 1994, the Working Group II on ''Experimental and Calculational Benchmarks on Fusion Neutronics for ITER'' recommended that a calculational benchmark representative of the ITER design should be developed. This report describes the neutronics and shielding calculational benchmark available for scientists interested in performing analysis for this benchmark. (author)

  17. Methodical approach to reconstruct individual internal doses for persons residing in areas of Belarus contaminated as a result of the Chernobyl accident

    International Nuclear Information System (INIS)

    Skryabin, Anatoly; Belsky, Yuri

    2008-01-01

    Full text: Studies on the risk to the population from low-level exposure following the Chernobyl accident require the estimation of individual doses. The most difficult aspect is the estimation of internal exposure (IAED int). The level of individual internal exposure due to ingestion of long-lived caesium isotopes is defined by the individual 'food habits' (IFH) of the person. A non-standard methodical approach is suggested to evaluate internal doses taking IFH into account: 1) IFH are generally conservative in their food composition and steady in time; 2) the IFH of a person determine his dose, which can be calculated using personal interview data and a special table of conformity establishing the connection between IFH and the corresponding percentile interval in the variation line of doses in a given settlement; 3) IAED int (1986-2005) is calculated as the sum of the annual doses of the individual over the whole period of exposure and over all settlements of residence. To develop the model, WBC measurement data (around 1.5 million measurements) collected in 1987-2005 for the population of around 1000 Belarusian settlements were used. The input data for the IAED int calculation include consumption of dose-significant products, duration, and place of residence obtained by means of an individual questionnaire; WBC measurement data; and the table of conformity (IFH → IAED int). (author)

  18. SeSBench - An initiative to benchmark reactive transport models for environmental subsurface processes

    Science.gov (United States)

    Jacques, Diederik

    2017-04-01

    As soil functions are governed by a multitude of interacting hydrological, geochemical and biological processes, simulation tools coupling mathematical models for interacting processes are needed. Coupled reactive transport models are a typical example of such coupled tools mainly focusing on hydrological and geochemical coupling (see e.g. Steefel et al., 2015). The mathematical and numerical complexity of both the tool itself and the specific conceptual model can increase rapidly. Therefore, numerical verification of such models is a prerequisite for guaranteeing reliability and confidence and qualifying simulation tools and approaches for any further model application. In 2011, a first SeSBench (Subsurface Environmental Simulation Benchmarking) workshop was held in Berkeley (USA), followed by four others. The objective is to benchmark subsurface environmental simulation models and methods with a current focus on reactive transport processes. The final outcome was a special issue in Computational Geosciences (2015, issue 3 - Reactive transport benchmarks for subsurface environmental simulation) with a collection of 11 benchmarks. Benchmarks, proposed by the participants of the workshops, should be relevant for environmental or geo-engineering applications; the latter were mostly related to radioactive waste disposal issues - excluding benchmarks defined for pure mathematical reasons. Another important feature is the tiered approach within a benchmark, with the definition of a single principal problem and different subproblems. The latter typically benchmarked individual or simplified processes (e.g. inert solute transport, simplified geochemical conceptual model) or geometries (e.g. batch or one-dimensional, homogeneous). Finally, three codes should be involved in a benchmark. The SeSBench initiative contributes to confidence building for applying reactive transport codes. Furthermore, it illustrates the use of those types of models for different

  19. Benchmarking: A tool for conducting self-assessment

    International Nuclear Information System (INIS)

    Perkey, D.N.

    1992-01-01

    There is more information on nuclear plant performance available than can reasonably be assimilated and used effectively by plant management or personnel responsible for self-assessment. Also, it is becoming increasingly more important that an effective self-assessment program uses internal parameters not only to evaluate performance, but to incorporate lessons learned from other plants. Because of the quantity of information available, it is important to focus efforts and resources in areas where safety or performance is a concern and where the most improvement can be realized. One of the techniques that is being used to effectively accomplish this is benchmarking. Benchmarking involves the use of various sources of information to self-identify a plant's strengths and weaknesses, identify which plants are strong performers in specific areas, evaluate what makes a top performer, and incorporate the success factors into existing programs. The formality with which benchmarking is being implemented varies widely depending on the objective. It can be as simple as looking at a single indicator, such as systematic assessment of licensee performance (SALP) in engineering and technical support, then surveying the top performers with specific questions. However, a more comprehensive approach may include the performance of a detailed benchmarking study. Both operational and economic indicators may be used in this type of evaluation. Some of the indicators that may be considered and the limitations of each are discussed

  20. A rational quantitative approach to determine the best dosing regimen for a target therapeutic effect: a unified formalism for antibiotic evaluation.

    Science.gov (United States)

    Li, Jun; Nekka, Fahima

    2013-02-21

    The determination of an optimal dosing regimen is a critical step to enhance the drug efficacy and avoid toxicity. Rational dosing recommendations based on mathematical considerations are increasingly being adopted in the process of drug development and use. In this paper, we propose a quantitative approach to evaluate the efficacy of antibiotic agents. By integrating both pharmacokinetic (PK) and pharmacodynamic (PD) information, this approach gives rise to a unified formalism able to measure the cause-effect of dosing regimens. This new pharmaco-metric covers a whole range of antibiotics, including the two well-known concentration-dependent and time-dependent classes, through the introduction of the Hill-dependency concept. As a direct fallout, our formalism opens a new path toward bioequivalence evaluation in terms of PK and PD, which associates the in vivo drug concentration and the in vitro drug effect. Using this new approach, we succeeded in revealing unexpected but relevant behaviors of drug performance when different drug regimens and drug classes are considered. Of particular note, we found that the doses required to reach the same therapeutic effect, when scheduled differently, exhibit completely different tendencies for concentration- and time-dependent drugs. Moreover, we theoretically confirmed the previous experimental results on the superiority of the once-daily regimen of aminoglycosides. The proposed methodology is appealing for its computational features and can easily be applied to design fair clinical protocols or rationalize prescription decisions. Copyright © 2012 Elsevier Ltd. All rights reserved.
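
    The paper's unified formalism is not reproduced in the abstract; as a loose illustration of the 'Hill-dependency' idea, the sketch below time-averages a Hill-type effect of a simulated one-compartment concentration profile over a dosing horizon, so that a large Hill coefficient makes the metric behave like a time-above-threshold (time-dependent) criterion and a small one like an exposure-driven (concentration-dependent) criterion. The PK model and all parameter values are illustrative assumptions.

```python
# Loose illustration of a Hill-dependent PK/PD efficacy metric; the PK model,
# parameters, and metric definition are illustrative, not the paper's formalism.
import numpy as np
from scipy.integrate import trapezoid

def concentration(t, dose, interval, ke=0.3, V=20.0):
    """One-compartment IV bolus, repeated every `interval` hours (superposition)."""
    c = np.zeros_like(t)
    for t0 in np.arange(0.0, t.max() + 1e-9, interval):
        c += np.where(t >= t0, (dose / V) * np.exp(-ke * (t - t0)), 0.0)
    return c

def efficacy_metric(dose, interval, mic=1.0, hill=4.0, horizon=24.0):
    """Time-average of a Hill effect E = C^h / (C^h + MIC^h) over the horizon."""
    t = np.linspace(0.0, horizon, 2001)
    c = concentration(t, dose, interval)
    effect = c**hill / (c**hill + mic**hill)
    return trapezoid(effect, t) / horizon

# Same total daily dose, scheduled differently
print(f"once daily 240 mg : {efficacy_metric(240.0, 24.0):.3f}")
print(f"80 mg every 8 h   : {efficacy_metric(80.0, 8.0):.3f}")
```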

  1. The spatial resolution in dosimetry with normoxic polymer-gels investigated with the dose modulation transfer approach

    International Nuclear Information System (INIS)

    Bayreder, Christian; Schoen, Robert; Wieland, M.; Georg, Dietmar; Moser, Ewald; Berg, Andreas

    2008-01-01

    The verification of dose distributions with high dose gradients, as appearing in brachytherapy or stereotactic radiotherapy for example, calls for dosimetric methods with sufficiently high spatial resolution. Polymer gels in combination with an MR or optical scanner as a readout device have the potential of performing the verification of a three-dimensional dose distribution within a single measurement. The purpose of this work is to investigate the spatial resolution achievable in MR-based polymer gel dosimetry. The authors show that dosimetry on a very small spatial scale (voxel size: 94×94×1000 μm³) can be performed with normoxic polymer gels using parameter selective T2 imaging. To demonstrate the spatial resolution obtained, the authors rely on the dose-modulation transfer function (DMTF) concept, based on very fine dose modulations at half-periods of 200 μm. Very fine periodic dose modulations of a 60Co photon field were achieved by means of an absorption grid made of tungsten-carbide, specifically designed for quality control. The dose modulation in the polymer gel is compared with that of film dosimetry in one plane via the DMTF concept, which provides a general measure of the spatial resolution of a dose imaging system. Additionally, Monte Carlo simulations were performed and used for the calculation of the DMTF of both the polymer gel and film dosimetry. The results obtained by film dosimetry agree well with those of Monte Carlo simulations, whereas polymer gel dosimetry overestimates the amplitude value of the fine dose modulations. The authors discuss possible reasons. The in-plane resolution achieved in this work competes with the spatial resolution of standard clinical film-scanner systems
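
    As a minimal numerical sketch of the dose-modulation-transfer idea described above, using synthetic one-dimensional profiles and a simple (max-min)/(max+min) modulation definition rather than the authors' actual analysis:

```python
# Minimal sketch of a dose-modulation transfer estimate on synthetic 1-D profiles.
# Profile shapes and the modulation definition (max-min)/(max+min) are illustrative.
import numpy as np

x = np.linspace(0.0, 2.0, 2001)                      # position in mm
period_mm = 0.4                                      # 200 um half-period modulation

delivered = 0.5 * (1 + np.sign(np.sin(2 * np.pi * x / period_mm)))  # ideal grid pattern
# "Measured" profile: delivered pattern blurred by the imaging/dosimeter response
sigma_mm = 0.05
kernel = np.exp(-0.5 * ((x - x.mean()) / sigma_mm) ** 2)
measured = np.convolve(delivered, kernel / kernel.sum(), mode="same")

def modulation(profile):
    """Michelson-type modulation depth of a periodic profile."""
    return (profile.max() - profile.min()) / (profile.max() + profile.min())

interior = slice(200, -200)   # drop edges affected by convolution padding
dmtf = modulation(measured[interior]) / modulation(delivered[interior])
print(f"modulation transfer at {period_mm / 2 * 1000:.0f} um half-period: {dmtf:.2f}")
```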

  2. Human factors reliability Benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-06-01

    The Joint Research Centre of the European Commission has organized a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organized around two study cases: (1) analysis of routine functional Test and Maintenance (T and M) procedures: with the aim of assessing the probability of test induced failures, the probability of failures to remain unrevealed and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient: with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report contains the final summary reports produced by the participants in the exercise

  3. Experimental and computational benchmark tests

    International Nuclear Information System (INIS)

    Gilliam, D.M.; Briesmeister, J.F.

    1994-01-01

    A program involving principally NIST, LANL, and ORNL has been in progress for about four years now to establish a series of benchmark measurements and calculations related to the moderation and leakage of 252Cf neutrons from a source surrounded by spherical aqueous moderators of various thicknesses and compositions. The motivation for these studies comes from problems in criticality calculations concerning arrays of multiplying components, where the leakage from one component acts as a source for the other components. This talk compares experimental and calculated values for the fission rates of four nuclides - 235U, 239Pu, 238U, and 237Np - in the leakage spectrum from moderator spheres of diameters 76.2 mm, 101.6 mm, and 127.0 mm, with either pure water or enriched B-10 solutions as the moderator. Very detailed Monte Carlo calculations were done with the MCNP code, using a "light water" S(α,β) scattering kernel

  4. ENVIRONMENTAL BENCHMARKING FOR LOCAL AUTHORITIES

    Directory of Open Access Journals (Sweden)

    Marinela GHEREŞ

    2010-01-01

    Full Text Available This paper is an attempt to clarify and present the many definitions of benchmarking. It also attempts to explain the basic steps of benchmarking, to show how this tool can be applied by local authorities as well as to discuss its potential benefits and limitations. It is our strong belief that if cities use indicators and progressively introduce targets to improve management and related urban life quality, and to measure progress towards more sustainable development, we will also create a new type of competition among cities and foster innovation. This is seen to be important because local authorities' actions play a vital role in responding to the challenges of enhancing the state of the environment not only in policy-making, but also in the provision of services and in the planning process. Local communities therefore need to be aware of their own sustainability performance levels and should be able to engage in exchange of best practices to respond effectively to the eco-economical challenges of the century.

  5. Benchmark results in radiative transfer

    International Nuclear Information System (INIS)

    Garcia, R.D.M.; Siewert, C.E.

    1986-02-01

    Several aspects of the F N method are reported, and the method is used to solve accurately some benchmark problems in radiative transfer in the field of atmospheric physics. The method was modified to solve cases of pure scattering, and an improved process was developed for computing the radiation intensity. An algorithm for computing several quantities used in the F N method was developed. An improved scheme to evaluate certain integrals relevant to the method is presented, and a two-term recursion relation that has proved useful for the numerical evaluation of matrix elements, basic to the method, is given. The methods used to solve the resulting linear algebraic equations are discussed, and the numerical results are evaluated. (M.C.K.) [pt

  6. Benchmarking time-dependent neutron problems with Monte Carlo codes

    International Nuclear Information System (INIS)

    Couet, B.; Loomis, W.A.

    1990-01-01

    Many nuclear logging tools measure the time dependence of a neutron flux in a geological formation to infer important properties of the formation. The complex geometry of the tool and the borehole within the formation does not permit an exact deterministic modelling of the neutron flux behaviour. While this exact simulation is possible with Monte Carlo methods the computation time does not facilitate quick turnaround of results useful for design and diagnostic purposes. Nonetheless a simple model based on the diffusion-decay equation for the flux of neutrons of a single energy group can be useful in this situation. A combination approach where a Monte Carlo calculation benchmarks a deterministic model in terms of the diffusion constants of the neutrons propagating in the media and their flux depletion rates thus offers the possibility of quick calculation with assurance as to accuracy. We exemplify this approach with the Monte Carlo benchmarking of a logging tool problem, showing standoff and bedding response. (author)
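
    The benchmarking described above amounts to fitting a simple model to Monte Carlo results. A minimal sketch of that step, with made-up time-binned tallies rather than the authors' tool geometry, is to extract a flux depletion rate by fitting an exponential decay:

      # Minimal sketch: fit an exponential decay to (synthetic) time-binned
      # Monte Carlo flux tallies to extract a single-group depletion rate.
      import numpy as np
      from scipy.optimize import curve_fit

      t = np.linspace(50.0, 1000.0, 20)                 # time bins in microseconds (hypothetical)
      rng = np.random.default_rng(0)
      flux = 1.0e5 * np.exp(-t / 250.0) * rng.normal(1.0, 0.02, t.size)   # fake tally with noise

      def decay(t, amplitude, tau):
          return amplitude * np.exp(-t / tau)

      popt, pcov = curve_fit(decay, t, flux, p0=(1.0e5, 200.0))
      print(f"fitted decay time tau = {popt[1]:.1f} microseconds")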

  7. Healthcare Analytics: Creating a Prioritized Improvement System with Performance Benchmarking.

    Science.gov (United States)

    Kolker, Eugene; Kolker, Evelyne

    2014-03-01

    The importance of healthcare improvement is difficult to overstate. This article describes our collaborative work with experts at Seattle Children's to create a prioritized improvement system using performance benchmarking. We applied analytics and modeling approaches to compare and assess performance metrics derived from U.S. News and World Report benchmarking data. We then compared a wide range of departmental performance metrics, including patient outcomes, structural and process metrics, survival rates, clinical practices, and subspecialist quality. By applying empirically simulated transformations and imputation methods, we built a predictive model that achieves departments' average rank correlation of 0.98 and average score correlation of 0.99. The results are then translated into prioritized departmental and enterprise-wide improvements, following a data to knowledge to outcomes paradigm. These approaches, which translate data into sustainable outcomes, are essential to solving a wide array of healthcare issues, improving patient care, and reducing costs.

  8. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  9. Performance benchmarking in cardiac imaging

    International Nuclear Information System (INIS)

    Schick, D.; Thiele, D.

    2004-01-01

    Full text: Diagnostic and interventional procedures performed in a cardiac catheter laboratory, while demanding high image quality, may also result in high patient radiation dose depending on the length or complexity of the procedure. Clinicians using the X-ray equipment require confidence that the system is operating optimally to ensure maximum benefit to the patient with minimum risk. Seventeen cardiac catheterisation laboratories have been surveyed using a phantom based on the NEMA XR 21-2000 standard. The testing protocol measures spatial resolution, low contrast detectability, patient dose rate, dynamic range and motion blur for modes of operation and simulated patient sizes applicable to a diagnostic left heart catheter study. The combined results of the assessed laboratories are presented. The latest generation systems with flat-panel detectors exhibit better spatial resolution than older systems with image intensifiers. Phantom measurements show up to a six-fold variation in dose rate across the range of systems assessed for a given patient size. As expected, some correlation between patient dose rate and the low contrast detectability score is evident. The extent of temporal filtering and pulse width is reflected in the motion blur score. The dynamic range measurements are found to be a less sensitive measure in evaluating system performance. Examination of patient dose results in the context of low contrast detectability score indicates that dose reduction could be achieved without compromising diagnosis on some systems. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine

  10. HyspIRI Low Latency Concept and Benchmarks

    Science.gov (United States)

    Mandl, Dan

    2010-01-01

    Topics include HyspIRI low latency data ops concept, HyspIRI data flow, ongoing efforts, experiment with Web Coverage Processing Service (WCPS) approach to injecting new algorithms into SensorWeb, low fidelity HyspIRI IPM testbed, compute cloud testbed, open cloud testbed environment, Global Lambda Integrated Facility (GLIF) and OCC collaboration with Starlight, delay tolerant network (DTN) protocol benchmarking, and EO-1 configuration for preliminary DTN prototype.

  11. Benchmarking set for domestic smart grid management

    NARCIS (Netherlands)

    Bosman, M.G.C.; Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria

    2010-01-01

    In this paper we propose a benchmark for domestic smart grid management. It consists of an in-depth description of a domestic smart grid, in which local energy consumers, producers and buffers can be controlled. First, from this description a general benchmark framework is derived, which can be used

  12. Medical school benchmarking - from tools to programmes.

    Science.gov (United States)

    Wilkinson, Tim J; Hudson, Judith N; Mccoll, Geoffrey J; Hu, Wendy C Y; Jolly, Brian C; Schuwirth, Lambert W T

    2015-02-01

    Benchmarking among medical schools is essential, but may result in unwanted effects. To apply a conceptual framework to selected benchmarking activities of medical schools. We present an analogy between the effects of assessment on student learning and the effects of benchmarking on medical school educational activities. A framework by which benchmarking can be evaluated was developed and applied to key current benchmarking activities in Australia and New Zealand. The analogy generated a conceptual framework that tested five questions to be considered in relation to benchmarking: what is the purpose? what are the attributes of value? what are the best tools to assess the attributes of value? what happens to the results? and, what is the likely "institutional impact" of the results? If the activities were compared against a blueprint of desirable medical graduate outcomes, notable omissions would emerge. Medical schools should benchmark their performance on a range of educational activities to ensure quality improvement and to assure stakeholders that standards are being met. Although benchmarking potentially has positive benefits, it could also result in perverse incentives with unforeseen and detrimental effects on learning if it is undertaken using only a few selected assessment tools.

  13. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price
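
    The record does not spell out the functional forms; the two textbook benchmarks usually meant by "zero income elasticity" and "unit income elasticity" are the quasilinear and Cobb-Douglas forms, sketched here as standard examples rather than the paper's own derivations:

      \[
        u(x, y) = v(x) + y
        \quad\Rightarrow\quad
        v'(x^*) = \frac{p_x}{p_y},\qquad \frac{\partial x^*}{\partial m} = 0
        \quad\text{(zero income elasticity for } x \text{ at interior solutions)}
      \]
      \[
        u(x, y) = x^{\alpha} y^{1-\alpha}
        \quad\Rightarrow\quad
        x^* = \frac{\alpha m}{p_x}
        \quad\text{(unit income elasticity for both goods)}
      \]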

  14. Repeated Results Analysis for Middleware Regression Benchmarking

    Czech Academy of Sciences Publication Activity Database

    Bulej, Lubomír; Kalibera, T.; Tůma, P.

    2005-01-01

    Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005

  15. Benchmarking, Total Quality Management, and Libraries.

    Science.gov (United States)

    Shaughnessy, Thomas W.

    1993-01-01

    Discussion of the use of Total Quality Management (TQM) in higher education and academic libraries focuses on the identification, collection, and use of reliable data. Methods for measuring quality, including benchmarking, are described; performance measures are considered; and benchmarking techniques are examined. (11 references) (MES)

  16. Benchmark problems for radiological assessment codes. Final report

    International Nuclear Information System (INIS)

    Mills, M.; Vogt, D.; Mann, B.

    1983-09-01

    This report describes benchmark problems to test computer codes used in the radiological assessment of high-level waste repositories. The problems presented in this report will test two types of codes. The first type of code calculates the time-dependent heat generation and radionuclide inventory associated with a high-level waste package. Five problems have been specified for this code type. The second code type addressed in this report involves the calculation of radionuclide transport and dose-to-man. For these codes, a comprehensive problem and two subproblems have been designed to test the relevant capabilities of these codes for assessing a high-level waste repository setting

  17. A Seafloor Benchmark for 3-dimensional Geodesy

    Science.gov (United States)

    Chadwell, C. D.; Webb, S. C.; Nooner, S. L.

    2014-12-01

    We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone

  18. Benchmark for license plate character segmentation

    Science.gov (United States)

    Gonçalves, Gabriel Resende; da Silva, Sirlene Pio Gomes; Menotti, David; Shwartz, William Robson

    2016-09-01

    Automatic license plate recognition (ALPR) has been the focus of much research in recent years. In general, ALPR is divided into the following problems: detection of on-track vehicles, license plate detection, segmentation of license plate characters, and optical character recognition (OCR). Even though commercial solutions are available for controlled acquisition conditions, e.g., the entrance of a parking lot, ALPR is still an open problem when dealing with data acquired from uncontrolled environments, such as roads and highways, when relying only on imaging sensors. Due to the multiple orientations and scales of the license plates captured by the camera, a very challenging task of ALPR is the license plate character segmentation (LPCS) step, because its effectiveness must be (near) optimal to achieve a high recognition rate by the OCR. To tackle the LPCS problem, this work proposes a benchmark composed of a dataset designed to focus specifically on the character segmentation step of ALPR, within an evaluation protocol. Furthermore, we propose the Jaccard-centroid coefficient, an evaluation measure more suitable than the Jaccard coefficient regarding the location of the bounding box within the ground-truth annotation. The dataset is composed of 2,000 Brazilian license plates consisting of 14,000 alphanumeric symbols and their corresponding bounding box annotations. We also present a straightforward approach to perform LPCS efficiently. Finally, we provide an experimental evaluation of the dataset based on five LPCS approaches and demonstrate the importance of character segmentation for achieving an accurate OCR.
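
    The Jaccard-centroid coefficient itself is not defined in the record; the plain Jaccard (intersection-over-union) coefficient it builds on can be sketched for axis-aligned boxes as follows, with example boxes that are purely illustrative.

      # Plain Jaccard (IoU) coefficient for axis-aligned bounding boxes; the
      # Jaccard-centroid variant proposed in the paper additionally accounts
      # for bounding-box location and is not reproduced here.
      def jaccard(box_a, box_b):
          """Boxes are (x_min, y_min, x_max, y_max)."""
          ix_min = max(box_a[0], box_b[0])
          iy_min = max(box_a[1], box_b[1])
          ix_max = min(box_a[2], box_b[2])
          iy_max = min(box_a[3], box_b[3])
          inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)
          area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
          area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
          return inter / (area_a + area_b - inter)

      print(jaccard((10, 10, 30, 50), (15, 12, 33, 48)))  # overlap of two character boxes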

  19. Benchmark of systematic human action reliability procedure

    International Nuclear Information System (INIS)

    Spurgin, A.J.; Hannaman, G.W.; Moieni, P.

    1986-01-01

    Probabilistic risk assessment (PRA) methodology has emerged as one of the most promising tools for assessing the impact of human interactions on plant safety and understanding the importance of the man/machine interface. Human interactions were considered to be one of the key elements in the quantification of accident sequences in a PRA. The approach to quantification of human interactions in past PRAs has not been very systematic. The Electric Power Research Institute sponsored the development of SHARP to aid analysts in developing a systematic approach for the evaluation and quantification of human interactions in a PRA. The SHARP process has been extensively peer reviewed and has been adopted by the Institute of Electrical and Electronics Engineers as the basis of a draft guide for the industry. By carrying out a benchmark process, in which SHARP is an essential ingredient, however, it appears possible to assess the strengths and weaknesses of SHARP to aid human reliability analysts in carrying out human reliability analysis as part of a PRA

  20. Benchmarking of refinery emissions performance : Executive summary

    International Nuclear Information System (INIS)

    2003-07-01

    This study was undertaken to collect emissions performance data for Canadian and comparable American refineries. The objective was to examine parameters that affect refinery air emissions performance and develop methods or correlations to normalize emissions performance. Another objective was to correlate and compare the performance of Canadian refineries to comparable American refineries. For the purpose of this study, benchmarking involved the determination of levels of emission performance that are being achieved for generic groups of facilities. A total of 20 facilities were included in the benchmarking analysis, and 74 American refinery emission correlations were developed. The recommended benchmarks, and the application of those correlations for comparison between Canadian and American refinery performance, were discussed. The benchmarks were: sulfur oxides, nitrogen oxides, carbon monoxide, particulate, volatile organic compounds, ammonia and benzene. For each refinery in Canada, benchmark emissions were developed. Several factors can explain differences in Canadian and American refinery emission performance. 4 tabs., 7 figs

  1. Benchmarking Water Quality from Wastewater to Drinking Waters Using Reduced Transcriptome of Human Cells.

    Science.gov (United States)

    Xia, Pu; Zhang, Xiaowei; Zhang, Hanxin; Wang, Pingping; Tian, Mingming; Yu, Hongxia

    2017-08-15

    One of the major challenges in environmental science is monitoring and assessing the risk of complex environmental mixtures. In vitro bioassays with limited key toxicological end points have been shown to be suitable for evaluating mixtures of organic pollutants in wastewater and recycled water. Omics approaches such as transcriptomics can monitor biological effects at the genome scale. However, few studies have applied omics approaches in the assessment of mixtures of organic micropollutants. Here, an omics approach was developed for profiling the bioactivity of 10 water samples, ranging from wastewater to drinking water, in human cells by a reduced human transcriptome (RHT) approach and dose-response modeling. Transcriptional expression of 1200 selected genes was measured by AmpliSeq technology in two cell lines, HepG2 and MCF7, that were exposed to eight serial dilutions of each sample. Concentration-effect models were used to identify differentially expressed genes (DEGs) and to calculate effect concentrations (ECs) of DEGs, which could be ranked to investigate low-dose response. Furthermore, molecular pathways disrupted by different samples were evaluated by Gene Ontology (GO) enrichment analysis. The ability of RHT to represent bioactivity in both HepG2 and MCF7 cells was shown to be comparable to the results of previous in vitro bioassays. Finally, the relative potencies of the mixtures indicated by RHT analysis were consistent with the chemical profiles of the samples. RHT analysis with human cells provides an efficient and cost-effective approach to benchmarking mixtures of micropollutants and may offer novel insight into the assessment of mixture toxicity in water.
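
    A concentration-effect model of the kind used to derive gene-level ECs can be sketched as a Hill-type fit to expression fold changes across the eight dilutions; the concentrations and fold changes below are invented for illustration and do not reproduce the study's data or its exact model form.

      # Hypothetical sketch: Hill-type concentration-effect fit for one gene,
      # used to derive an effect concentration (EC50-style value).
      import numpy as np
      from scipy.optimize import curve_fit

      rel_conc = np.array([0.03, 0.1, 0.3, 1, 3, 10, 30, 100])          # relative enrichment factors
      fold_change = np.array([1.0, 1.1, 1.3, 1.8, 2.6, 3.4, 3.8, 4.0])  # invented expression data

      def hill(c, bottom, top, ec50, slope):
          return bottom + (top - bottom) / (1.0 + (ec50 / c) ** slope)

      popt, _ = curve_fit(hill, rel_conc, fold_change, p0=(1.0, 4.0, 3.0, 1.0), maxfev=5000)
      print(f"EC50 (relative concentration units): {popt[2]:.2f}")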

  2. VVER-1000 MOX core computational benchmark

    International Nuclear Information System (INIS)

    2006-01-01

    The NEA Nuclear Science Committee has established an Expert Group that deals with the status and trends of reactor physics, fuel performance and fuel cycle issues related to disposing of weapons-grade plutonium in mixed-oxide fuel. The objectives of the group are to provide NEA member countries with up-to-date information on, and to develop consensus regarding, core and fuel cycle issues associated with burning weapons-grade plutonium in thermal water reactors (PWR, BWR, VVER-1000, CANDU) and fast reactors (BN-600). These issues concern core physics, fuel performance and reliability, and the capability and flexibility of thermal water reactors and fast reactors to dispose of weapons-grade plutonium in standard fuel cycles. The activities of the NEA Expert Group on Reactor-based Plutonium Disposition are carried out in close co-operation (jointly, in most cases) with the NEA Working Party on Scientific Issues in Reactor Systems (WPRS). A prominent part of these activities includes benchmark studies. At the time of preparation of this report, the following benchmarks were completed or in progress: VENUS-2 MOX Core Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); VVER-1000 LEU and MOX Benchmark (completed); KRITZ-2 Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); Hollow and Solid MOX Fuel Behaviour Benchmark (completed); PRIMO MOX Fuel Performance Benchmark (ongoing); VENUS-2 MOX-fuelled Reactor Dosimetry Calculation (ongoing); VVER-1000 In-core Self-powered Neutron Detector Calculational Benchmark (started); MOX Fuel Rod Behaviour in Fast Power Pulse Conditions (started); Benchmark on the VENUS Plutonium Recycling Experiments Configuration 7 (started). This report describes the detailed results of the benchmark investigating the physics of a whole VVER-1000 reactor core using two-thirds low-enriched uranium (LEU) and one-third MOX fuel. It contributes to the computer code certification process and to the

  3. Assessment of diurnal systemic dose of agrochemicals in regulatory toxicity testing--an integrated approach without additional animal use.

    Science.gov (United States)

    Saghir, Shakil A; Bartels, Michael J; Rick, David L; McCoy, Alene T; Rasoulpour, Reza J; Ellis-Hutchings, Robert G; Sue Marty, M; Terry, Claire; Bailey, Jason P; Billington, Richard; Bus, James S

    2012-07-01

    Integrated toxicokinetics (TK) data provide information on the rate, extent and duration of systemic exposure across doses, species, strains, gender, and life stages within a toxicology program. While routine for pharmaceuticals, TK assessments of non-pharmaceuticals are still relatively rare, and have never before been included in a full range of guideline studies for a new agrochemical. In order to better understand the relationship between diurnal systemic dose (AUC(24h)) and toxicity of agrochemicals, TK analyses in the study animals is now included in all short- (excluding acute), medium- and long-term guideline mammalian toxicity studies including reproduction/developmental tests. This paper describes a detailed procedure for the implementation of TK in short-, medium- and long-term regulatory toxicity studies, without the use of satellite animals, conducted on three agrochemicals (X11422208, 2,4-D and X574175). In these studies, kinetically-derived maximum doses (KMD) from short-term studies instead of, or along with, maximum tolerated doses (MTD) were used for the selection of the high dose in subsequent longer-term studies. In addition to leveraging TK data to guide dose level selection, the integrated program was also used to select the most appropriate method of oral administration (i.e., gavage versus dietary) of test materials for rat and rabbit developmental toxicity studies. The integrated TK data obtained across toxicity studies (without the use of additional/satellite animals) provided data critical to understanding differences in response across doses, species, strains, sexes, and life stages. Such data should also be useful in mode of action studies and to improve human risk assessments. Copyright © 2012 Elsevier Inc. All rights reserved.
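
    The diurnal systemic dose referred to above, AUC(24h), is in essence the area under the plasma concentration-time curve over a day; with sparse sampling it is commonly approximated by the trapezoidal rule. A sketch with invented concentrations (not data from these studies):

      # Hypothetical sketch: AUC(24h) by the linear trapezoidal rule from sparse
      # plasma concentration-time samples (times in hours, concentrations in ug/mL).
      import numpy as np

      time_h = np.array([0, 1, 2, 4, 8, 12, 24])
      conc = np.array([0.0, 3.2, 4.1, 3.0, 1.6, 0.9, 0.2])   # invented values

      auc_24h = np.trapz(conc, time_h)
      print(f"AUC(24h) ≈ {auc_24h:.1f} ug·h/mL")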

  4. Theory-motivated benchmark models and superpartners at the Fermilab Tevatron

    International Nuclear Information System (INIS)

    Kane, G.L.; Nelson, Brent D.; Wang Liantao; Wang, Ting T.; Lykken, J.; Mrenna, Stephen

    2003-01-01

    Recently published benchmark models have contained rather heavy superpartners. To test the robustness of this result, several benchmark models have been constructed based on theoretically well-motivated approaches, particularly string-based ones. These include variations on anomaly- and gauge-mediated models, as well as gravity mediation. The resulting spectra often have light gauginos that are produced in significant quantities at the Fermilab Tevatron collider, or will be at a 500 GeV linear collider. The signatures also provide interesting challenges for the CERN LHC. In addition, these models are capable of accounting for electroweak symmetry breaking with less severe cancellations among soft supersymmetry breaking parameters than previous benchmark models

  5. Communication: energy benchmarking with quantum Monte Carlo for water nano-droplets and bulk liquid water.

    Science.gov (United States)

    Alfè, D; Bartók, A P; Csányi, G; Gillan, M J

    2013-06-14

    We show the feasibility of using quantum Monte Carlo (QMC) to compute benchmark energies for configuration samples of thermal-equilibrium water clusters and the bulk liquid containing up to 64 molecules. Evidence that the accuracy of these benchmarks approaches that of basis-set converged coupled-cluster calculations is noted. We illustrate the usefulness of the benchmarks by using them to analyze the errors of the popular BLYP approximation of density functional theory (DFT). The results indicate the possibility of using QMC as a routine tool for analyzing DFT errors for non-covalent bonding in many types of condensed-phase molecular system.

  6. 8-MOP PUVA for psoriasis: a comparison of a minimal phototoxic dose-based regimen with a skin-type approach

    Energy Technology Data Exchange (ETDEWEB)

    Collins, P.; Wainwright, N.J.; Amorim, I.; Lakshmipathi, T.; Ferguson, J. [Ninewells Hospital and Medical School, Dundee (United Kingdom)

    1996-08-01

    Two ultraviolet A (UVA) regimens for oral 8-methoxypsoralen (8-MOP) photochemotherapy (PUVA) for moderate/severe chronic plaque psoriasis using a half-body study technique were compared. Each patient received both regimens. A higher-dose regimen based on minimal phototoxic dose (MPD) with percentage incremental increases was given to one-half of the body. The other half received a lower dose regimen based on skin type with fixed incremental UVA increases. Patients were treated twice weekly. Symmetrical plaques were scored to determine the rate of resolution with each regimen. In addition, the number of treatments, cumulative UVA dose and number of days in treatment to achieve overall clearance were recorded. Patients were reviewed monthly for one year to record remission data. Thirty-three patients completed the study. Both regimens were effective and well tolerated. With the MPD-based approach, number of exposures was significantly less for patients with skin types I and II but not III. Although the cumulative UVA dose was higher with the MPD regimen for all skin types studied, the reduced number of exposures required for clearance for skin types I and II but not III, combined with the security of individualized MPD testing, has practical attractions. MPD testing also identified five patients who required an increased psoralen dose and six patients who required a reduction of the initial UVA dose with the skin type regimen. Forty-two percent were still clear 1 year after treatment and there was no significant difference in the number of days in remission between the regimens for those whose psoriasis had recurred. The reduction in the number of exposures required for clearance with the MPD-based regimen may be safer and more cost effective in the long term. (author).

  7. A methodological approach to a realistic evaluation of skin absorbed doses during manipulation of radioactive sources by means of GAMOS Monte Carlo simulations

    Science.gov (United States)

    Italiano, Antonio; Amato, Ernesto; Auditore, Lucrezia; Baldari, Sergio

    2018-05-01

    The accurate evaluation of the radiation burden associated with radiation absorbed doses to the skin of the extremities during the manipulation of radioactive sources is a critical issue in operational radiological protection, deserving the most accurate calculation approaches available. Monte Carlo simulation of the radiation transport and interaction is the gold standard for the calculation of dose distributions in complex geometries and in presence of extended spectra of multi-radiation sources. We propose the use of Monte Carlo simulations in GAMOS, in order to accurately estimate the dose to the extremities during manipulation of radioactive sources. We report the results of these simulations for 90Y, 131I, 18F and 111In nuclides in water solutions enclosed in glass or plastic receptacles, such as vials or syringes. Skin equivalent doses at 70 μm of depth and dose-depth profiles are reported for different configurations, highlighting the importance of adopting a realistic geometrical configuration in order to get accurate dosimetric estimations. Due to the easiness of implementation of GAMOS simulations, case-specific geometries and nuclides can be adopted and results can be obtained in less than about ten minutes of computation time with a common workstation.

  8. Radiation dose and intra-articular access: comparison of the lateral mortise and anterior midline approaches to fluoroscopically guided tibiotalar joint injections

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Ambrose J.; Torriani, Martin; Bredella, Miriam A.; Chang, Connie Y.; Simeone, Frank J.; Palmer, William E. [Massachusetts General Hospital, Department of Radiology, Division of Musculoskeletal Imaging and Intervention, Boston, MA (United States); Balza, Rene [Centro Medico de Occidente, Department of Radiology, Maracaibo (Venezuela, Bolivarian Republic of)

    2016-03-15

    To compare the lateral mortise and anterior midline approaches to fluoroscopically guided tibiotalar joint injections with respect to successful intra-articular needle placement, fluoroscopy time, radiation dose, and dose area product (DAP). This retrospective study was IRB-approved and HIPAA-compliant. A total of 498 fluoroscopically guided tibiotalar joint injections were performed or supervised by one of nine staff radiologists from 11/1/2010-12/31/2013. The injection approach was determined by operator preference. Images were reviewed on a PACS workstation to determine the injection approach (lateral mortise versus anterior midline) and to confirm intra-articular needle placement. Fluoroscopy time (minutes), radiation dose (mGy), and DAP (μGy·m²) were recorded and compared using Student's t-test (fluoroscopy time) or the Wilcoxon rank sum test (radiation dose and DAP). There were 246 lateral mortise injections and 252 anterior midline injections. Two lateral mortise injections were excluded from further analysis because no contrast was administered. Intra-articular location of the needle tip was documented in 242/244 lateral mortise injections and 252/252 anterior midline injections. Mean fluoroscopy time was shorter for the lateral mortise group than the anterior midline group (0.7 ± 0.5 min versus 1.2 ± 0.8 min, P < 0.0001). Mean radiation dose and DAP were less for the lateral mortise group than the anterior midline group (2.1 ± 3.7 mGy versus 2.5 ± 3.5 mGy, P = 0.04; 11.5 ± 15.3 μGy·m² versus 13.5 ± 17.3 μGy·m², P = 0.006). Both injection approaches resulted in nearly 100 % rates of intra-articular needle placement, but the lateral mortise approach used approximately 40 % less fluoroscopy time and delivered 15 % lower radiation dose and DAP to the patient. (orig.)
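
    The statistical comparisons named above are routine two-sample tests; a sketch on invented samples (not the study's raw data), with Student's t-test for fluoroscopy time and the Wilcoxon rank-sum test for dose:

      # Sketch of the two comparisons named above, on invented samples.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      time_lateral = rng.normal(0.7, 0.5, 244).clip(min=0.1)    # minutes, hypothetical
      time_anterior = rng.normal(1.2, 0.8, 252).clip(min=0.1)

      dose_lateral = rng.lognormal(mean=0.3, sigma=0.9, size=244)    # mGy, hypothetical
      dose_anterior = rng.lognormal(mean=0.5, sigma=0.9, size=252)

      t_stat, t_p = stats.ttest_ind(time_lateral, time_anterior)
      w_stat, w_p = stats.ranksums(dose_lateral, dose_anterior)
      print(f"t-test p = {t_p:.4f}; Wilcoxon rank-sum p = {w_p:.4f}")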

  9. The cyclophosphamide equivalent dose as an approach for quantifying alkylating agent exposure: a report from the Childhood Cancer Survivor Study.

    Science.gov (United States)

    Green, Daniel M; Nolan, Vikki G; Goodman, Pamela J; Whitton, John A; Srivastava, DeoKumar; Leisenring, Wendy M; Neglia, Joseph P; Sklar, Charles A; Kaste, Sue C; Hudson, Melissa M; Diller, Lisa R; Stovall, Marilyn; Donaldson, Sarah S; Robison, Leslie L

    2014-01-01

    Estimation of the risk of adverse long-term outcomes such as second malignant neoplasms and infertility often requires reproducible quantification of exposures. The method for quantification should be easily utilized and valid across different study populations. The widely used Alkylating Agent Dose (AAD) score is derived from the drug dose distribution of the study population and thus cannot be used for comparisons across populations as each will have a unique distribution of drug doses. We compared the performance of the Cyclophosphamide Equivalent Dose (CED), a unit for quantifying alkylating agent exposure independent of study population, to the AAD. Comparisons included associations from three Childhood Cancer Survivor Study (CCSS) outcome analyses, receiver operator characteristic (ROC) curves and goodness of fit based on the Akaike's Information Criterion (AIC). The CED and AAD performed essentially identically in analyses of risk for pregnancy among the partners of male CCSS participants, risk for adverse dental outcomes among all CCSS participants and risk for premature menopause among female CCSS participants, based on similar associations, lack of statistically significant differences between the areas under the ROC curves and similar model fit values for the AIC between models including the two measures of exposure. The CED is easily calculated, facilitating its use for patient counseling. It is independent of the drug dose distribution of a particular patient population, a characteristic that will allow direct comparisons of outcomes among epidemiological cohorts. We recommend the use of the CED in future research assessing cumulative alkylating agent exposure. © 2013 Wiley Periodicals, Inc.

  10. Stepped-irradiation SAR: A viable approach to circumvent OSL equivalent dose underestimation in last glacial loess of northwestern China

    International Nuclear Information System (INIS)

    Qin, J.T.; Zhou, L.P.

    2009-01-01

    The equivalent dose (De) obtained with the continuous irradiation SAR (CI-SAR) protocol for fine-grained quartz from loess of northwestern China is found to be lower than the expected value for samples older than 70 ka based on the regional stratigraphy. This is attributed to the difference in the response of the quartz to natural radiation and laboratory beta irradiation whose rates vary by ∼10⁸ times. A stepped irradiation SAR protocol was employed to evaluate the influence of such a 'dose rate effect' on the equivalent dose determination. After investigating the effects of thermal treatment and 'unit-dose' on OSL signal and De, we refined the stepped irradiation strategy with a 'unit-dose' of ∼25 Gy and successive thermal treatments at 250 °C for 10 s, and applied it to the SAR protocol. This stepped irradiation SAR (SI-SAR) protocol led to a 20%-70% increase in De value for loess deposited during the early last glacial period.

  11. Framework for benchmarking online retailing performance using fuzzy AHP and TOPSIS method

    Directory of Open Access Journals (Sweden)

    M. Ahsan Akhtar Hasin

    2012-08-01

    Full Text Available Due to the increasing penetration of internet connectivity, on-line retail is growing from the pioneer phase to increasing integration within people's lives and companies' normal business practices. In an increasingly competitive environment, on-line retail service providers require a systematic and structured approach to gain a cutting edge over rivals. Thus, the use of benchmarking has become indispensable for accomplishing superior performance among on-line retail service providers. This paper uses the fuzzy analytic hierarchy process (FAHP) approach to support a generic on-line retail benchmarking process. Critical success factors for on-line retail service have been identified from a structured questionnaire and the literature and prioritized using fuzzy AHP. Using these critical success factors, the performance levels of ORENET, an on-line retail service provider, are benchmarked along with those of four other on-line service providers using the TOPSIS method. Based on the benchmark, their relative ranking is also illustrated.
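
    A minimal sketch of the TOPSIS ranking step is given below; the fuzzy AHP weighting and the actual success criteria are not reproduced from the record, so the weights and decision matrix are invented and all criteria are treated as benefit criteria.

      # Minimal TOPSIS ranking sketch: rows = on-line retail providers,
      # columns = success criteria (all treated as benefit criteria here).
      import numpy as np

      scores = np.array([            # invented decision matrix (5 providers x 4 criteria)
          [7, 9, 6, 8],
          [8, 7, 7, 6],
          [6, 8, 9, 7],
          [9, 6, 5, 9],
          [7, 7, 8, 8],
      ], dtype=float)
      weights = np.array([0.4, 0.3, 0.2, 0.1])     # e.g. from fuzzy AHP (hypothetical values)

      norm = scores / np.sqrt((scores ** 2).sum(axis=0))     # vector normalisation
      weighted = norm * weights
      ideal, anti_ideal = weighted.max(axis=0), weighted.min(axis=0)
      d_plus = np.sqrt(((weighted - ideal) ** 2).sum(axis=1))
      d_minus = np.sqrt(((weighted - anti_ideal) ** 2).sum(axis=1))
      closeness = d_minus / (d_plus + d_minus)
      print("ranking (best first):", np.argsort(-closeness))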

  12. Experience with in vivo diode dosimetry for verifying radiotherapy dose delivery: Practical implementation of cost-effective approaches

    International Nuclear Information System (INIS)

    Thwaites, D.I.; Blyth, C.; Carruthers, L.; Elliott, P.A.; Kidane, G.; Millwater, C.J.; MacLeod, A.S.; Paolucci, M.; Stacey, C.

    2002-01-01

    A systematic programme of in vivo dosimetry using diodes to verify radiotherapy delivered doses began in Edinburgh in 1992. The aims were to investigate the feasibility of routine systematic use of diodes as part of a comprehensive QA programme, to carry out clinical pilot studies to assess the accuracy of dose delivery on each machine and for each site and technique, to identify and rectify systematic deviations, to assess departmental dosimetric precision and to compare to clinical requirements. A further aim was to carry out a cost-benefit evaluation based on the results from the pilot studies to consider how best to use diodes routinely

  13. A MATHEMATICAL APPROACH TO ECONOMY OF EXPERIMENT IN DETERMINATIONS OF THE DIFFERENTIAL DOSE ALBEDO OF GAMMA RAYS

    Energy Technology Data Exchange (ETDEWEB)

    Shoemaker, N. F.; Huddleston, C. M.

    1962-12-10

    Treatments of the differential dose albedo of gamma rays on concrete have supposed that the albedo value is a function of: the energy of the incident gamma radiation, the polar angle of incidence, the polar angle of reflection (or scatter), and the azimuthal angle of reflection. It is demonstrated that, if certain reasonable assumptions are made regarding the mechanism of reflection, it is not necessary to investigate variations in albedo with the azimuthal angle of reflection. Once the differential dose albedo has been determined for a complete set of incident and reflected polar angles at zero azimuth, the albedo at any azimuth can be derived by a suitable transformation. (auth)

  14. Utilizing Benchmarking to Study the Effectiveness of Parent-Child Interaction Therapy Implemented in a Community Setting

    Science.gov (United States)

    Self-Brown, Shannon; Valente, Jessica R.; Wild, Robert C.; Whitaker, Daniel J.; Galanter, Rachel; Dorsey, Shannon; Stanley, Jenelle

    2012-01-01

    Benchmarking is a program evaluation approach that can be used to study whether the outcomes of parents/children who participate in an evidence-based program in the community approximate the outcomes found in randomized trials. This paper presents a case illustration using benchmarking methodology to examine a community implementation of…

  15. What Randomized Benchmarking Actually Measures

    International Nuclear Information System (INIS)

    Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; Sarovar, Mohan; Blume-Kohout, Robin

    2017-01-01

    Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. Here, these theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
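
    The quantity r discussed above is conventionally extracted by fitting the exponential RB decay; a sketch of that standard (zeroth-order) analysis, on invented single-qubit survival data rather than the improved theories described in the record, is:

      # Standard RB analysis sketch: fit P(m) = A * p**m + B to (invented)
      # survival probabilities and convert the decay parameter p to r.
      import numpy as np
      from scipy.optimize import curve_fit

      lengths = np.array([2, 4, 8, 16, 32, 64, 128, 256])
      survival = np.array([0.98, 0.97, 0.95, 0.91, 0.84, 0.73, 0.60, 0.52])  # hypothetical

      def rb_decay(m, a, p, b):
          return a * p ** m + b

      (a, p, b), _ = curve_fit(rb_decay, lengths, survival, p0=(0.5, 0.99, 0.5))
      d = 2                                  # single-qubit Hilbert-space dimension
      r = (1 - p) * (d - 1) / d
      print(f"decay parameter p = {p:.4f}, RB error rate r = {r:.2e}")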

  16. Benchmarking Commercial Conformer Ensemble Generators.

    Science.gov (United States)

    Friedrich, Nils-Ole; de Bruyn Kops, Christina; Flachsenberg, Florian; Sommer, Kai; Rarey, Matthias; Kirchmair, Johannes

    2017-11-27

    We assess and compare the performance of eight commercial conformer ensemble generators (ConfGen, ConfGenX, cxcalc, iCon, MOE LowModeMD, MOE Stochastic, MOE Conformation Import, and OMEGA) and one leading free algorithm, the distance geometry algorithm implemented in RDKit. The comparative study is based on a new version of the Platinum Diverse Dataset, a high-quality benchmarking dataset of 2859 protein-bound ligand conformations extracted from the PDB. Differences in the performance of commercial algorithms are much smaller than those observed for free algorithms in our previous study (J. Chem. Inf. 2017, 57, 529-539). For commercial algorithms, the median minimum root-mean-square deviations measured between protein-bound ligand conformations and ensembles of a maximum of 250 conformers are between 0.46 and 0.61 Å. Commercial conformer ensemble generators are characterized by their high robustness, with at least 99% of all input molecules successfully processed and few or even no substantial geometrical errors detectable in their output conformations. The RDKit distance geometry algorithm (with minimization enabled) appears to be a good free alternative since its performance is comparable to that of the midranked commercial algorithms. Based on a statistical analysis, we elaborate on which algorithms to use and how to parametrize them for best performance in different application scenarios.
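
    For orientation, conformer ensemble generation with the free RDKit distance geometry algorithm mentioned above can be sketched as follows; the molecule, ensemble size and settings are illustrative and not the parametrization used in the study.

      # Sketch: RDKit distance geometry (ETKDG) conformer generation with
      # force-field minimization enabled, as in the variant mentioned above.
      from rdkit import Chem
      from rdkit.Chem import AllChem

      mol = Chem.AddHs(Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin, as an example
      params = AllChem.ETKDG()
      params.randomSeed = 42
      cids = AllChem.EmbedMultipleConfs(mol, numConfs=250, params=params)

      results = AllChem.MMFFOptimizeMoleculeConfs(mol)   # (not_converged, energy) per conformer
      n_ok = sum(1 for flag, _ in results if flag == 0)
      print(f"{len(cids)} conformers generated; {n_ok} minimizations converged")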

  17. Benchmark tests of JENDL-1

    International Nuclear Information System (INIS)

    Kikuchi, Yasuyuki; Hasegawa, Akira; Takano, Hideki; Kamei, Takanobu; Hojuyama, Takeshi; Sasaki, Makoto; Seki, Yuji; Zukeran, Atsushi; Otake, Iwao.

    1982-02-01

    Various benchmark tests were made on JENDL-1. At the first stage, various core center characteristics were tested for many critical assemblies with a one-dimensional model. At the second stage, the applicability of JENDL-1 was further tested on more sophisticated problems for the MOZART and ZPPR-3 assemblies with a two-dimensional model. It was proved that JENDL-1 predicted various quantities of fast reactors satisfactorily as a whole. However, the following problems were pointed out: 1) There exists a discrepancy of 0.9% in the k_eff values between the Pu- and U-cores. 2) The fission rate ratio of 239Pu to 235U is underestimated by 3%. 3) The Doppler reactivity coefficients are overestimated by about 10%. 4) The control rod worths are underestimated by 4%. 5) The fission rates of 235U and 239Pu are underestimated considerably in the outer core and radial blanket regions. 6) The negative sodium void reactivities are overestimated when the sodium is removed from the outer core. As a whole, most of the problems of JENDL-1 seem to be related to the neutron leakage and the neutron spectrum. It was found through further study that most of these problems came from too small diffusion coefficients and too large elastic removal cross sections above 100 keV, which might probably be caused by overestimation of the total and elastic scattering cross sections for structural materials in the unresolved resonance region up to several MeV. (author)

  18. Performance of Multi-chaotic PSO on a shifted benchmark functions set

    Energy Technology Data Exchange (ETDEWEB)

    Pluhacek, Michal; Senkerik, Roman; Zelinka, Ivan [Tomas Bata University in Zlín, Faculty of Applied Informatics Department of Informatics and Artificial Intelligence nám. T.G. Masaryka 5555, 760 01 Zlín (Czech Republic)

    2015-03-10

    In this paper, the performance of the Multi-chaotic PSO algorithm is investigated using two shifted benchmark functions. The purpose of shifted benchmark functions is to simulate time-variant real-world problems. The results of the chaotic PSO are compared with the canonical version of the algorithm. It is concluded that using the multi-chaotic approach can lead to better results in the optimization of shifted functions.
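
    The record does not name the two shifted functions used; as a sketch of the general idea, the following runs a canonical (non-chaotic) PSO on a shifted sphere function, where the shift moves the optimum away from the origin as in shifted benchmark sets.

      # Sketch: canonical PSO on a shifted sphere benchmark; not the paper's
      # test functions or its multi-chaotic pseudo-random number generators.
      import numpy as np

      rng = np.random.default_rng(3)
      dim, n_particles, iterations = 10, 30, 200
      shift = rng.uniform(-50, 50, dim)                 # shifted optimum location

      def shifted_sphere(x):
          z = x - shift
          return float(np.dot(z, z))

      pos = rng.uniform(-100, 100, (n_particles, dim))
      vel = np.zeros_like(pos)
      pbest = pos.copy()
      pbest_val = np.array([shifted_sphere(p) for p in pos])
      gbest = pbest[pbest_val.argmin()].copy()

      w, c1, c2 = 0.729, 1.494, 1.494                   # common canonical coefficients
      for _ in range(iterations):
          r1, r2 = rng.random((2, n_particles, dim))
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos = pos + vel
          vals = np.array([shifted_sphere(p) for p in pos])
          improved = vals < pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
          gbest = pbest[pbest_val.argmin()].copy()

      print(f"best value found: {pbest_val.min():.3e}")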

  19. COMBINING THE CONCEPTS OF BENCHMARKING AND MATRIX GAME IN MARKETING (RE)POSITIONING OF SEAPORTS

    OpenAIRE

    Senka Sekularac-Ivošević; Sanja Bauk; Mirjana Gligorijević

    2013-01-01

    This paper considers the effects of combination of two different approaches in developing seaports positioning strategy. The first one is based on comparing the most important quantitative and qualitative seaports choice criteria by benchmarking method. Benchmarking has been used in creating the appropriate model for efficient marketing positioning of Aegean, Adriatic and Black Sea seaports. The criteria that describe the degree of these seaports competitiveness are chosen upon the investigat...

  20. Performance of Multi-chaotic PSO on a shifted benchmark functions set

    International Nuclear Information System (INIS)

    Pluhacek, Michal; Senkerik, Roman; Zelinka, Ivan

    2015-01-01

    In this paper, the performance of the Multi-chaotic PSO algorithm is investigated using two shifted benchmark functions. The purpose of shifted benchmark functions is to simulate time-variant real-world problems. The results of the chaotic PSO are compared with the canonical version of the algorithm. It is concluded that using the multi-chaotic approach can lead to better results in the optimization of shifted functions.