WorldWideScience

Sample records for benchmark dose approach

  1. Dose-response assessment using the benchmark dose approach of changes in hepatic EROD activity for individual polychlorinated biphenyl congeners

    Energy Technology Data Exchange (ETDEWEB)

    Fattore, E.; Fanelli, R. ['Mario Negri' Institute for Pharmacological Research, Milan (Italy)]; Chu, I. [Safe Environments Programme, Healthy Environments and Consumer Safety Branch, Tunney's Pasture, Ottawa, ON (Canada)]; Sand, S.; Haakansson, H. [Institute of Environmental Medicine, Karolinska Institutet, Stockholm (Sweden)]; Falk-Filippson, A. [Swedish Chemicals Inspectorate, Sundbyberg (Sweden)]

    2004-09-15

    The benchmark dose (BMD) approach was proposed as an alternative to the no-observed-adverse-effect-level (NOAEL) or the lowest-observed-adverse-effect-level (LOAEL) as the point of departure (POD) for extrapolation of data from animal studies to the low-dose human exposure situation. In the risk assessment process using the NOAEL/LOAEL parameter, the reference dose (RfD) or the acceptable daily intake (ADI) is obtained by dividing the NOAEL/LOAEL value by uncertainty factors. The uncertainty factors are incorporated in order to take into account variability in the sensitivity of different species, inter-individual differences in sensitivity within the human population, and variability in experimental data. In the BMD approach a dose-response curve is fitted to the experimental data (Figure 1) and the BMD is calculated from the equation of the curve as the dose corresponding to a predetermined change in the response, defined as the benchmark response (BMR). The 95% lower confidence bound of the BMD, usually referred to as the BMDL, can be used as the POD in the extrapolation process to obtain an RfD or an ADI. The advantages of using the BMD approach are many. First, all the experimental data are utilized to construct the dose-response curve; second, variability and uncertainty are taken into account by incorporating the standard deviations of the means; and third, it represents a single methodology for cancer and noncancer endpoints. In this study the BMD methodology was applied to evaluate dose-response data for seven chlorinated biphenyl (CB) congeners (Table 1), some of which are dioxin-like while others are not. The data were obtained from subchronic dietary exposure studies in male and female Sprague Dawley rats. Elevation of ethoxyresorufin-O-deethylase (EROD) activity was selected as the biological response because it is known to be an endpoint sensitive to exposure to dioxin-like PCBs. Since this response is not an adverse effect per se, in this paper we will refer to the no
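
    The general workflow just described (fit a dose-response model, read the BMD off the fitted curve at a chosen BMR, take its lower confidence bound as the POD, then apply uncertainty factors) can be sketched numerically. The following Python snippet is an illustrative sketch only: the data, the exponential model and the covariance-based lower bound are assumptions for demonstration, not the models or results of this study.

      # Illustrative BMD/BMDL calculation on hypothetical continuous dose-response data.
      import numpy as np
      from scipy.optimize import curve_fit

      doses = np.array([0.0, 0.1, 0.3, 1.0, 3.0, 10.0])      # dose (mg/kg/day), hypothetical
      resp  = np.array([1.0, 1.0, 1.1, 1.3, 2.0, 9.2])       # mean EROD fold-change, hypothetical
      sd    = np.array([0.10, 0.10, 0.10, 0.15, 0.30, 1.0])  # SD of each group mean, hypothetical

      def expo(d, a, b):
          # Simple exponential dose-response: f(d) = a * exp(b * d)
          return a * np.exp(b * d)

      popt, pcov = curve_fit(expo, doses, resp, p0=[1.0, 0.2], sigma=sd, absolute_sigma=True)

      def bmd_at(a, b, bmr=0.10):
          # Dose giving a `bmr` relative increase over background: a*exp(b*BMD) = a*(1 + bmr)
          return np.log(1.0 + bmr) / b

      bmd = bmd_at(*popt)

      # Crude BMDL: 5th percentile of the BMD over parameter draws from the fit covariance
      # (a stand-in for a proper profile-likelihood or bootstrap lower bound).
      rng = np.random.default_rng(1)
      draws = rng.multivariate_normal(popt, pcov, size=2000)
      bmdl = np.percentile([bmd_at(a, b) for a, b in draws if b > 0], 5)

      print(f"BMD10  = {bmd:.3f} mg/kg/day")
      print(f"BMDL10 = {bmdl:.3f} mg/kg/day  # candidate POD; divide by uncertainty factors for an RfD/ADI")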

  2. Benchmark Dose Modeling

    Science.gov (United States)

    Finite doses are employed in experimental toxicology studies. Under the traditional methodology, the point of departure (POD) value for low dose extrapolation is identified as one of these doses. Dose spacing necessarily precludes a more accurate description of the POD value. ...

  3. Avoiding Pitfalls in the Use of the Benchmark Dose Approach to Chemical Risk Assessments; Some Illustrative Case Studies (Presentation)

    Science.gov (United States)

    The USEPA's benchmark dose software (BMDS) version 1.2 has been available over the Internet since April, 2000 (epa.gov/ncea/bmds.htm), and has already been used in risk assessments of some significant environmental pollutants (e.g., diesel exhaust, dichloropropene, hexachlorocycl...

  4. Simple benchmark for complex dose finding studies.

    Science.gov (United States)

    Cheung, Ying Kuen

    2014-06-01

    While a general goal of early phase clinical studies is to identify an acceptable dose for further investigation, modern dose finding studies and designs are highly specific to individual clinical settings. In addition, as outcome-adaptive dose finding methods often involve complex algorithms, it is crucial to have diagnostic tools to evaluate the plausibility of a method's simulated performance and the adequacy of the algorithm. In this article, we propose a simple technique that provides an upper limit, or a benchmark, of accuracy for dose finding methods for a given design objective. The proposed benchmark is nonparametric optimal in the sense of O'Quigley et al. (2002, Biostatistics 3, 51-56), and is demonstrated by examples to be a practical accuracy upper bound for model-based dose finding methods. We illustrate the implementation of the technique in the context of phase I trials that consider multiple toxicities and phase I/II trials where dosing decisions are based on both toxicity and efficacy, and apply the benchmark to several clinical examples considered in the literature. By comparing the operating characteristics of a dose finding method to that of the benchmark, we can form quick initial assessments of whether the method is adequately calibrated and evaluate its sensitivity to the dose-outcome relationships.

  5. Effects of exposure imprecision on estimation of the benchmark dose

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe

    2004-01-01

    approach is one of the most widely used methods for development of exposure limits. An important advantage of this approach is that it can be applied to observational data. However, in this type of data, exposure markers are seldom measured without error. It is shown that, if the exposure error is ignored......, then the benchmark approach produces results that are biased toward higher and less protective levels. It is therefore important to take exposure measurement error into account when calculating benchmark doses. Methods that allow this adjustment are described and illustrated in data from an epidemiological study...

  6. Effects of Exposure Imprecision on Estimation of the Benchmark Dose

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe

    Environmental epidemiology; exposure measurement error; effect of prenatal mercury exposure; exposure standards; benchmark dose...

  7. Benchmark dose profiles for joint-action continuous data in quantitative risk assessment.

    Science.gov (United States)

    Deutsch, Roland C; Piegorsch, Walter W

    2013-09-01

    Benchmark analysis is a widely used tool in biomedical and environmental risk assessment. Therein, estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a prespecified benchmark response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This paper demonstrates how the benchmark modeling paradigm can be expanded from the single-agent setting to joint-action, two-agent studies. Focus is on continuous response outcomes. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile (a two-dimensional analog of the single-dose BMD at which both agents achieve the specified BMR) is defined for use in quantitative risk characterization and assessment.
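
    To make the benchmark-profile idea concrete, the short Python sketch below traces the set of dose pairs at which a hypothetical joint-action continuous model reaches a 10% change in the mean. The model form, its coefficients and the BMR are assumptions for illustration only, not values from the paper.

      # Benchmark profile (BMP) sketch for an assumed joint-action continuous model.
      import numpy as np

      # Hypothetical linearized joint-action model: mean = f0 * (1 + b1*d1 + b2*d2 + b12*d1*d2)
      f0, b1, b2, b12 = 1.0, 0.04, 0.08, 0.002
      bmr = 0.10   # 10% relative change in the mean defines the benchmark response

      def relative_change(d1, d2):
          return b1 * d1 + b2 * d2 + b12 * d1 * d2

      # The BMP is the set of dose pairs (d1, d2) where the relative change equals the BMR:
      # for each d1, solve (b2 + b12*d1) * d2 = bmr - b1*d1 for d2.
      for d1 in np.linspace(0.0, bmr / b1, 6):
          d2 = (bmr - b1 * d1) / (b2 + b12 * d1)
          print(f"d1 = {d1:4.1f}, d2 = {d2:5.2f}, relative change = {relative_change(d1, d2):.3f}")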

  8. An Effective Approach for Benchmarking Implementation

    OpenAIRE

    B. M. Deros; Tan, J.; M.N.A. Rahman; N. A.Q.M. Daud

    2011-01-01

    Problem statement: The purpose of this study is to present a benchmarking guideline, conceptual framework and computerized mini program to assist companies in achieving better performance in terms of quality, cost, delivery and supply chain, and eventually increase their competitiveness in the market. The study begins with a literature review on benchmarking definitions, barriers to and advantages of implementation, and benchmarking frameworks. Approach: Thirty res...

  9. Benchmark dose approach for low-level lead induced haematogenesis inhibition and associations of childhood intelligences with ALAD activity and ALA levels.

    Science.gov (United States)

    Wang, Q; Ye, L X; Zhao, H H; Chen, J W; Zhou, Y K

    2011-04-15

    Lead (Pb) levels, delta-aminolevulinic acid dehydratase (ALAD) activities, zinc protoporphyrin (ZPP) levels in blood, and urinary delta-aminolevulinic acid (ALA) and coproporphyrin (CP) concentrations were measured for 318 environmentally Pb-exposed children recruited from an area of southeast China. The mean blood lead (PbB) level was 75.0 μg/L among all subjects. The benchmark dose (BMD) method gave a lower PbB BMD (lower bound of the BMD in parentheses) of 32.4 μg/L (22.7) based on ALAD activity than the BMDs based on the other three haematological indices, corresponding to a benchmark response of 1%. Childhood intelligence scores were not significantly associated with ALAD activities or ALA levels. It was concluded that blood ALAD activity is a sensitive indicator of early haematological damage due to low-level Pb exposure in children.

  10. Benchmark dose of lead inducing anemia at the workplace.

    Science.gov (United States)

    Karita, Kanae; Yano, Eiji; Dakeishi, Miwako; Iwata, Toyoto; Murata, Katsuyuki

    2005-08-01

    To estimate the critical dose of lead inducing anemia in humans, the effects of lead on hemoglobin (Hb) and hematocrit (Hct) levels and red blood cell (RBC) count were examined in 388 male lead-exposed workers with blood lead (BPb) levels of 0.05-5.5 (mean 1.3) micromol/L by using the benchmark dose (BMD) approach. The BPb level was significantly related to Hb (regression coefficient beta = -0.276), RBC (beta = -11.35), and Hct (beta = -0.563) among the workers. The mean BPb level was significantly higher in workers with anemia (1.85 micromol/L), based on the WHO criteria, than in those without anemia (1.26 micromol/L). The benchmark dose levels of BPb (i.e., lower 95% confidence limits of the BMD), calculated from the k-power model with an abnormal probability of 5% in unexposed workers and an excess risk of 5% in exposed workers, were estimated to be 0.94 micromol/L (19.5 microg/dl) for Hb, 0.94 micromol/L (19.4 microg/dl) for RBC, and 1.43 micromol/L (29.6 microg/dl) for Hct. These findings suggest that reduction in hematopoietic indicators may be initiated at BPb levels below the level currently considered to be without effect.
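
    The calculation behind these estimates (a cut-off chosen so that 5% of unexposed workers are classified abnormal, and a BMD defined as the blood lead level adding a further 5% risk) can be illustrated with the hybrid approach for continuous data. The Python sketch below uses a simple linear mean model and invented parameter values rather than the fitted k-power model, so the numbers are purely illustrative.

      # Hybrid-approach BMD sketch for a continuous endpoint (Hb) declining with blood lead.
      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import brentq

      mu0, sigma = 150.0, 10.0   # background Hb mean and SD (g/L), assumed
      beta = -4.0                # Hb change per micromol/L blood lead, assumed (linear, k = 1)

      p0, excess = 0.05, 0.05    # 5% abnormal among unexposed; BMD at 5% added risk
      cutoff = mu0 + norm.ppf(p0) * sigma   # 5th percentile of the unexposed Hb distribution

      def p_abnormal(dose):
          # Probability that Hb falls below the cutoff at a given blood lead level.
          return norm.cdf((cutoff - (mu0 + beta * dose)) / sigma)

      bmd = brentq(lambda d: p_abnormal(d) - (p0 + excess), 0.0, 10.0)
      print(f"cutoff = {cutoff:.1f} g/L, BMD = {bmd:.2f} micromol/L (illustrative only)")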

  11. An Effective Approach for Benchmarking Implementation

    Directory of Open Access Journals (Sweden)

    B. M. Deros

    2011-01-01

    Problem statement: The purpose of this study is to present a benchmarking guideline, conceptual framework and computerized mini program to assist companies in achieving better performance in terms of quality, cost, delivery and supply chain, and eventually increase their competitiveness in the market. The study begins with a literature review on benchmarking definitions, barriers to and advantages of implementation, and benchmarking frameworks. Approach: Thirty respondents were involved in the case study. They comprised industrial practitioners who assessed the usability and practicability of the guideline, conceptual framework and computerized mini program. Results: A guideline and template were proposed to simplify the adoption of benchmarking techniques. A conceptual framework was proposed by integrating Deming's PDCA and Six Sigma DMAIC theory. It provided a step-by-step method to simplify the implementation and to optimize the benchmarking results. A computerized mini program was suggested to assist users in adopting the technique as part of an improvement project. In the assessment test, the respondents found that the implementation method gave companies an idea of how to initiate benchmarking and guided them toward the desired goal set in a benchmarking project. Conclusion: The results obtained and discussed in this study can be applied to implement benchmarking in a more systematic way and to help ensure its success.

  12. Benchmark dose profiles for joint-action quantal data in quantitative risk assessment.

    Science.gov (United States)

    Deutsch, Roland C; Piegorsch, Walter W

    2012-12-01

    Benchmark analysis is a widely used tool in public health risk analysis. Therein, estimation of minimum exposure levels, called Benchmark Doses (BMDs), that induce a prespecified Benchmark Response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This article demonstrates how the benchmark modeling paradigm can be expanded from the single-dose setting to joint-action, two-agent studies. Focus is on response outcomes expressed as proportions. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile (BMP) - a two-dimensional analog of the single-dose BMD at which both agents achieve the specified BMR - is defined for use in quantitative risk characterization and assessment. The resulting, joint, low-dose guidelines can improve public health planning and risk regulation when dealing with low-level exposures to combinations of hazardous agents.

  13. 77 FR 36533 - Notice of Availability of the Benchmark Dose Technical Guidance

    Science.gov (United States)

    2012-06-19

    ... From the Federal Register Online via the Government Publishing Office. ENVIRONMENTAL PROTECTION AGENCY. Notice of Availability of the Benchmark Dose Technical Guidance. AGENCY: Environmental Protection... announcing the availability of the Benchmark Dose Technical Guidance (BMD). This document was developed as...

  14. Dose-response modeling : Evaluation, application, and development of procedures for benchmark dose analysis in health risk assessment of chemical substances

    OpenAIRE

    Sand, Salomon

    2005-01-01

    In this thesis, dose-response modeling and procedures for benchmark dose (BMD) analysis in health risk assessment of chemical substances have been investigated. The BMD method has been proposed as an alternative to the NOAEL (no-observed-adverse-effect-level) approach in health risk assessment of non-genotoxic agents. According to the BMD concept, a dose-response model is fitted to data and the BMD is defined as the dose causing a predetermined change in response. A lowe...

  15. Benchmarking analytical calculations of proton doses in heterogeneous matter.

    Science.gov (United States)

    Ciangaru, George; Polf, Jerimy C; Bues, Martin; Smith, Alfred R

    2005-12-01

    A proton dose computational algorithm performing an analytical superposition of infinitely narrow proton beamlets (ASPB) is introduced. The algorithm uses the standard pencil beam technique of laterally distributing the central-axis broad-beam doses according to Moliere scattering theory extended to slab-like varying-density media. The purpose of this study was to determine the accuracy of our computational tool by comparing it with experimental and Monte Carlo (MC) simulation data as benchmarks. In the tests, parallel wide beams of protons were scattered in water phantoms containing embedded air and bone materials with simple geometrical forms and spatial dimensions of a few centimeters. For homogeneous water and bone phantoms, the proton doses calculated with the ASPB algorithm were found to be very comparable to experimental and MC data. For a layered bone slab inhomogeneity in water, the comparison between our analytical calculation and the MC simulation showed reasonable agreement, even when the inhomogeneity was placed at the Bragg peak depth. There was also reasonable agreement for the parallelepiped bone block inhomogeneity placed at various depths, except for cases in which the bone was located in the region of the Bragg peak, where discrepancies were larger than 10%. When the inhomogeneity was in the form of abutting air-bone slabs, discrepancies of as much as 8% occurred in the lateral dose profiles on the air cavity side of the phantom. Additionally, the analytical depth-dose calculations disagreed with the MC calculations by up to 3% of the Bragg peak dose at the entry and midway depths in the phantom. The distal depth-dose 20%-80% fall-off widths and ranges calculated with our algorithm and the MC simulation generally agreed within 0.1 cm. The analytical lateral-dose profile calculations showed smaller (by less than 0.1 cm) 20%-80% penumbra widths and shorter fall-off tails than did those calculated by the MC simulations. Overall

  16. Evaluation of the applicability of the Benchmark approach to existing toxicological data. Framework: Chemical compounds in the working place

    OpenAIRE

    Appel MJ; Bouman HGM; Pieters MN; Slob W; Adviescentrum voor chemische arbeidsomstandigheden (ACCA) TNO; CSR

    2001-01-01

    Five chemicals used in the workplace, for which a risk assessment had already been carried out, were selected and the relevant critical studies re-analyzed by the Benchmark approach. The endpoints involved included continuous and ordinal data. Dose-response modeling could be reasonably applied to the dose-response data encountered, and Critical Effect Doses (CEDs) could be derived for almost all of the endpoints considered. The resulting benchmark dose for the study as a whole was close to the NO...

  17. Evaluation of the benchmark dose for point of departure determination for a variety of chemical classes in applied regulatory settings.

    Science.gov (United States)

    Izadi, Hoda; Grundy, Jean E; Bose, Ranjan

    2012-05-01

    Repeated-dose studies received by the New Substances Assessment and Control Bureau (NSACB) of Health Canada are used to provide hazard information toward risk calculation. These studies provide a point of departure (POD), traditionally the NOAEL or LOAEL, which is used to extrapolate the quantity of substance above which adverse effects can be expected in humans. This project explored the use of benchmark dose (BMD) modeling as an alternative to this approach for studies with few dose groups. Continuous data from oral repeated-dose studies for chemicals previously assessed by NSACB were reanalyzed using U.S. EPA benchmark dose software (BMDS) to determine the BMD and BMD 95% lower confidence limit (BMDL(05)) for each endpoint critical to NOAEL or LOAEL determination for each chemical. Endpoint-specific benchmark dose-response levels, indicative of adversity, were consistently applied. An overall BMD and BMDL(05) were calculated for each chemical using the geometric mean. The POD obtained from benchmark analysis was then compared with the traditional toxicity thresholds originally used for risk assessment. The BMD and BMDL(05) were generally higher than the NOAEL, but lower than the LOAEL. The BMDL(05) was generally constant at 57% of the BMD. The benchmark approach provided a clear advantage in health risk assessment when a LOAEL was the only POD identified, or when dose groups were widely distributed. Although the benchmark method cannot always be applied, in the selected studies with few dose groups it provided a more accurate estimate of the real no-adverse-effect level of a substance.
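
    The per-chemical summary step described here (combining endpoint-specific benchmark values by their geometric mean) is simple to reproduce. The endpoints and numbers in the Python lines below are placeholders, not NSACB data.

      # Combining endpoint-specific BMD/BMDL values into an overall per-chemical summary.
      import numpy as np

      bmd_by_endpoint  = {"ALT increase": 12.0, "body-weight change": 25.0, "kidney weight": 18.0}  # mg/kg/day, hypothetical
      bmdl_by_endpoint = {"ALT increase": 7.0,  "body-weight change": 14.0, "kidney weight": 10.0}  # mg/kg/day, hypothetical

      def geo_mean(values):
          return float(np.exp(np.mean(np.log(list(values)))))

      print(f"overall BMD  = {geo_mean(bmd_by_endpoint.values()):.1f} mg/kg/day")
      print(f"overall BMDL = {geo_mean(bmdl_by_endpoint.values()):.1f} mg/kg/day  # candidate point of departure")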

  18. Current modeling practice may lead to falsely high benchmark dose estimates.

    Science.gov (United States)

    Ringblom, Joakim; Johanson, Gunnar; Öberg, Mattias

    2014-07-01

    Benchmark dose (BMD) modeling is increasingly used as the preferred approach to define the point of departure for health risk assessment of chemicals. As data are inherently variable, there is always a risk of selecting a model whose lower confidence bound of the BMD (BMDL), contrary to expectation, exceeds the true BMD. The aim of this study was to investigate how often and under what circumstances such anomalies occur under current modeling practice. Continuous data were generated from a realistic dose-effect curve by Monte Carlo simulations using four dose groups and a set of five different dose placement scenarios, group sizes between 5 and 50 animals, and coefficients of variation of 5-15%. The BMD calculations were conducted using nested exponential models, as most BMD software uses nested approaches. "Non-protective" BMDLs (higher than the true BMD) were frequently observed, in some scenarios reaching 80%. The phenomenon was mainly related to the selection of the non-sigmoidal exponential model (Effect = a·e^(b·dose)). In conclusion, non-sigmoid models should be used with caution as they may underestimate the risk, illustrating that awareness of the model selection process and sound identification of the point of departure is vital for health risk assessment.
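
    A miniature version of the kind of simulation described can be written in a few lines. The Python sketch below generates data from an assumed saturating "true" curve, fits only the non-sigmoidal exponential model, derives a crude covariance-based BMDL, and counts how often that BMDL lands above the true BMD. The curve, BMR, group sizes and the way the BMDL is approximated are all simplifications, not the study's protocol.

      # Counting "non-protective" BMDLs (BMDL > true BMD) under a deliberately misspecified model.
      import numpy as np
      from scipy.optimize import curve_fit, brentq

      rng = np.random.default_rng(0)
      doses, n_per_group, cv, bmr = np.array([0.0, 1.0, 3.0, 10.0]), 10, 0.10, 0.05

      def true_curve(d):
          # Assumed saturating (Hill-type) "true" dose-effect curve.
          return 1.0 + 0.3 * d / (0.5 + d)

      true_bmd = brentq(lambda d: true_curve(d) - (1.0 + bmr), 0.0, 10.0)

      def expo(d, a, b):
          # Non-sigmoidal exponential model used for fitting: f(d) = a * exp(b * d)
          return a * np.exp(b * d)

      non_protective, n_sim = 0, 200
      for _ in range(n_sim):
          d = np.repeat(doses, n_per_group)
          y = rng.normal(true_curve(d), cv * true_curve(d))
          popt, pcov = curve_fit(expo, d, y, p0=[1.0, 0.05], maxfev=10000)
          draws = rng.multivariate_normal(popt, pcov, size=500)
          bmds = np.log(1.0 + bmr) / draws[draws[:, 1] > 0, 1]
          non_protective += np.percentile(bmds, 5) > true_bmd   # crude BMDL vs. true BMD

      print(f"true BMD = {true_bmd:.2f}; share of non-protective BMDLs = {non_protective / n_sim:.0%}")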

  19. A unified framework for benchmark dose estimation applied to mixed models and model averaging

    DEFF Research Database (Denmark)

    Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

    2013-01-01

    This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both...

  20. Standardizing Benchmark Dose Calculations to Improve Science-Based Decisions in Human Health Assessments

    Science.gov (United States)

    Wignall, Jessica A.; Shapiro, Andrew J.; Wright, Fred A.; Woodruff, Tracey J.; Chiu, Weihsueh A.; Guyton, Kathryn Z.

    2014-01-01

    Background: Benchmark dose (BMD) modeling computes the dose associated with a prespecified response level. While offering advantages over traditional points of departure (PODs), such as no-observed-adverse-effect-levels (NOAELs), BMD methods have lacked consistency and transparency in application, interpretation, and reporting in human health assessments of chemicals. Objectives: We aimed to apply a standardized process for conducting BMD modeling to reduce inconsistencies in model fitting and selection. Methods: We evaluated 880 dose–response data sets for 352 environmental chemicals with existing human health assessments. We calculated benchmark doses and their lower limits [10% extra risk, or change in the mean equal to 1 SD (BMD/L10/1SD)] for each chemical in a standardized way with prespecified criteria for model fit acceptance. We identified study design features associated with acceptable model fits. Results: We derived values for 255 (72%) of the chemicals. Batch-calculated BMD/L10/1SD values were significantly and highly correlated (R2 of 0.95 and 0.83, respectively, n = 42) with PODs previously used in human health assessments, with values similar to reported NOAELs. Specifically, the median ratio of BMDs10/1SD:NOAELs was 1.96, and the median ratio of BMDLs10/1SD:NOAELs was 0.89. We also observed a significant trend of increasing model viability with increasing number of dose groups. Conclusions: BMD/L10/1SD values can be calculated in a standardized way for use in health assessments on a large number of chemicals and critical effects. This facilitates the exploration of health effects across multiple studies of a given chemical, or across chemicals when comparisons are needed, providing greater transparency and efficiency than current approaches. Citation: Wignall JA, Shapiro AJ, Wright FA, Woodruff TJ, Chiu WA, Guyton KZ, Rusyn I. 2014. Standardizing benchmark dose calculations to improve science-based decisions in human health assessments. Environ Health

  1. An Experiential Approach to Benchmarking Curriculum

    Science.gov (United States)

    Grandzol, John R.; Grandzol, Christian J.

    2011-01-01

    Continuous curriculum improvement derives from a variety of perspectives, opportunities, and approaches. In this brief, we describe a process that facilitated student participation in curriculum development. We took our Supply Chain Management students to a regional conference affiliated with APICS and had them assess their knowledge readiness…

  2. Immunotoxicity of perfluorinated alkylates: calculation of benchmark doses based on serum concentrations in children

    DEFF Research Database (Denmark)

    Grandjean, Philippe; Budtz-Joergensen, Esben

    2013-01-01

    acid and 0.3 ng/mL serum for perfluorooctanoic acid at a benchmark dose response of 5%. These results are below average serum concentrations reported in recent population studies. Even lower results were obtained using logarithmic dose-response curves. Assumption of no effect below the lowest observed...

  3. Residual Generation for the Ship Benchmark Using Structural Approach

    DEFF Research Database (Denmark)

    Cocquempot, V.; Izadi-Zamanabadi, Roozbeh; Staroswiecki, M

    1998-01-01

    The prime objective of Fault-tolerant Control (FTC) systems is to handle faults and discrepancies using appropriate accommodation policies. The issue of obtaining information about various parameters and signals, which have to be monitored for fault detection purposes, becomes a rigorous task wit...... with the growing number of subsystems. The structural approach, presented in this paper, constitutes a general framework for providing information when the system becomes complex. The methodology of this approach is illustrated on the ship propulsion benchmark....

  4. Portfolio selection and asset pricing under a benchmark approach

    Science.gov (United States)

    Platen, Eckhard

    2006-10-01

    The paper presents classical and new results on portfolio optimization, as well as the fair pricing concept for derivative pricing under the benchmark approach. The growth optimal portfolio is shown to be a central object in a market model. It links asset pricing and portfolio optimization. The paper argues that the market portfolio is a proxy of the growth optimal portfolio. By choosing the drift of the discounted growth optimal portfolio as the parameter process, one obtains realistic theoretical market dynamics.

  5. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

    Benchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns.

  6. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

    Benchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns.

  7. Benchmarking pediatric cranial CT protocols using a dose tracking software system: a multicenter study

    Energy Technology Data Exchange (ETDEWEB)

    Bondt, Timo de; Parizel, Paul M. [Antwerp University Hospital and University of Antwerp, Department of Radiology, Antwerp (Belgium)]; Mulkens, Tom [H. Hart Hospital, Department of Radiology, Lier (Belgium)]; Zanca, Federica [GE Healthcare, DoseWatch, Buc (France); KU Leuven, Imaging and Pathology Department, Leuven (Belgium)]; Pyfferoen, Lotte; Casselman, Jan W. [AZ St. Jan Brugge-Oostende AV Hospital, Department of Radiology, Brugge (Belgium)]

    2017-02-15

    To benchmark regional standard practice for paediatric cranial CT procedures in terms of radiation dose and acquisition parameters. Paediatric cranial CT data were retrospectively collected during a 1-year period in 3 different hospitals of the same country. A dose tracking system was used to automatically gather information. Dose (CTDI and DLP), scan length, number of retakes and demographic data were stratified by age and clinical indication; appropriate use of child-specific protocols was assessed. In total, 296 paediatric cranial CT procedures were collected. Although the median dose of each hospital was below national and international diagnostic reference levels (DRLs) for all age categories, statistically significant (p-value < 0.001) dose differences among hospitals were observed. The hospital with the lowest dose levels showed the smallest dose variability and used age-stratified protocols for standardizing paediatric head exams. Erroneous selection of adult protocols for children still occurred, mostly in the oldest age group. Even though all hospitals complied with national and international DRLs, dose tracking and benchmarking showed that further dose optimization and standardization are possible by using age-stratified protocols for paediatric cranial CT. Moreover, having a dose tracking system revealed that adult protocols are still applied for paediatric CT, a practice that must be avoided. (orig.)

  8. BENCHMARKING UPGRADED HOTSPOT DOSE CALCULATIONS AGAINST MACCS2 RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Brotherton, Kevin

    2009-04-30

    The radiological consequence of interest for a documented safety analysis (DSA) is the centerline Total Effective Dose Equivalent (TEDE) incurred by the Maximally Exposed Offsite Individual (MOI) evaluated at the 95th percentile consequence level. An upgraded version of HotSpot (Version 2.07) has been developed with the capabilities to read site meteorological data and perform the necessary statistical calculations to determine the 95th percentile consequence result. These capabilities should allow HotSpot to join MACCS2 (Version 1.13.1) and GENII (Version 1.485) as radiological consequence toolbox codes in the Department of Energy (DOE) Safety Software Central Registry. Using the same meteorological data file, scenarios involving a one curie release of ²³⁹Pu were modeled in both HotSpot and MACCS2. Several sets of release conditions were modeled, and the results compared. In each case, input parameter specifications for each code were chosen to match one another as much as the codes would allow. The results from the two codes are in excellent agreement. Slight differences observed in results are explained by algorithm differences.

  9. Evaluation of the applicability of the Benchmark approach to existing toxicological data. Framework: Chemical compounds in the working place

    NARCIS (Netherlands)

    Appel MJ; Bouman HGM; Pieters MN; Slob W; Adviescentrum voor chemische arbeidsomstandigheden (ACCA) TNO; CSR

    2001-01-01

    Five chemicals used in the workplace, for which a risk assessment had already been carried out, were selected and the relevant critical studies re-analyzed by the Benchmark approach. The endpoints involved included continuous and ordinal data. Dose-response modeling could be reasonably applied to the

  10. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    2005-01-01

    The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking to a real software project and conclude with a glimpse at the challenges for the fu...

  11. Benchmarking novel approaches for modelling species range dynamics.

    Science.gov (United States)

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have a great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results confirm the clear merit of using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches

  12. Benchmark dose and the three Rs. Part I. Getting more information from the same number of animals.

    Science.gov (United States)

    Slob, Wout

    2014-08-01

    Evaluating dose-response data using the benchmark dose (BMD) approach rather than the no-observed-adverse-effect-level (NOAEL) approach implies a considerable step forward from the perspective of the three Rs (Replacement, Reduction, and Refinement), in particular the R of reduction: more information is obtained from the same number of animals, or, vice versa, similar information may be obtained from fewer animals. The first part of this twin paper focusses on the former, the second on the latter aspect. Regarding the former, the BMD approach provides more information from any given dose-response dataset in various ways. First, the BMDL (= BMD lower confidence bound) provides more information by its more explicit definition. Further, compared with the NOAEL approach, the BMD approach results in more statistical precision in the value of the point of departure (PoD) for deriving exposure limits. While some of the animals in a study do not directly contribute to the numerical value of a NOAEL, all animals are effectively used and contribute to a BMDL. In addition, the BMD approach allows for combining similar datasets for the same chemical (e.g., both sexes) in a single analysis, which further increases precision. By combining a dose-response dataset with similar historical data for other chemicals, the precision can even be substantially increased. Further, the BMD approach results in more precise estimates of relative potency factors (RPFs, or TEFs). And finally, the BMD approach is not only more precise, it also allows for quantification of the precision of the BMD estimate, which is not possible in the NOAEL approach.

  13. Uncertainties in Monte Carlo-based absorbed dose calculations for an experimental benchmark.

    Science.gov (United States)

    Renner, F; Wulff, J; Kapsch, R-P; Zink, K

    2015-10-01

    There is a need to verify the accuracy of general purpose Monte Carlo codes like EGSnrc, which are commonly employed for investigations of dosimetric problems in radiation therapy. A number of experimental benchmarks have been published to compare calculated values of absorbed dose to experimentally determined values. However, there is a lack of absolute benchmarks, i.e., benchmarks without normalization, which may cause some quantities to cancel. Therefore, at the Physikalisch-Technische Bundesanstalt a benchmark experiment was performed, which aimed at the absolute verification of radiation transport calculations for dosimetry in radiation therapy. A thimble-type ionization chamber in a solid phantom was irradiated by high-energy bremsstrahlung and the mean absorbed dose in the sensitive volume was measured per electron incident on the target. The characteristics of the accelerator and experimental setup were precisely determined and the results of a corresponding Monte Carlo simulation with EGSnrc are presented within this study. For a meaningful comparison, an analysis of the uncertainty of the Monte Carlo simulation is necessary. In this study uncertainties with regard to the simulation geometry, the radiation source, transport options of the Monte Carlo code and specific interaction cross sections are investigated, applying the general methodology of the Guide to the Expression of Uncertainty in Measurement. Besides studying the general influence of changes in transport options of the EGSnrc code, uncertainties are analyzed by estimating the sensitivity coefficients of various input quantities in a first step. Secondly, standard uncertainties are assigned to each quantity that is known from the experiment, e.g., uncertainties in geometric dimensions. Data for more fundamental quantities such as photon cross sections and the I-value of electron stopping powers are taken from the literature. The significant uncertainty contributions are identified as

  14. Concordance of transcriptional and apical benchmark dose levels for conazole-induced liver effects in mice.

    Science.gov (United States)

    Bhat, Virunya S; Hester, Susan D; Nesnow, Stephen; Eastmond, David A

    2013-11-01

    The ability to anchor chemical class-based gene expression changes to phenotypic lesions and to describe these changes as a function of dose and time informs mode-of-action determinations and improves quantitative risk assessments. Previous global expression profiling identified a 330-probe cluster differentially expressed and commonly responsive to 3 hepatotumorigenic conazoles (cyproconazole, epoxiconazole, and propiconazole) at 30 days. Extended to 2 more conazoles (triadimefon and myclobutanil), the present assessment encompasses 4 tumorigenic and 1 nontumorigenic conazole. Transcriptional benchmark dose levels (BMDL(T)) were estimated for a subset of the cluster with dose-responsive behavior and a ≥ 5-fold increase or decrease in signal intensity at the highest dose. These genes primarily encompassed CAR/RXR activation, P450 metabolism, liver hypertrophy-glutathione depletion, LPS/IL-1-mediated inhibition of RXR, and NRF2-mediated oxidative stress pathways. Median BMDL(T) estimates from the subset were concordant (within a factor of 2.4) with apical benchmark doses (BMDL(A)) for increased liver weight at 30 days for the 5 conazoles. The 30-day median BMDL(T) estimates were within one-half order of magnitude of the chronic BMDL(A) for hepatocellular tumors. Potency differences seen in the dose-responsive transcription of certain phase II metabolism, bile acid detoxification, and lipid oxidation genes mirrored each conazole's tumorigenic potency. The 30-day BMDL(T) corresponded to tumorigenic potency on a milligram per kilogram per day basis with cyproconazole > epoxiconazole > propiconazole > triadimefon > myclobutanil (nontumorigenic). These results support the utility of measuring short-term gene expression changes to inform quantitative risk assessments from long-term exposures.

  15. Mutual Fund Style, Characteristic-Matched Performance Benchmarks and Activity Measures: A New Approach

    OpenAIRE

    Daniel Buncic; Jon E. Eggins; Robert J. Hill

    2010-01-01

    We propose a new approach for measuring mutual fund style and constructing characteristic-matched performance benchmarks that requires only portfolio holdings and two reference portfolios in each style dimension. The characteristic-matched performance benchmark literature typically follows a bottom-up approach by first matching individual stocks with benchmarks and then obtaining a portfolio’s excess return as a weighted average of the excess returns on each of its constituent stocks. Our app...

  16. Development of a chronic noncancer oral reference dose and drinking water screening level for sulfolane using benchmark dose modeling.

    Science.gov (United States)

    Thompson, Chad M; Gaylor, David W; Tachovsky, J Andrew; Perry, Camarie; Carakostas, Michael C; Haws, Laurie C

    2013-12-01

    Sulfolane is a widely used industrial solvent that is often used for gas treatment (sour gas sweetening; hydrogen sulfide removal from shale and coal processes, etc.), and in the manufacture of polymers and electronics, and may be found in pharmaceuticals as a residual solvent used in the manufacturing processes. Sulfolane is considered a high production volume chemical with worldwide production around 18 000-36 000 tons per year. Given that sulfolane has been detected as a contaminant in groundwater, an important potential route of exposure is tap water ingestion. Because there are currently no federal drinking water standards for sulfolane in the USA, we developed a noncancer oral reference dose (RfD) based on benchmark dose modeling, as well as a tap water screening value that is protective of ingestion. Review of the available literature suggests that sulfolane is not likely to be mutagenic, clastogenic or carcinogenic, or pose reproductive or developmental health risks except perhaps at very high exposure concentrations. RfD values derived using benchmark dose modeling were 0.01-0.04 mg kg(-1) per day, although modeling of developmental endpoints resulted in higher values, approximately 0.4 mg kg(-1) per day. The lowest, most conservative, RfD of 0.01 mg kg(-1) per day was based on reduced white blood cell counts in female rats. This RfD was used to develop a tap water screening level that is protective of ingestion, viz. 365 µg l(-1). It is anticipated that these values, along with the hazard identification and dose-response modeling described herein, should be informative for risk assessors and regulators interested in setting health-protective drinking water guideline values for sulfolane.
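
    Converting an oral RfD into a tap water screening level is simple arithmetic: multiply the RfD by a body weight and divide by a daily drinking-water intake. The exposure factors in the Python lines below are assumed defaults that happen to land near the reported 365 µg/L; the abstract does not state which factors the authors actually used.

      # Rough arithmetic behind a drinking-water screening level (exposure factors are assumptions).
      rfd_mg_per_kg_day = 0.01   # chronic oral RfD reported in the abstract
      body_weight_kg    = 73.0   # assumed adult body weight
      water_intake_l    = 2.0    # assumed daily drinking-water intake

      screening_ug_per_l = rfd_mg_per_kg_day * body_weight_kg / water_intake_l * 1000.0
      print(f"screening level ≈ {screening_ug_per_l:.0f} µg/L")   # ≈ 365 µg/L with these inputs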

  17. Ensemble approach to predict specificity determinants: benchmarking and validation

    Directory of Open Access Journals (Sweden)

    Panchenko Anna R

    2009-07-01

    Background: It is extremely important and challenging to identify the sites that are responsible for functional specification or diversification in protein families. In this study, a rigorous comparative benchmarking protocol was employed to provide a reliable evaluation of methods which predict specificity-determining sites. Subsequently, the three best-performing methods were applied to identify new potential specificity-determining sites through an ensemble approach and common agreement of their prediction results. Results: It was shown that the analysis of structural characteristics of predicted specificity-determining sites might provide the means to validate their prediction accuracy. For example, we found that for smaller distances it holds true that the more reliable the prediction method is, the closer predicted specificity-determining sites are to each other and to the ligand. Conclusion: We observed certain similarities of structural features between predicted and actual subsites which might point to their functional relevance. We speculate that the majority of the identified potential specificity-determining sites might be indirectly involved in specific interactions and could be ideal targets for mutagenesis experiments.

  18. Quality Assurance Testing of Version 1.3 of U.S. EPA Benchmark Dose Software (Presentation)

    Science.gov (United States)

    EPA benchmark dose software (BMDS) is used to evaluate chemical dose-response data in support of Agency risk assessments, and must therefore be dependable. Quality assurance testing methods developed for BMDS were designed to assess model dependability with respect to curve-fitt...

  19. Benchmarking of computer codes and approaches for modeling exposure scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R.R. [EG and G Idaho, Inc., Idaho Falls, ID (United States)]; Rittmann, P.D.; Wood, M.I. [Westinghouse Hanford Co., Richland, WA (United States)]; Cook, J.R. [Westinghouse Savannah River Co., Aiken, SC (United States)]

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided.

  20. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Science.gov (United States)

    Shao, Kan; Gift, Jeffrey S; Setzer, R Woodrow

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD), which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose-response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the "hybrid" method and relative deviation approach, we first evaluate six representative continuous dose-response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has an influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for the relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates.
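
    One practical point here is that a log-normal analysis of summarized data (mean ± SD) requires back-calculating log-scale parameters from the reported arithmetic summaries, which is why the BMD can only be approximated. The Python sketch below shows that conversion and a relative-deviation BMD computed under both treatments of a small hypothetical summary dataset; the data, model and BMR are assumptions for illustration.

      # Relative-deviation BMD from summarized data under normal vs. log-normal assumptions.
      import numpy as np
      from scipy.optimize import curve_fit

      doses = np.array([0.0, 5.0, 25.0, 100.0])
      mean  = np.array([300.0, 290.0, 265.0, 220.0])   # e.g. body weight (g), hypothetical
      sd    = np.array([25.0, 26.0, 28.0, 30.0])       # reported group SDs, hypothetical

      # Approximate log-scale location from the summary statistics (log-normal assumption):
      # the geometric mean equals the arithmetic mean divided by sqrt(1 + CV^2).
      cv2      = (sd / mean) ** 2
      geo_mean = mean / np.sqrt(1.0 + cv2)

      def expo_decay(d, a, b):
          return a * np.exp(-b * d)

      bmr = 0.05   # BMD at a 5% relative deviation from the control mean
      for label, y in (("normal (arithmetic means)", mean), ("log-normal (geometric means)", geo_mean)):
          (a, b), _ = curve_fit(expo_decay, doses, y, p0=[300.0, 0.003])
          print(f"{label:29s}: BMD05 = {-np.log(1.0 - bmr) / b:.1f}")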

  1. A Consumer's Guide to Benchmark Dose Models: Results of U.S. EPA Testing of 14 Dichotomous, 8 Continuous, and 6 Developmental Models (Presentation)

    Science.gov (United States)

    Benchmark dose risk assessment software (BMDS) was designed by EPA to generate dose-response curves and facilitate the analysis, interpretation and synthesis of toxicological data. Partial results of QA/QC testing of the EPA benchmark dose software (BMDS) are presented. BMDS pr...

  2. Benchmarking the Degree of Implementation of Learner-Centered Approaches

    Science.gov (United States)

    Blumberg, Phyllis; Pontiggia, Laura

    2011-01-01

    We describe an objective way to measure whether curricula, educational programs, and institutions are learner-centered. This technique for benchmarking learner-centeredness uses rubrics to measure courses on 29 components within Weimer's five dimensions. We converted the scores on the rubrics to four-point indices and constructed histograms that…

  3. Correlation of In Vivo Versus In Vitro Benchmark Doses (BMDs) Derived From Micronucleus Test Data: A Proof of Concept Study.

    Science.gov (United States)

    Soeteman-Hernández, Lya G; Fellows, Mick D; Johnson, George E; Slob, Wout

    2015-12-01

    In this study, we explored the applicability of using in vitro micronucleus (MN) data from human lymphoblastoid TK6 cells to derive in vivo genotoxicity potency information. Nineteen chemicals covering a broad spectrum of genotoxic modes of action were tested in an in vitro MN test using TK6 cells using the same study protocol. Several of these chemicals were considered to need metabolic activation, and these were administered in the presence of S9. The Benchmark dose (BMD) approach was applied using the dose-response modeling program PROAST to estimate the genotoxic potency from the in vitro data. The resulting in vitro BMDs were compared with previously derived BMDs from in vivo MN and carcinogenicity studies. A proportional correlation was observed between the BMDs from the in vitro MN and the BMDs from the in vivo MN assays. Further, a clear correlation was found between the BMDs from in vitro MN and the associated BMDs for malignant tumors. Although these results are based on only 19 compounds, they show that genotoxicity potencies estimated from in vitro tests may result in useful information regarding in vivo genotoxic potency, as well as expected cancer potency. Extension of the number of compounds and further investigation of metabolic activation (S9) and of other toxicokinetic factors would be needed to validate our initial conclusions. However, this initial work suggests that this approach could be used for in vitro to in vivo extrapolations which would support the reduction of animals used in research (3Rs: replacement, reduction, and refinement).

  4. International benchmarking of tertiary trauma centers: productivity and throughput approach

    Directory of Open Access Journals (Sweden)

    Matthes Gerrit

    2011-08-01

    were the most comparable parts of trauma care between the hospitals. The study also showed that the international benchmarking approach could be used to reveal bottlenecks in system-level policies and practices.

  5. Observer-based FDI for Gain Fault Detection in Ship Propulsion Benchmark:a Geometric Approach

    OpenAIRE

    Lootsma, T.F.; Izadi-Zamanabadi, Roozbeh; Nijmeijer, H.

    2001-01-01

    A geometric approach for input-affine nonlinear systems is briefly described and then applied to a ship propulsion benchmark. The obtained results are used to design a diagnostic nonlinear observer for successful FDI of the diesel engine gain fault

  6. Benchmarking Gas Path Diagnostic Methods: A Public Approach

    Science.gov (United States)

    Simon, Donald L.; Bird, Jeff; Davison, Craig; Volponi, Al; Iverson, R. Eugene

    2008-01-01

    Recent technology reviews have identified the need for objective assessments of engine health management (EHM) technology. The need is two-fold: technology developers require relevant data and problems to design and validate new algorithms and techniques while engine system integrators and operators need practical tools to direct development and then evaluate the effectiveness of proposed solutions. This paper presents a publicly available gas path diagnostic benchmark problem that has been developed by the Propulsion and Power Systems Panel of The Technical Cooperation Program (TTCP) to help address these needs. The problem is coded in MATLAB (The MathWorks, Inc.) and coupled with a non-linear turbofan engine simulation to produce "snap-shot" measurements, with relevant noise levels, as if collected from a fleet of engines over their lifetime of use. Each engine within the fleet will experience unique operating and deterioration profiles, and may encounter randomly occurring relevant gas path faults including sensor, actuator and component faults. The challenge to the EHM community is to develop gas path diagnostic algorithms to reliably perform fault detection and isolation. An example solution to the benchmark problem is provided along with associated evaluation metrics. A plan is presented to disseminate this benchmark problem to the engine health management technical community and invite technology solutions.

  7. Application of Benchmark Dose (BMD) in Estimating Biological Exposure Limit (BEL) to Cadmium

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Objective: To estimate the biological exposure limit (BEL) using the benchmark dose (BMD) based on two sets of data from occupational epidemiology. Methods: Cadmium-exposed workers were selected from a cadmium smelting factory and a zinc product factory. Doctors, nurses or shop assistants living in the same area served as a control group. Urinary cadmium (UCd) was used as an exposure biomarker and urinary β2-microglobulin (B2M), N-acetyl-β-D-glucosaminidase (NAG) and albumin (ALB) as effect biomarkers. All urine parameters were adjusted for urinary creatinine. The BMDS software (Version 1.3.2, U.S. EPA) was used to calculate BMDs. Results: The cut-off point (abnormal value) was determined as the upper 95% limit of each effect biomarker in the control group. There was a significant dose-response relationship between the effect biomarkers (urinary B2M, NAG, and ALB) and the exposure biomarker (UCd). The BEL value was 5 μg/g creatinine with UB2M as the effect biomarker, consistent with the recommendation of the WHO, and 3 μg/g creatinine with UNAG as the effect biomarker; thus the BEL could be estimated using the BMD method. The more sensitive the biomarker used, the larger the share of the occupational population that will be protected. Conclusion: The BMD can be used to estimate the biological exposure limit (BEL). UNAG is a sensitive biomarker for estimating the BEL after cadmium exposure.
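
    The cut-off-and-BMD procedure described (abnormality defined by the upper 95% limit of the control group, then a dose-response of abnormality prevalence versus the exposure biomarker) can be mimicked on synthetic data. The Python sketch below uses invented distributions, a plain logistic model and a 10% extra-risk BMR, so it only illustrates the mechanics and reproduces none of the study's values.

      # Cut-off-based BMD sketch on synthetic biomarker data (all numbers are made up).
      import numpy as np
      from scipy.optimize import minimize, brentq
      from scipy.special import expit

      rng = np.random.default_rng(2)

      # Synthetic exposed group: urinary cadmium (UCd) and an NAG-like effect biomarker.
      ucd = rng.lognormal(mean=1.0, sigma=0.8, size=400)
      nag = rng.lognormal(mean=1.5 + 0.25 * np.log1p(ucd), sigma=0.4)

      # Cut-off = upper 95% limit of the effect biomarker in a synthetic control group.
      controls = rng.lognormal(mean=1.5, sigma=0.4, size=200)
      cutoff = np.percentile(controls, 95)
      abnormal = (nag > cutoff).astype(float)

      # Logistic dose-response of abnormality versus log UCd, fitted by maximum likelihood.
      def negloglik(theta):
          p = np.clip(expit(theta[0] + theta[1] * np.log(ucd)), 1e-9, 1.0 - 1e-9)
          return -np.sum(abnormal * np.log(p) + (1.0 - abnormal) * np.log(1.0 - p))

      theta = minimize(negloglik, x0=[-2.0, 1.0], method="Nelder-Mead").x

      # BMD: exposure at which the extra risk of abnormality over background reaches 10%.
      background = expit(theta[0] + theta[1] * np.log(ucd.min()))
      def extra_risk(d):
          return (expit(theta[0] + theta[1] * np.log(d)) - background) / (1.0 - background)

      bmd = brentq(lambda d: extra_risk(d) - 0.10, ucd.min() * 1.001, ucd.max())
      print(f"cut-off = {cutoff:.1f}, BMD10 ≈ {bmd:.2f} µg/g creatinine (illustrative only)")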

  8. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)]; Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)]; Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)]

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  9. Benchmarking B-cell epitope prediction with quantitative dose-response data on antipeptide antibodies: towards novel pharmaceutical product development.

    Science.gov (United States)

    Caoili, Salvador Eugenio C

    2014-01-01

    B-cell epitope prediction can enable novel pharmaceutical product development. However, a mechanistically framed consensus has yet to emerge on benchmarking such prediction, thus presenting an opportunity to establish standards of practice that circumvent epistemic inconsistencies of casting the epitope prediction task as a binary-classification problem. As an alternative to conventional dichotomous qualitative benchmark data, quantitative dose-response data on antibody-mediated biological effects are more meaningful from an information-theoretic perspective in the sense that such effects may be expressed as probabilities (e.g., of functional inhibition by antibody) for which the Shannon information entropy (SIE) can be evaluated as a measure of informativeness. Accordingly, half-maximal biological effects (e.g., at median inhibitory concentrations of antibody) correspond to maximally informative data while undetectable and maximal biological effects correspond to minimally informative data. This applies to benchmarking B-cell epitope prediction for the design of peptide-based immunogens that elicit antipeptide antibodies with functionally relevant cross-reactivity. Presently, the Immune Epitope Database (IEDB) contains relatively few quantitative dose-response data on such cross-reactivity. Only a small fraction of these IEDB data is maximally informative, and many more of them are minimally informative (i.e., with zero SIE). Nevertheless, the numerous qualitative data in IEDB suggest how to overcome the paucity of informative benchmark data.
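
    The information-theoretic point can be made concrete in a few lines: treating an antibody-mediated effect as a Bernoulli outcome with probability p, the binary Shannon entropy peaks at half-maximal effects and vanishes for undetectable or maximal effects. This is only a sketch of the general idea, not code from the cited work.

```python
# Minimal sketch: informativeness of a dose-response probability.
import math

def binary_entropy(p):
    """Shannon information entropy (bits) of a Bernoulli outcome with probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

for p in (0.0, 0.05, 0.25, 0.5, 0.75, 0.95, 1.0):
    print(f"effect probability {p:.2f} -> entropy {binary_entropy(p):.3f} bits")
# Half-maximal inhibition (p = 0.5) gives the maximum of 1 bit; undetectable or
# maximal effects give 0 bits, matching the benchmarking argument in the abstract.
```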

  10. Correlation of In Vivo Versus In Vitro Benchmark Doses (BMDs) Derived From Micronucleus Test Data: A Proof of Concept Study

    OpenAIRE

    2015-01-01

    In this study, we explored the applicability of using in vitro micronucleus (MN) data from human lymphoblastoid TK6 cells to derive in vivo genotoxicity potency information. Nineteen chemicals covering a broad spectrum of genotoxic modes of action were tested in an in vitro MN test in TK6 cells, all under the same study protocol. Several of these chemicals were considered to need metabolic activation, and these were administered in the presence of S9. The benchmark dose (BMD) approach was appli...

  11. Comparative Benchmark Dose Modeling as a Tool to Make the First Estimate of Safe Human Exposure Levels to Lunar Dust

    Science.gov (United States)

    James, John T.; Lam, Chiu-wing; Scully, Robert R.

    2013-01-01

    Brief exposures of Apollo Astronauts to lunar dust occasionally elicited upper respiratory irritation; however, no limits were ever set for prolonged exposure to lunar dust. Habitats for exploration, whether mobile or fixed, must be designed to limit human exposure to lunar dust to safe levels. We have used a new technique we call Comparative Benchmark Dose Modeling to estimate safe exposure limits for lunar dust collected during the Apollo 14 mission.

  12. Multiple exposures to indoor contaminants: Derivation of benchmark doses and relative potency factors based on male reprotoxic effects.

    Science.gov (United States)

    Fournier, K; Tebby, C; Zeman, F; Glorennec, P; Zmirou-Navier, D; Bonvallot, N

    2016-02-01

    Semi-Volatile Organic Compounds (SVOCs) are commonly present in dwellings and several are suspected of having effects on male reproductive function mediated by an endocrine disruption mode of action. To improve knowledge of the health impact of these compounds, cumulative toxicity indicators are needed. This work derives Benchmark Doses (BMD) and Relative Potency Factors (RPF) for SVOCs acting on the male reproductive system through the same mode of action. We included SVOCs fulfilling the following conditions: detection frequency (>10%) in French dwellings, availability of data on the mechanism/mode of action for male reproductive toxicity, and availability of comparable dose-response relationships. Of 58 SVOCs selected, 18 induce a decrease in serum testosterone levels. Six have sufficient and comparable data to derive BMDs based on 10 or 50% of the response. The SVOCs inducing the largest decrease in serum testosterone concentration are: for 10%, bisphenol A (BMD10 = 7.72E-07 mg/kg bw/d; RPF10 = 7,033,679); for 50%, benzo[a]pyrene (BMD50 = 0.030 mg/kg bw/d; RPF50 = 1630), and the one inducing the smallest decrease is benzyl butyl phthalate (RPF10 and RPF50 = 0.095). This approach encompasses contaminants from diverse chemical families acting through similar modes of action, and makes possible a cumulative risk assessment in indoor environments. The main limitation remains the lack of comparable toxicological data.
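
    A minimal sketch of the relative-potency bookkeeping is given below: each RPF is the ratio of the index compound's BMD to the compound's own BMD, so that co-exposures can be summed in index-compound equivalents. The compound names and BMD10 values are placeholders, not the values derived in the study.

```python
# Illustrative relative potency factors (RPFs) from benchmark doses.
bmd10 = {  # mg/kg bw/d, hypothetical values
    "index_compound": 5.0,
    "compound_A": 0.001,
    "compound_B": 0.5,
    "compound_C": 50.0,
}

index_bmd = bmd10["index_compound"]
rpf = {name: index_bmd / value for name, value in bmd10.items()}

for name, value in sorted(rpf.items(), key=lambda kv: -kv[1]):
    print(f"{name:15s} BMD10 = {bmd10[name]:8.4f}  RPF10 = {value:10.3f}")
# A cumulative dose can then be expressed in index-compound equivalents as
# sum(RPF_i * dose_i) over the co-occurring compounds.
```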

  13. Quality assurance and benchmarking: an approach for European dental schools.

    Science.gov (United States)

    Jones, M L; Hobson, R S; Plasschaert, A J M; Gundersen, S; Dummer, P; Roger-Leroi, V; Sidlauskas, A; Hamlin, J

    2007-08-01

    This document was written by Task Force 3 of DentEd III, which is a European Union-funded Thematic Network working under the auspices of the Association for Dental Education in Europe (ADEE). It provides a guide to assist in the harmonisation of Dental Education Quality Assurance (QA) systems across the European Higher Education Area (EHEA). There is reference to the work, thus far, of DentEd, DentEd Evolves, DentEd III and the ADEE as they strive to assist the convergence of standards in dental education; obviously QA and benchmarking have an important part to play in the European HE response to the Bologna Process. Definitions of Quality, Quality Assurance, Quality Management and Quality Improvement are given and put into the context of dental education. The possible process and framework for Quality Assurance are outlined and some basic guidelines/recommendations suggested. It is recognised that Quality Assurance in Dental Schools has to co-exist as part of established Quality Assurance systems within faculties and universities, and that Schools also may have to comply with existing local or national systems. Perhaps of greatest importance are the 14 'requirements' for the Quality Assurance of Dental Education in Europe. These, together with the document and its appendices, were unanimously supported by the ADEE at its General Assembly in 2006. As there must be more than one road to achieve a convergence or harmonisation standard, a number of appendices are made available on the ADEE website. These provide a series of 'toolkits' from which schools can 'pick and choose' to assist them in developing QA systems appropriate to their own environment. Validated contributions and examples continue to be most welcome from all members of the European dental community for inclusion at this website. It is realised that not all schools will be able to achieve all of these requirements immediately; by definition, successful harmonisation is a process that will take time. At

  14. A new methodology for building energy benchmarking: An approach based on clustering concept and statistical models

    Science.gov (United States)

    Gao, Xuefeng

    Though many building energy benchmarking programs have been developed during the past decades, they hold certain limitations. The major concern is that they may cause misleading benchmarking due to not fully considering the impacts of the multiple features of buildings on energy performance. The existing methods classify buildings according to only one of many features of buildings -- the use type, which may result in a comparison between two buildings that are tremendously different in other features and therefore not properly comparable. This research aims to tackle this challenge by proposing a new methodology based on the clustering concept and statistical analysis. The clustering concept, which draws on machine learning algorithms, classifies buildings based on a multi-dimensional domain of building features, rather than the single dimension of use type. Buildings with the greatest similarity of features that influence energy performance are classified into the same cluster, and benchmarked according to the centroid reference of the cluster. Statistical analysis is applied to find the most influential features impacting building energy performance, as well as provide prediction models for the new design energy consumption. The proposed methodology, applicable to both existing-building benchmarking and new-design benchmarking, is discussed in this dissertation. The former contains four steps: feature selection, clustering algorithm adaptation, results validation, and interpretation. The latter consists of three parts: data observation, inverse modeling, and forward modeling. The experimentation and validation were carried out for both perspectives. It was shown that the proposed methodology could account for the total building energy performance and was able to provide a more comprehensive approach to benchmarking. In addition, the multi-dimensional clustering concept enables energy benchmarking among different types of buildings, and inspires a new
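
    A small sketch of the clustering idea follows, under assumed feature names and synthetic data (scikit-learn is used purely for illustration; the dissertation's own algorithms and features may differ).

```python
# Hedged sketch: cluster buildings on several features at once, then benchmark
# each building's energy use intensity (EUI) against its own cluster reference.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# columns: floor area (m2), occupancy hours/week, vintage (year), EUI (kWh/m2/yr)
X = rng.normal(loc=[5000, 60, 1990, 180], scale=[2000, 15, 15, 40], size=(200, 4))

features = X[:, :3]                       # features used for clustering
eui = X[:, 3]                             # quantity being benchmarked

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(features))

# Benchmark: compare each building's EUI with the mean EUI of its cluster.
for k in range(4):
    members = labels == k
    print(f"cluster {k}: {members.sum():3d} buildings, "
          f"reference EUI = {eui[members].mean():6.1f} kWh/m2/yr")

building = 17
ref = eui[labels == labels[building]].mean()
print(f"building {building}: EUI {eui[building]:.1f} vs cluster reference {ref:.1f}")
```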

  15. Determining Optimal Crude Oil Price Benchmark in Nigeria: An Empirical Approach

    Directory of Open Access Journals (Sweden)

    Saibu Olufemi Muibi

    2015-12-01

    This paper contributes to the ongoing empirical search for an appropriate crude oil price benchmark that ensures greater financial stability and efficient fiscal management in Nigeria. It adopted seasonally adjusted ARIMA forecasting models using monthly data series from 2000m01 to 2012m12 to predict future movements in Nigerian crude oil prices. The paper derived a more robust and dynamic framework that accommodates fluctuations in the crude oil price and also in government spending. The result shows that if the incessant withdrawals from the ECA fund and the increasing debt profile of government in recent times are factored into the benchmark, the real crude oil numerical fiscal rule is US$82.3 for 2013, which is higher than the official benchmark of $75 used for the 2013 and 2014 budget proposals. The paper argues that the current long-run price rule, based on the 5-10 year moving average approach adopted by government, is rigid and inflexible as a rule for managing Nigerian oil funds. The unrealistic assumptions of the extant benchmark accounted for excessive depletion and lack of accountability of the excess crude oil account. The paper concludes that unless the federal government curtails its spending profligacy and adopts more stringent fiscal discipline rules, the current benchmark is unrealistic and unsuitable for fiscal management of oil revenue in the context of the Nigerian economic spending profile.
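
    A hedged sketch of the forecasting step is shown below, using a seasonal ARIMA (SARIMAX) model from statsmodels on a synthetic monthly price series; the model order, the synthetic data, and the idea of taking a lower forecast bound as a prudent benchmark are illustrative assumptions, not the paper's exact specification.

```python
# Illustrative seasonally adjusted ARIMA forecast for a benchmark price rule.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
idx = pd.date_range("2000-01-01", "2012-12-01", freq="MS")
price = pd.Series(60 + np.cumsum(rng.normal(0.3, 4, len(idx))), index=idx)

model = SARIMAX(price, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
res = model.fit(disp=False)

forecast = res.get_forecast(steps=12)
mean_path = forecast.predicted_mean
lower = forecast.conf_int(alpha=0.10).iloc[:, 0]   # 5% lower bound, a cautious rule

print("forecast mean for next year :", round(mean_path.mean(), 1))
print("prudent (lower-bound) rule  :", round(lower.mean(), 1))
```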

  16. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...

  17. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W.

    1993-01-01

    Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed where concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red

  18. Coral growth on three reefs: development of recovery benchmarks using a space for time approach

    Science.gov (United States)

    Done, T. J.; Devantier, L. M.; Turak, E.; Fisk, D. A.; Wakeford, M.; van Woesik, R.

    2010-12-01

    This 14-year study (1989-2003) develops recovery benchmarks based on a period of very strong coral recovery in Acropora-dominated assemblages on the Great Barrier Reef (GBR) following major setbacks from the predatory sea-star Acanthaster planci in the early 1980s. A space for time approach was used in developing the benchmarks, made possible by the choice of three study reefs (Green Island, Feather Reef and Rib Reef), spread along 3 degrees of latitude (300 km) of the GBR. The sea-star outbreaks progressed north to south, causing death of corals that reached maximum levels in the years 1980 (Green), 1982 (Feather) and 1984 (Rib). The reefs were initially surveyed in 1989, 1990, 1993 and 1994, which represent recovery years 5-14 in the space for time protocol. Benchmark trajectories for coral abundance, colony sizes, coral cover and diversity were plotted against nominal recovery time (years 5-14) and defined as non-linear functions. A single survey of the same three reefs was conducted in 2003, when the reefs were nominally 1, 3 and 5 years into a second recovery period, following further Acanthaster impacts and coincident coral bleaching events around the turn of the century. The 2003 coral cover was marginally above the benchmark trajectory, but colony density (colonies.m-2) was an order of magnitude lower than the benchmark, and size structure was biased toward larger colonies that survived the turn of the century disturbances. The under-representation of small size classes in 2003 suggests that mass recruitment of corals had been suppressed, reflecting low regional coral abundance and depression of coral fecundity by recent bleaching events. The marginally higher cover and large colonies of 2003 were thus indicative of a depleted and aging assemblage not yet rejuvenated by a strong cohort of recruits.

  19. Polychlorinated biphenyls as oxidative stress inducers in liver of subacutely exposed rats: implication for dose-dependence toxicity and benchmark dose concept.

    Science.gov (United States)

    Buha, Aleksandra; Antonijević, Biljana; Milovanović, Vesna; Janković, Saša; Bulat, Zorica; Matović, Vesna

    2015-01-01

    Hepatotoxicity is one of the well-documented adverse health effects of polychlorinated biphenyls (PCBs), persistent organic pollutants widely present in the environment. Although previous studies suggest a possible role of oxidative stress, the precise mechanisms of PCB-induced ROS production in liver still remain to be fully assessed. The aim of this study was to evaluate the effects of different doses of PCBs on the parameters of oxidative stress and to investigate whether these effects are dose dependent. Furthermore, a comparison was made between calculated benchmark doses (BMDs) and estimated NOAEL values for the investigated parameters. Six groups of male albino Wistar rats (7 animals per group) received Aroclor 1254 dissolved in corn oil at doses of 0.5, 1, 2, 4, 8 and 16 mg PCBs/kg b.w./day by oral gavage for 28 days, while control animals received corn oil only. The following parameters of oxidative stress were analyzed in liver homogenates: superoxide dismutase activity, glutathione, malondialdehyde (MDA) and total protein thiol levels. Hepatic enzymes AST, ALT, ALP and protein albumin were also determined in serum as clinical parameters of liver function. Collected data on the investigated parameters were analyzed by the BMD method. The results of this study demonstrate that subacute exposure to PCBs causes induction of oxidative stress in liver with dose-dependent changes of the investigated parameters, although more pronounced adverse effects were observed on enzymatic than on non-enzymatic components of antioxidant protection. The obtained values for BMD and NOAEL support the use of the BMD concept in the prediction of health risks associated with PCB exposure. Furthermore, our results implicate the possible use of MDA in PCB risk assessment, since MDA was the most sensitive investigated parameter, with a calculated critical effect dose as low as 0.07 mg/kg b.w.

  20. Benchmark measurements and simulations of dose perturbations due to metallic spheres in proton beams

    Science.gov (United States)

    Newhauser, Wayne D.; Rechner, Laura; Mirkovic, Dragan; Yepes, Pablo; Koch, Nicholas C.; Titt, Uwe; Fontenot, Jonas D.; Zhang, Rui

    2014-01-01

    Monte Carlo simulations are increasingly used for dose calculations in proton therapy due to their inherent accuracy. However, dosimetric deviations have been found with Monte Carlo codes when high-density materials are present in the proton beam line. The purpose of this work was to quantify the magnitude of dose perturbation caused by metal objects. We did this by comparing measurements and Monte Carlo predictions of dose perturbations caused by the presence of small metal spheres in several clinical proton therapy beams as functions of proton beam range, spread-out Bragg peak width and drift space. The Monte Carlo codes MCNPX, GEANT4 and Fast Dose Calculator (FDC) were used. Generally good agreement was found between measurements and Monte Carlo predictions, with the average difference within 5% and maximum difference within 17%. Modification of the multiple Coulomb scattering model in the MCNPX code improved accuracy and provided the best overall agreement with measurements. Our results confirmed that Monte Carlo codes are well suited for predicting multiple Coulomb scattering in proton therapy beams when short drift spaces are involved. PMID:25147474

  1. A Comparative Benchmark Dose Study for N, N-Dimethylformamide Induced Liver Injury in a Chinese Occupational Cohort.

    Science.gov (United States)

    Wu, Zhijun; Liu, Qiang; Wang, Chunmin; Xu, Bo; Guan, Mingyue; Ye, Meng; Jiang, Hai; Zheng, Min; Zhang, Man; Zhao, Wenjin; Jiang, Xiao; Leng, Shuguang; Cheng, Juan

    2017-07-01

    Widespread contamination of N,N-dimethylformamide (DMF) has been identified in the environment of leather industries and their surrounding residential areas. Few studies have assessed the dose-response relationships between internal exposure biomarkers and liver injury in DMF-exposed populations. We assessed urinary N-methylformamide (NMF) and N-acetyl-S-(N-methylcarbamoyl) cysteine (AMCC) and blood N-methylcarbamoylated hemoglobin (NMHb) levels in 698 Chinese DMF-exposed workers and 188 non-DMF-exposed workers using ultraperformance liquid-chromatography tandem mass-spectrometry. Liver injury was defined as having abnormal serum activities of any of the 3 liver enzymes, including alanine aminotransferase, aspartate aminotransferase, and γ-glutamyl transpeptidase. Higher liver injury rates were identified in DMF-exposed workers versus non-DMF-exposed workers (9.17% vs 4.26%, P = .029) and in male versus female workers (11.4% vs 3.2%, P < .001). Positive correlations between environmental exposure categories and internal biomarker levels were identified, with all 3 biomarkers undetectable in non-DMF-exposed workers. The lower confidence limit of the benchmark dose (BMDL) was estimated using the benchmark dose (BMD) method. Within all study subjects, BMDLs of 14.0 mg/l for NMF, 155 mg/l for AMCC, and 93.3 nmol/g for NMHb were estimated based on dose-response relationships between internal levels and liver injury rates. Among male workers, BMDLs of 10.9 mg/l for NMF, 119 mg/l for AMCC, and 97.0 nmol/g for NMHb were estimated. In conclusion, NMF, AMCC, and NMHb are specific and reliable biomarkers and correlate well with DMF-induced hepatotoxicity. NMF correlates the best with liver injury, while NMHb may be the most stable indicator. Males have a greater risk of liver injury than females upon DMF exposure.
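
    The dose-response step can be sketched as follows for grouped quantal data (injury yes/no per biomarker category), with a logistic model in log-biomarker and a bootstrap lower bound standing in for a formal BMDL procedure. The exposure levels, counts, and modelling choices below are assumptions for illustration, not the cohort data or the models used in the study.

```python
# Illustrative BMD/BMDL for a dichotomous outcome versus an internal biomarker.
import numpy as np
from scipy.special import expit, logit
from scipy.optimize import minimize

biomarker = np.array([5.0, 20.0, 60.0, 150.0, 400.0])   # group means, hypothetical units
n = np.array([150, 150, 150, 150, 98])
cases = np.array([5, 8, 14, 25, 30])                     # liver injury counts (hypothetical)

def fit_logistic(y, m, x):
    def nll(p):
        prob = np.clip(expit(p[0] + p[1] * np.log(x)), 1e-9, 1 - 1e-9)
        return -np.sum(y * np.log(prob) + (m - y) * np.log(1 - prob))
    return minimize(nll, x0=[-3.0, 0.5], method="Nelder-Mead").x

def bmd10(params, x0=biomarker[0]):
    """Level giving 10% extra risk over the response at the lowest exposure group."""
    a, b = params
    p0 = expit(a + b * np.log(x0))
    target = p0 + 0.10 * (1 - p0)
    return np.exp((logit(target) - a) / b)

point = bmd10(fit_logistic(cases, n, biomarker))

rng = np.random.default_rng(7)
boot = []
for _ in range(500):
    y_star = rng.binomial(n, cases / n)                  # simple group-level bootstrap
    boot.append(bmd10(fit_logistic(y_star, n, biomarker)))
bmdl = np.percentile(boot, 5)                            # one-sided 95% lower bound

print(f"BMD10  ~ {point:.0f} (illustrative units)")
print(f"BMDL10 ~ {bmdl:.0f} (bootstrap 5th percentile)")
```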

  2. Benchmarking studies for the DESCARTES and CIDER codes. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, P.W.; Ouderkirk, S.J.; Nichols, W.E.

    1993-01-01

    The Hanford Environmental Dose Reconstruction (HEDR) project is developing several computer codes to model the airborne release, transport, and environmental accumulation of radionuclides resulting from Hanford operations from 1944 through 1972. In order to calculate the dose of radiation a person may have received in any given location, the geographic area addressed by the HEDR Project will be divided into a grid. The grid size suggested by the draft requirements contains 2091 units called nodes. Two of the codes being developed are DESCARTES and CIDER. The DESCARTES code will be used to estimate the concentration of radionuclides in environmental pathways from the output of the air transport code RATCHET. The CIDER code will use information provided by DESCARTES to estimate the dose received by an individual. The requirements that Battelle (BNW) set for these two codes were released to the HEDR Technical Steering Panel (TSP) in a draft document on November 10, 1992. This document reports on the preliminary work performed by the code development team to determine if the requirements could be met.

  3. An International Pooled Analysis for Obtaining a Benchmark Dose for Environmental Lead Exposure in Children

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Bellinger, David; Lanphear, Bruce;

    2013-01-01

    Lead is a recognized neurotoxicant, but estimating effects at the lowest measurable levels is difficult. An international pooled analysis of data from seven cohort studies reported an inverse and supra-linear relationship between blood lead concentrations and IQ scores in children. The lack...... yielding lower confidence limits (BMDLs) of about 0.1-1.0 for the dose leading to a loss of one IQ point. We conclude that current allowable blood lead concentrations need to be lowered and further prevention efforts are needed to protect children from lead toxicity....

  4. Benchmarking inverse statistical approaches for protein structure and design with exactly solvable models

    CERN Document Server

    Jacquin, Hugo; Shakhnovich, Eugene; Cocco, Simona; Monasson, Rémi

    2016-01-01

    Inverse statistical approaches to determine protein structure and function from Multiple Sequence Alignments (MSA) are emerging as powerful tools in computational biology. However, the underlying assumptions of the relationship between the inferred effective Potts Hamiltonian and real protein structure and energetics remain untested so far. Here we use the lattice protein model (LP) to benchmark those inverse statistical approaches. We build MSA of highly stable sequences in target LP structures, and infer the effective pairwise Potts Hamiltonians from those MSA. We find that inferred Potts Hamiltonians reproduce many important aspects of 'true' LP structures and energetics. Careful analysis reveals that effective pairwise couplings in inferred Potts Hamiltonians depend not only on the energetics of the native structure but also on competing folds; in particular, the coupling values reflect both positive design (stabilization of native conformation) and negative design (destabilization of competing folds). In addi...

  5. Estimation of benchmark dose as the threshold amount of alcohol consumption for blood pressure in Japanese workers.

    Science.gov (United States)

    Suwazono, Yasushi; Sakata, Kouichi; Oishi, Mitsuhiro; Okubo, Yasushi; Dochi, Mirei; Kobayashi, Etsuko; Kido, Teruhiko; Nogawa, Koji

    2007-12-01

    In order to determine the threshold amount of alcohol consumption for blood pressure, we calculated the benchmark dose (BMD) of alcohol consumption and its 95% lower confidence limit (BMDL) in Japanese workers. The subjects consisted of 4,383 males and 387 females in a Japanese steel company. The target variables were systolic, diastolic, and mean arterial pressures. The effects of other potential covariates such as age and body mass index were adjusted by including these covariates in the multiple linear regression models. In male workers, the BMD/BMDL values for alcohol consumption (g/week), at which the probability of an adverse response was estimated to increase by 5% relative to no alcohol consumption, were 396/315 (systolic blood pressure), 321/265 (diastolic blood pressure), and 326/269 (mean arterial pressure). These values were based on significant regression coefficients of alcohol consumption. In female workers, the BMD/BMDL values for alcohol consumption, based on non-significant regression coefficients, were 693/134 (systolic blood pressure), 199/90 (diastolic blood pressure), and 267/77 (mean arterial pressure). Therefore, BMDs/BMDLs in males were more informative than those in females, as there was no significant relationship between alcohol and blood pressure in females. The threshold amount of alcohol consumption determined in this study provides valuable information for preventing alcohol-induced hypertension.
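
    A simplified sketch of the hybrid idea for a continuous endpoint follows: blood pressure is regressed on alcohol consumption, an adverse cut-off is placed at the 95th percentile of the unexposed distribution, and the BMD is the consumption giving 5% extra risk of exceeding that cut-off. The synthetic data, the omission of covariates, and the cut-off choice are illustrative assumptions only.

```python
# Illustrative "hybrid" BMD for a continuous endpoint (systolic blood pressure).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
alcohol = rng.uniform(0, 700, 2000)                     # g/week (synthetic)
sbp = 120 + 0.01 * alcohol + rng.normal(0, 12, 2000)    # mmHg (synthetic)

# Ordinary least squares by hand: intercept, slope, residual SD.
X = np.column_stack([np.ones_like(alcohol), alcohol])
beta, *_ = np.linalg.lstsq(X, sbp, rcond=None)
resid_sd = np.std(sbp - X @ beta, ddof=2)

cutoff = norm.ppf(0.95, loc=beta[0], scale=resid_sd)    # 95th percentile at zero intake
p0 = 1 - norm.cdf(cutoff, loc=beta[0], scale=resid_sd)  # background risk (5%)

def extra_risk(dose):
    p = 1 - norm.cdf(cutoff, loc=beta[0] + beta[1] * dose, scale=resid_sd)
    return (p - p0) / (1 - p0)

# Dose giving 5% extra risk, found by bisection.
lo, hi = 0.0, 2000.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if extra_risk(mid) < 0.05 else (lo, mid)
print(f"hybrid BMD (5% extra risk) ~ {0.5 * (lo + hi):.0f} g alcohol/week (synthetic)")
```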

  6. Benchmark study of UV/Visible spectra of coumarin derivatives by computational approach

    Science.gov (United States)

    Irfan, Muhammad; Iqbal, Javed; Eliasson, Bertil; Ayub, Khurshid; Rana, Usman Ali; Ud-Din Khan, Salah

    2017-02-01

    A benchmark study of the UV/Visible spectra of simple coumarin and furanocoumarin derivatives was conducted by employing the Density Functional Theory (DFT) and Time-Dependent Density Functional Theory (TD-DFT) approaches. In this study the geometries of the ground and excited states, excitation energies and absorption spectra were estimated using the DFT functionals CAM-B3LYP, WB97XD, HSEH1PBE and MPW1PW91 and TD-B3LYP with the 6-31+G(d,p) basis set. The CAM-B3LYP functional was found to be in close agreement with the experimental values for the furanocoumarin class, while MPW1PW91 gave close results for simple coumarins. This study provided insight into the electronic characteristics of the selected compounds and an effective tool for developing and designing better UV-absorbing compounds.

  7. A practical approach to determine dose metrics for nanomaterials.

    Science.gov (United States)

    Delmaar, Christiaan J E; Peijnenburg, Willie J G M; Oomen, Agnes G; Chen, Jingwen; de Jong, Wim H; Sips, Adriënne J A M; Wang, Zhuang; Park, Margriet V D Z

    2015-05-01

    Traditionally, administered mass is used to describe doses of conventional chemical substances in toxicity studies. For deriving toxic doses of nanomaterials, mass and chemical composition alone may not adequately describe the dose, because particles with the same chemical composition can have completely different toxic mass doses depending on properties such as particle size. Other dose metrics such as particle number, volume, or surface area have been suggested, but consensus is lacking. The discussion regarding the most adequate dose metric for nanomaterials clearly needs a systematic, unbiased approach. In the present study, the authors propose such an approach and apply it to results from in vitro and in vivo experiments with silver and silica nanomaterials. The proposed approach is shown to provide a convenient tool to systematically investigate and interpret dose metrics of nanomaterials. Recommendations for study designs aimed at investigating dose metrics are provided.
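
    The core issue can be illustrated with a simple unit conversion: for spherical, monodisperse particles, one mass dose maps to very different particle-number and surface-area doses depending on diameter. The density value and particle sizes below are arbitrary example inputs, not values from the study.

```python
# Why the dose metric matters: same mass, different number and surface area.
import math

def dose_metrics(mass_ug, diameter_nm, density_g_cm3=10.5):
    """Convert a mass dose into particle number and total surface area
    (spherical, monodisperse particles assumed)."""
    r_cm = diameter_nm * 1e-7 / 2.0
    particle_mass_g = density_g_cm3 * (4.0 / 3.0) * math.pi * r_cm ** 3
    n_particles = (mass_ug * 1e-6) / particle_mass_g
    surface_cm2 = n_particles * 4.0 * math.pi * r_cm ** 2
    return n_particles, surface_cm2

for d in (20, 100):
    n, area = dose_metrics(mass_ug=10.0, diameter_nm=d)
    print(f"10 ug of {d:3d} nm particles: {n:.2e} particles, {area:.3f} cm2 surface")
```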

  8. Benchmarking Inverse Statistical Approaches for Protein Structure and Design with Exactly Solvable Models.

    Directory of Open Access Journals (Sweden)

    Hugo Jacquin

    2016-05-01

    Inverse statistical approaches to determine protein structure and function from Multiple Sequence Alignments (MSA) are emerging as powerful tools in computational biology. However, the underlying assumptions of the relationship between the inferred effective Potts Hamiltonian and real protein structure and energetics remain untested so far. Here we use the lattice protein model (LP) to benchmark those inverse statistical approaches. We build MSA of highly stable sequences in target LP structures, and infer the effective pairwise Potts Hamiltonians from those MSA. We find that inferred Potts Hamiltonians reproduce many important aspects of 'true' LP structures and energetics. Careful analysis reveals that effective pairwise couplings in inferred Potts Hamiltonians depend not only on the energetics of the native structure but also on competing folds; in particular, the coupling values reflect both positive design (stabilization of native conformation) and negative design (destabilization of competing folds). In addition to providing detailed structural information, the inferred Potts models used as protein Hamiltonian for design of new sequences are able to generate with high probability completely new sequences with the desired folds, which is not possible using independent-site models. Those are remarkable results as the effective LP Hamiltonians used to generate MSA are not simple pairwise models due to the competition between the folds. Our findings elucidate the reasons for the success of inverse approaches to the modelling of proteins from sequence data, and their limitations.

  9. Evaluation of the applicability of the Benchmark approach to existing toxicological data. Framework: Chemical compounds in the working place

    OpenAIRE

    Appel MJ; Bouman HGM; Pieters MN; Slob W; CSR

    2001-01-01

    Five substances in the working environment for which risk evaluations were available were selected for analysis with the benchmark approach. The critical studies were analysed for each of these substances. The toxicological parameters examined comprised both continuous and ordinal data. It was found that dose-response modelling could reasonably be applied to the available data. Critical Effect Doses (CEDs) were derived for almost all ...

  10. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    Science.gov (United States)

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

    Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data on low-dose hepatocarcinogenicity studies for three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations that occurred during the initiation stage of carcinogenesis. For the establishment of points of departure (PoDs) from which the cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed with low doses. Moreover, treatment with DEN at low doses had no effect on development of GST-P positive foci in the liver. These data on PoDs for the markers contribute to understanding whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to use in low dose-response assessment must be decided on the basis of scientific judgment.

  11. Benchmarking Density Functional Theory Approaches for the Description of Symmetry-Breaking in Long Polymethine Dyes

    KAUST Repository

    Gieseking, Rebecca L.

    2016-04-25

    Long polymethines are well-known experimentally to symmetry-break, which dramatically modifies their linear and nonlinear optical properties. Computational modeling could be very useful to provide insight into the symmetry-breaking process, which is not readily available experimentally; however, accurately predicting the crossover point from symmetric to symmetry-broken structures has proven challenging. Here, we benchmark the accuracy of several DFT approaches relative to CCSD(T) geometries. In particular, we compare analogous hybrid and long-range corrected (LRC) functionals to clearly show the influence of the functional exchange term. Although both hybrid and LRC functionals can be tuned to reproduce the CCSD(T) geometries, the LRC functionals perform better at reproducing the geometry evolution with chain length and provide a finite upper limit for the gas-phase crossover point; these methods also provide good agreement with the experimental crossover points for more complex polymethines in polar solvents. Using an approach based on LRC functionals, a reduction in the crossover length is found with increasing medium dielectric constant, which is related to localization of the excess charge on the end groups. Symmetry-breaking is associated with the appearance of an imaginary frequency of b2 symmetry involving a large change in the degree of bond-length alternation. Examination of the IR spectra shows that short, isolated streptocyanines have a mode at ~1200 cm-1 involving a large change in bond-length alternation; as the polymethine length or the medium dielectric increases, the frequency of this mode decreases before becoming imaginary at the crossover point.

  12. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    In the face of global competition and the rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess an institution's competitive position and learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating premises of using benchmarking in HEIs. It also contains a detailed examination of the types, approaches and scope of benchmarking initiatives. This thorough insight into benchmarking applications enabled the development of a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects and relates them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The presented examples were chosen in order to exemplify different approaches to benchmarking in the higher education setting. The study was performed on the basis of published reports from benchmarking projects, the scientific literature and the experience of the author from active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.

  13. Modeling Dose-response at Low Dose: A Systems Biology Approach for Ionization Radiation.

    Science.gov (United States)

    Zhao, Yuchao; Ricci, Paolo F

    2010-03-18

    For ionization radiation (IR)-induced cancer, a linear non-threshold (LNT) model at very low doses is the default used by a number of national and international organizations and in regulatory law. This default denies any positive benefit from any level of exposure. However, experimental observations and theoretical biology have found that both linear and J-shaped IR dose-response curves can exist at those very low doses. We develop a low-dose J-shaped dose-response model based on systems biology, and thus justify its use for exposure to IR. This approach incorporates detailed molecular and cellular descriptions of biological/toxicological mechanisms to develop a dose-response model through a set of nonlinear differential equations describing the signaling pathways and biochemical mechanisms of cell cycle checkpoint, apoptosis, and tumor incidence due to IR. This approach yields a J-shaped dose-response curve while showing where LNT behaviors are likely to occur. The results confirm the hypothesis of the J-shaped dose-response curve: the main reason is that, at low doses of IR, cells stimulate protective systems through a longer cell arrest time per unit of IR dose. We suggest that this approach offers a more correct basis for precautionary measures in public health.
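
    The qualitative shape can be illustrated without the full ODE system: a heavily simplified sketch in which a linear damage term competes with a saturating, inducible protection term reproduces a J-shaped net response at low doses and LNT-like behaviour at higher doses. All functional forms and parameters here are illustrative assumptions, not the signaling-pathway model described in the paper.

```python
# Toy comparison of an LNT response with a J-shaped (hormetic) net response.
import numpy as np

def lnt_risk(dose, slope=1.0):
    """Linear no-threshold response: risk proportional to dose."""
    return slope * dose

def j_shaped_risk(dose, slope=1.0, protection=0.8, half_sat=0.2):
    """Linear damage minus a saturating protective term; negative values mean a
    net response below the zero-dose background (the dip of the J)."""
    protective = protection * dose / (half_sat + dose)
    return lnt_risk(dose, slope) - protective

for d in np.linspace(0.0, 2.0, 9):
    print(f"dose {d:4.2f}  LNT {lnt_risk(d):5.2f}  J-shaped {j_shaped_risk(d):6.2f}")
# At low doses the protective term dominates and the net response dips below
# the zero-dose background; at higher doses the linear damage term takes over.
```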

  14. Benchmarking of computational approaches for fast screening of lithium ion battery electrolyte solvents

    Science.gov (United States)

    Kim, Daejin; Guk, Hyein; Choi, Seung-Hoon; Chung, Dong Hyen

    2017-08-01

    Electrolyte solvents play an important role in lithium-ion batteries. Hence, investigation of the solvent is key to improving battery functionality. We performed benchmark calculations to suggest the best conditions for rapid screening of electrolyte candidates using semi-empirical (SEM) calculations and density functional theory (DFT). A wide selection of Hamiltonians, DFT levels, and basis sets were used for this benchmarking with typical electrolyte solvents. The most efficient condition for reducing computational costs and time is VWN/DNP+ for DFT levels and PM3 for SEM Hamiltonians.

  15. A novel approach for establishing benchmark CBCT/CT deformable image registrations in prostate cancer radiotherapy.

    Science.gov (United States)

    Kim, Jinkoo; Kumar, Sanath; Liu, Chang; Zhong, Hualiang; Pradhan, Deepak; Shah, Mira; Cattaneo, Richard; Yechieli, Raphael; Robbins, Jared R; Elshaikh, Mohamed A; Chetty, Indrin J

    2013-11-21

    Deformable image registration (DIR) is an integral component for adaptive radiation therapy. However, accurate registration between daily cone-beam computed tomography (CBCT) and treatment planning CT is challenging, due to significant daily variations in rectal and bladder fillings as well as the increased noise levels in CBCT images. Another significant challenge is the lack of 'ground-truth' registrations in the clinical setting, which is necessary for quantitative evaluation of various registration algorithms. The aim of this study is to establish benchmark registrations of clinical patient data. Three pairs of CT/CBCT datasets were chosen for this institutional review board approved retrospective study. On each image, in order to reduce the contouring uncertainty, ten independent sets of organs were manually delineated by five physicians. The mean contour set for each image was derived from the ten contours. A set of distinctive points (round natural calcifications and three implanted prostate fiducial markers) were also manually identified. The mean contours and point features were then incorporated as constraints into a B-spline based DIR algorithm. Further, a rigidity penalty was imposed on the femurs and pelvic bones to preserve their rigidity. A piecewise-rigid registration approach was adapted to account for the differences in femur pose and the sliding motion between bones. For each registration, the magnitude of the spatial Jacobian (|JAC|) was calculated to quantify the tissue compression and expansion. Deformation grids and finite-element-model-based unbalanced energy maps were also reviewed visually to evaluate the physical soundness of the resultant deformations. Organ DICE indices (indicating the degree of overlap between registered organs) and residual misalignments of the fiducial landmarks were quantified. Manual organ delineation on CBCT images varied significantly among physicians with overall mean DICE index of only 0.7 among redundant

  16. A novel approach for establishing benchmark CBCT/CT deformable image registrations in prostate cancer radiotherapy

    Science.gov (United States)

    Kim, Jinkoo; Kumar, Sanath; Liu, Chang; Zhong, Hualiang; Pradhan, Deepak; Shah, Mira; Cattaneo, Richard; Yechieli, Raphael; Robbins, Jared R.; Elshaikh, Mohamed A.; Chetty, Indrin J.

    2013-11-01

    Deformable image registration (DIR) is an integral component for adaptive radiation therapy. However, accurate registration between daily cone-beam computed tomography (CBCT) and treatment planning CT is challenging, due to significant daily variations in rectal and bladder fillings as well as the increased noise levels in CBCT images. Another significant challenge is the lack of ‘ground-truth’ registrations in the clinical setting, which is necessary for quantitative evaluation of various registration algorithms. The aim of this study is to establish benchmark registrations of clinical patient data. Three pairs of CT/CBCT datasets were chosen for this institutional review board approved retrospective study. On each image, in order to reduce the contouring uncertainty, ten independent sets of organs were manually delineated by five physicians. The mean contour set for each image was derived from the ten contours. A set of distinctive points (round natural calcifications and three implanted prostate fiducial markers) were also manually identified. The mean contours and point features were then incorporated as constraints into a B-spline based DIR algorithm. Further, a rigidity penalty was imposed on the femurs and pelvic bones to preserve their rigidity. A piecewise-rigid registration approach was adapted to account for the differences in femur pose and the sliding motion between bones. For each registration, the magnitude of the spatial Jacobian (|JAC|) was calculated to quantify the tissue compression and expansion. Deformation grids and finite-element-model-based unbalanced energy maps were also reviewed visually to evaluate the physical soundness of the resultant deformations. Organ DICE indices (indicating the degree of overlap between registered organs) and residual misalignments of the fiducial landmarks were quantified. Manual organ delineation on CBCT images varied significantly among physicians with overall mean DICE index of only 0.7 among redundant
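
    One of the evaluation quantities, the magnitude of the spatial Jacobian |JAC| of the deformation, can be sketched on a toy 2D displacement field as below; values below 1 indicate local compression and above 1 local expansion. The field and grid are synthetic, and a real evaluation would use the 3D B-spline displacement field produced by the registration.

```python
# Toy 2D example of computing the Jacobian magnitude of a deformation field.
import numpy as np

ny, nx = 64, 64
y, x = np.meshgrid(np.arange(ny, dtype=float), np.arange(nx, dtype=float), indexing="ij")

# Synthetic displacement field (in voxels): two smooth bumps.
ux = 2.0 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)
uy = -1.0 * np.exp(-((x - 20) ** 2 + (y - 40) ** 2) / 300.0)

# Deformation = identity + displacement; Jacobian via finite differences.
dux_dy, dux_dx = np.gradient(ux)
duy_dy, duy_dx = np.gradient(uy)
jac = (1.0 + dux_dx) * (1.0 + duy_dy) - dux_dy * duy_dx

print(f"|JAC| range: {jac.min():.3f} to {jac.max():.3f}")
print(f"fraction of voxels compressed (|JAC| < 1): {(jac < 1).mean():.2%}")
```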

  17. New Approach to Total Dose Specification for Spacecraft Electronics

    Science.gov (United States)

    Xapsos, Michael

    2017-01-01

    Variability of the space radiation environment is investigated with regard to total dose specification for spacecraft electronics. It is shown to have a significant impact. A new approach is developed for total dose requirements that replaces the radiation design margin concept with failure probability during a mission.

  18. An approach to radiation safety department benchmarking in academic and medical facilities.

    Science.gov (United States)

    Harvey, Richard P

    2015-02-01

    Based on anecdotal evidence and networking with colleagues at other facilities, it has become evident that some radiation safety departments are not adequately staffed and radiation safety professionals need to increase their staffing levels. Discussions with management regarding radiation safety department staffing often lead to similar conclusions. Management acknowledges the Radiation Safety Officer (RSO) or Director of Radiation Safety's concern but asks the RSO to provide benchmarking and justification for additional full-time equivalents (FTEs). The RSO must determine a method to benchmark and justify additional staffing needs while struggling to maintain a safe and compliant radiation safety program. Benchmarking and justification are extremely important tools that are commonly used to demonstrate the need for increased staffing in other disciplines and are tools that can be used by radiation safety professionals. Parameters that most RSOs would expect to be positive predictors of radiation safety staff size generally are and can be emphasized in benchmarking and justification report summaries. Facilities with large radiation safety departments tend to have large numbers of authorized users, be broad-scope programs, be subject to increased controls regulations, have large clinical operations, have significant numbers of academic radiation-producing machines, and have laser safety responsibilities.

  19. Appraisement and benchmarking of third-party logistic service provider by exploration of risk-based approach

    Directory of Open Access Journals (Sweden)

    Nitin Kumar Sahu

    2015-12-01

    In the present era, reverse logistics support has emerged as a momentous realm in which goods are transferred from the point of consumption back to their origin. The companies that provide logistics equipment, i.e. trucks, JCB machinery, shipment services, etc., to partner firms are called third-party logistics (3PL) service providers. Today, the selection of a feasible 3PL service provider remains a difficult problem. The appraisement and benchmarking of logistics service providers in terms of an index of allied risk-based indices and their interrelated metrics is viewed as a valuable tool for any international firm seeking to attain its core goals. The novelty of the manuscript is that a fuzzy-based approach has been integrated with, and implemented upon, a newly developed multi-hierarchical appraisement index for third-party logistics (3PL) service providers in order to evaluate providers on their strong and weak core indices. Moreover, an overall score (Si) system has also been applied for benchmarking the 3PL provider companies, with s1 found to be the best 3PL service provider. The developed approach enables firm managers to reach a verdict on the best overall evaluation process for 3PL performance appraisement and benchmarking. A numerical illustration is also provided to validate the decision support system.

  20. Benchmarking of the construct of dimensionless correlations regarding batch bubble columns with suspended solids: Performance of the Pressure Transform Approach

    CERN Document Server

    Hristov, Jordan

    2010-01-01

    Benchmarking of dimensionless data correlations pertinent to batch bubble columns (BC) with suspended solids has been performed by the pressure transform approach (PTA). The main efforts have addressed the correct definition of dimensionless groups, reflecting the fact that the solids dynamics and the bubble dynamics have different velocity and length scales. The correct definition of the initial set of variables in classical dimensional analysis depends mainly on the experience of the investigator, while the pressure transform approach (PTA) avoids errors at this initial stage. PTA addresses the physics of the phenomena occurring in complex systems involving many phases and allows straightforward definitions of dimensionless numbers.

  1. Estimating the need for palliative radiotherapy for brain metastasis: a benchmarking approach.

    Science.gov (United States)

    Kong, W; Jarvis, C; Mackillop, W J

    2015-02-01

    Palliative radiotherapy (PRT) is useful in the management of many patients with brain metastases, but the need for this treatment in the general cancer population is unknown. The objective of this study was to estimate the appropriate rate of use of PRT for brain metastases (PRT.Br). Ontario's population-based cancer registry was used to identify patients who died of cancer. Radiotherapy records from all the province's radiotherapy centres were linked to Ontario's cancer registry to identify patients who received PRT.Br in the last 2 years of life. Multivariate analysis was used to identify social and health system-related barriers to the use of PRT.Br and to identify a subpopulation of patients with unimpeded access to PRT.Br. The rate of use of PRT.Br was measured in this benchmark subpopulation. The benchmark rate was standardised to the case mix of the overall cancer population. The study population included 231,397 patients who died of cancer in Ontario between 1998 and 2007. Overall, 13,944 patients received at least one course of PRT.Br in the last 2 years of life (6.0%). Multivariate analysis showed that the use of PRT.Br was strongly associated with: the availability of radiotherapy at the diagnosing hospital; the socioeconomic status of the community where the patient lived; and the distance from his/her home to the nearest radiotherapy centre. The benchmark subpopulation was defined as patients diagnosed in a hospital with radiotherapy facilities on site and who resided in a high income community, within 50 km of the nearest radiotherapy centre. The standardised benchmark rate of PRT.Br was 8.0% (95% confidence interval 7.5%, 8.5%). The overall shortfall between the actual rate and the benchmark was 25%, but varied by primary cancer site: lung, 27.6%; melanoma, 19.4%; breast, 13.9%. The magnitude of the shortfall in the use of PRT.Br varied widely across the province. At least 8.0% of patients who die of cancer require PRT.Br at least once in the last 2
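
    The benchmarking arithmetic itself is simple and can be sketched as a direct standardisation: site-specific rates observed in the unimpeded-access subpopulation are weighted by the case mix of the whole population, and the shortfall is the relative gap to the actual rate. All counts and rates below are invented for illustration, not the Ontario figures.

```python
# Illustrative direct standardisation of a benchmark treatment rate to a case mix.
cases_by_site = {"lung": 90000, "breast": 40000, "melanoma": 8000, "other": 93000}
benchmark_rate_by_site = {"lung": 0.11, "breast": 0.07, "melanoma": 0.10, "other": 0.05}
actual_rate_by_site = {"lung": 0.08, "breast": 0.06, "melanoma": 0.08, "other": 0.045}

total = sum(cases_by_site.values())
benchmark = sum(cases_by_site[s] * benchmark_rate_by_site[s] for s in cases_by_site) / total
actual = sum(cases_by_site[s] * actual_rate_by_site[s] for s in cases_by_site) / total
shortfall = (benchmark - actual) / benchmark

print(f"standardised benchmark rate: {benchmark:.1%}")
print(f"actual rate:                 {actual:.1%}")
print(f"relative shortfall:          {shortfall:.1%}")
```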

  2. Assessment of the municipal solid waste management system in Accra, Ghana: A 'Wasteaware' benchmark indicator approach.

    Science.gov (United States)

    Oduro-Appiah, Kwaku; Scheinberg, Anne; Mensah, Anthony; Afful, Abraham; Boadu, Henry Kofi; de Vries, Nanne

    2017-09-01

    This article assesses the performance of the city of Accra, Ghana, in municipal solid waste management as defined by the integrated sustainable waste management framework. The article reports on a participatory process to socialise the Wasteaware benchmark indicators and apply them to an upgraded set of data and information. The process has engaged 24 key stakeholders for 9 months, to diagram the flow of materials and benchmark three physical components and three governance aspects of the city's municipal solid waste management system. The results indicate that Accra is well below some other lower middle-income cities regarding sustainable modernisation of solid waste services. Collection coverage and capture of 75% and 53%, respectively, are a disappointing result, despite (or perhaps because of) 20 years of formal private sector involvement in service delivery. A total of 62% of municipal solid waste continues to be disposed of in controlled landfills and the reported recycling rate of 5% indicates both a lack of good measurement and a lack of interest in diverting waste from disposal. Drains, illegal dumps and beaches are choked with discarded bottles and plastic packaging. The quality of collection, disposal and recycling score between low and medium on the Wasteaware indicators, and the scores for user inclusivity, financial sustainability and local institutional coherence are low. The analysis suggests that waste and recycling would improve through greater provider inclusivity, especially the recognition and integration of the informal sector, and interventions that respond to user needs for more inclusive decision-making.

  3. Comparison of measured and calculated spatial dose distributions for a bench-mark 106Ru/106Rh hot particle source.

    Science.gov (United States)

    Aydarous, A Sh; Charles, M W; Darley, P J

    2008-01-01

    This study is a part of a programme of research to provide validated dose measurement and calculation techniques for beta-emitting hot particles by the construction of well-defined model hot particle sources. This enables parallel measurements and calculations to be critically compared. This particular study concentrates on the high-energy beta emitter, (106)Ru/(106)Rh (Emax = 3.54 MeV). This source is a common constituent of failed nuclear fuel, particularly in accident situations. The depth dose distributions were measured using radiochromic dye film (RDF), an imaging photon detector coupled to an LiF thermoluminescent dosemeter (LiF-IPD), and an extrapolation ionisation chamber (ECH). Dose calculations were performed using the Monte Carlo radiation transport code MCNP4C. Doses were measured and calculated as average values over various areas and depths. Of particular interest are the doses at depths of 7 and 30-50 mg cm(-2), and averaged over an area of 1 cm2, as recommended by the International Commission on Radiological Protection for use in routine and accidental over-exposures of the skin. In this case, the average ratios (MCNP/measurement) for RDF, ECH and LiF-IPD were 1.07 +/- 0.02, 1.02 +/- 0.01 and 0.83 +/- 0.16, respectively. There are significantly greater discrepancies between the ECH and LiF-IPD measurement techniques and calculations, particularly for shallow depths and small averaging areas.

  4. Benchmarking of the dose planning method (DPM) Monte Carlo code using electron beams from a racetrack microtron.

    Science.gov (United States)

    Chetty, Indrin J; Moran, Jean M; McShan, Daniel L; Fraass, Benedick A; Wilderman, Scott J; Bielajew, Alex F

    2002-06-01

    A comprehensive set of measurements and calculations has been conducted to investigate the accuracy of the Dose Planning Method (DPM) Monte Carlo code for dose calculations from 10 and 50 MeV scanned electron beams produced from a racetrack microtron. Central axis depth dose measurements and a series of profile scans at various depths were acquired in a water phantom using a Scanditronix type RK ion chamber. Source spatial distributions for the Monte Carlo calculations were reconstructed from in-air ion chamber measurements carried out across the two-dimensional beam profile at 100 cm downstream from the source. The in-air spatial distributions were found to have full width at half maximum of 4.7 and 1.3 cm, at 100 cm from the source, for the 10 and 50 MeV beams, respectively. Energy spectra for the 10 and 50 MeV beams were determined by simulating the components of the microtron treatment head using the code MCNP4B. DPM calculations are on average within +/- 2% agreement with measurement for all depth dose and profile comparisons conducted in this study. The accuracy of the DPM code illustrated in this work suggests that DPM may be used as a valuable tool for electron beam dose calculations.

  5. Wavelet approach for analysis of neutronic power using data of Ringhals stability benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa-Paredes, Gilberto [Division de Ciencias Basicas e Ingenieria, Universidad Autonoma Metropolitana-Iztapalapa, Av. San Rafael Atlixco, 186, Col. Vicentina, 09340 Mexico D.F. (Mexico)]. E-mail: gepe@xanum.uam.mx; Nunez-Carrera, Alejandro [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Mexico D.F. (Mexico); Prieto-Guerrero, Alfonso [Division de Ciencias Basicas e Ingenieria, Universidad Autonoma Metropolitana-Iztapalapa, Av. San Rafael Atlixco, 186, Col. Vicentina, 09340 Mexico D.F. (Mexico); Cecenas, Miguel [Instituto de Investigaciones Electricas, Av. Reforma 113, Col. Palmira, 62490 Cuernavaca, Morelos (Mexico)

    2007-05-15

    We have studied neutronic power oscillation in a boiling water nuclear reactor for three different scenarios of the Ringhals stability benchmark with a proposed wavelets-based method: the first scenario is a stable operating state which was considered as a base case in this study, and the last two correspond to unstable operating conditions of in-phase and out-of-phase events. The results obtained with the methodology presented here suggest that a wavelet-based method can help the understanding and monitoring of the power dynamics in boiling water nuclear reactors. The stability parameters frequency and decay ratio were calculated as a function of time, based on the theory of wavelet ridges. This method allows us to analyze both stationary and highly non-stationary signals. The resonant frequencies of the oscillation are consistent with previous measurements or calculated values.
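
    A minimal sketch of how the stability parameters mentioned above can be extracted from a neutronic power signal. This is not the wavelet-ridge method of the paper; it is the conventional autocorrelation-based estimate of decay ratio and resonance frequency that such methods are usually compared against, demonstrated on a synthetic AR(2) signal (all numbers are assumptions for illustration).

        import numpy as np

        def decay_ratio_and_frequency(signal, dt):
            """Estimate decay ratio (DR) and resonance frequency from the
            autocorrelation function (ACF): DR is the ratio of the second to the
            first positive ACF peak, the frequency comes from the first peak lag."""
            x = np.asarray(signal, dtype=float)
            x = x - x.mean()                                  # remove the DC component
            n = x.size
            spec = np.fft.rfft(x, 2 * n)                      # zero-padded FFT
            acf = np.fft.irfft(spec * np.conj(spec))[:n]      # linear autocorrelation
            acf /= acf[0]                                     # normalise so ACF(0) = 1
            peaks = [k for k in range(1, n - 1)
                     if acf[k] > acf[k - 1] and acf[k] > acf[k + 1] and acf[k] > 0]
            if len(peaks) < 2:
                raise ValueError("no oscillatory behaviour detected")
            p1, p2 = peaks[0], peaks[1]
            return acf[p2] / acf[p1], 1.0 / (p1 * dt)         # (decay ratio, frequency in Hz)

        # Synthetic AR(2) signal with decay ratio ~0.7 at ~0.5 Hz, sampled at dt = 0.08 s.
        dt, f0, dr_true = 0.08, 0.5, 0.7
        r = dr_true ** (f0 * dt)                              # pole radius giving that per-cycle decay
        a1, a2 = 2 * r * np.cos(2 * np.pi * f0 * dt), -r * r
        rng = np.random.default_rng(1)
        e = rng.standard_normal(16384)
        x = np.zeros_like(e)
        for k in range(2, e.size):
            x[k] = a1 * x[k - 1] + a2 * x[k - 2] + e[k]
        print(decay_ratio_and_frequency(x, dt))

    Unlike the wavelet-ridge approach, this estimate is a single value for the whole record and assumes stationarity, which is exactly the limitation the paper's method is designed to overcome.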

  6. Sustainable operations management and benchmarking in brewing: A factor weighting approach

    Directory of Open Access Journals (Sweden)

    Daniel P. Bumblauskas

    2017-06-01

    Full Text Available The brewing industry has been moving towards more efficient use of energy, water reuse and stewardship, and the tracking of greenhouse gas (GHG) emissions to better manage environmental and social responsibility. Commercial breweries use a great deal of water and energy to convert one gallon (liter) of water into one gallon (liter) of beer. An analysis was conducted on sustainable operations and supply chain management at various United States and international breweries, specifically in Europe, to benchmark brewery performance and establish common metrics for sustainability in the beer supply chain. The primary research questions explored in this article are whether water reclamation and GHG emissions can be properly monitored and measured and if processes can be created to help control waste (lean) and emissions. Additional questions include how we can use operations management strategies and techniques such as the Factor-Weighted Method (FWM) in industries such as brewing to develop sustainability scorecards.
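
    A minimal sketch of a factor-weighted sustainability scorecard of the kind the abstract refers to. The factors, weights and raw scores below are invented placeholders, not data from the article; the point is only to show how a Factor-Weighted Method composite score is formed.

        # Factor weights (summing to 1.0) and raw factor scores on a common 1-10 scale.
        FACTORS = {
            "water use per barrel": 0.30,
            "energy per barrel": 0.25,
            "GHG emissions": 0.25,
            "waste diverted from landfill": 0.20,
        }

        breweries = {
            "Brewery A": {"water use per barrel": 7, "energy per barrel": 6,
                          "GHG emissions": 8, "waste diverted from landfill": 9},
            "Brewery B": {"water use per barrel": 9, "energy per barrel": 5,
                          "GHG emissions": 6, "waste diverted from landfill": 7},
        }

        def weighted_score(scores, weights=FACTORS):
            """Composite sustainability score: sum of weight * factor score."""
            return sum(weights[f] * scores[f] for f in weights)

        for name, scores in breweries.items():
            print(f"{name}: {weighted_score(scores):.2f}")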

  7. Benchmark Dose Analysis from Multiple Datasets: The Cumulative Risk Assessment for the N-Methyl Carbamate Pesticides

    Science.gov (United States)

    The US EPA’s N-Methyl Carbamate (NMC) Cumulative Risk assessment was based on the effect on acetylcholine esterase (AChE) activity of exposure to 10 NMC pesticides through dietary, drinking water, and residential exposures, assuming the effects of joint exposure to NMCs is dose-...

  8. Library Benchmarking

    Directory of Open Access Journals (Sweden)

    Wiji Suwarno

    2017-02-01

    Full Text Available The term benchmarking is encountered in the implementation of total quality management (TQM), or, in Indonesian terms, holistic quality management, because benchmarking is a tool for looking for ideas or learning from other libraries. Benchmarking is a systematic and continuous process of measuring and comparing an organization's business processes in order to obtain information that can help the organization improve its performance.

  9. Financial Benchmarking

    OpenAIRE

    2012-01-01

    This bachelor's thesis is focused on financial benchmarking of TULIPA PRAHA s.r.o. The aim of this work is to evaluate the financial situation of the company, identify its strengths and weaknesses, and find out how efficiently the company performs in comparison with top companies in the same field, using the INFA benchmarking diagnostic system of financial indicators. The theoretical part includes the characteristics of financial analysis, which financial benchmarking is based on a...

  10. [An approach to care indicators benchmarking. Learning to improve patient safety].

    Science.gov (United States)

    de Andrés Gimeno, B; Salazar de la Guerra, R M; Ferrer Arnedo, C; Revuelta Zamorano, M; Ayuso Murillo, D; González Soria, J

    2014-01-01

    Improvements in clinical safety can be achieved by promoting a safety culture, professional training, and learning through benchmarking. The aim of this study was to identify areas for improvement after analysing the safety indicators in two public hospitals in the North-West Madrid Region. Descriptive study performed during 2011 in Hospital Universitario Puerta de Hierro Majadahonda (HUPHM) and Hospital de Guadarrama (HG). The variables under study were 40 indicators on nursing care related to patient safety. Nineteen of them were defined in the SENECA project as care quality standards in order to improve patient safety in the hospitals. The data were collected from clinical histories, Madrid Health Service assessment reports, care procedures, and direct observation. Within the 40 indicators, 22 were structure indicators (procedures), met by HUPHM at 86% and by HG at 95%; 14 were process indicators (training and protocol compliance), with similar results in both hospitals apart from the continuity-of-care reports and training in hand hygiene; and 4 were outcome indicators (pressure ulcers, falls and pain), which showed different results. The analysis of the indicators allowed the following actions to be taken: to identify improvements to be made in each hospital, to develop joint safety recommendations in nursing care protocols for the prevention and treatment of chronic wounds, to establish systematic pain assessments, and to prepare continuity-of-care reports on all patients transferred from HUPHM to HG. Copyright © 2013 SECA. Published by Elsevier Espana. All rights reserved.

  11. Antibiotic reimbursement in a model delinked from sales: a benchmark-based worldwide approach.

    Science.gov (United States)

    Rex, John H; Outterson, Kevin

    2016-04-01

    Despite the life-saving ability of antibiotics and their importance as a key enabler of all of modern health care, their effectiveness is now threatened by a rising tide of resistance. Unfortunately, the antibiotic pipeline does not match health needs because of challenges in discovery and development, as well as the poor economics of antibiotics. Discovery and development are being addressed by a range of public-private partnerships; however, correcting the poor economics of antibiotics will need an overhaul of the present business model on a worldwide scale. Discussions are now converging on delinking reward from antibiotic sales through prizes, milestone payments, or insurance-like models in which innovation is rewarded with a fixed series of payments of a predictable size. Rewarding all drugs with the same payments could create perverse incentives to produce drugs that provide the least possible innovation. Thus, we propose a payment model using a graded array of benchmarked rewards designed to encourage the development of antibiotics with the greatest societal value, together with appropriate worldwide access to antibiotics to maximise human health.

  12. Threshold limit values of the cadmium concentration in rice in the development of itai-itai disease using benchmark dose analysis.

    Science.gov (United States)

    Nogawa, Kazuhiro; Sakurai, Masaru; Ishizaki, Masao; Kido, Teruhiko; Nakagawa, Hideaki; Suwazono, Yasushi

    2017-08-01

    The aim of this study was to estimate the benchmark dose (BMD) as the threshold limit level of the cadmium (Cd) concentration in rice for itai-itai disease and/or suspected disease; it was based on data from a previous study that evaluated the association of such diseases with the Cd concentration in rice using a logistic regression model. From 1971 to 1976, a total of 2446 rice samples were analyzed across the 88 hamlets in the Jinzu river basin. The mean Cd concentration in rice in each hamlet was used as the index of external Cd exposure of the entire population of the hamlet. We employed the incidence of itai-itai disease and/or suspected disease obtained from the available 55 hamlets. As the threshold, the lower limit of the BMD (BMDL) of the Cd concentration in rice for itai-itai disease and/or suspected disease was estimated using a logistic model, setting the benchmark response at 1% or 2%. The estimated BMDLs of the Cd concentration in rice for itai-itai disease and/or suspected disease were 0.62-0.76 and 0.27-0.56 mg kg(-1) in men and women, respectively. The lowest BMDL was 0.27 mg kg(-1) in women. In the present study, the threshold limit level of the Cd concentration in rice for itai-itai disease, which is the most severe form of chronic Cd poisoning, was estimated for the first time. This result provides important information about the worldwide standard for the Cd concentration in rice. Copyright © 2017 John Wiley & Sons, Ltd.
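
    A minimal sketch of how a BMD and BMDL can be obtained from grouped incidence data with a logistic dose-response model, as described above. The hamlet-level data and the exact parameterisation used by the authors are not reproduced; the numbers below are synthetic, and the BMDL is taken from a parametric bootstrap percentile rather than the profile-likelihood methods normally used in regulatory practice.

        import numpy as np
        from scipy.optimize import minimize, brentq

        # Synthetic grouped data: Cd concentration in rice (mg/kg), subjects, cases.
        dose  = np.array([0.05, 0.15, 0.30, 0.50, 0.80])
        n     = np.array([400, 350, 300, 250, 200])
        cases = np.array([  2,   4,   9,  20,  40])

        def neg_loglik(theta, dose, n, cases):
            a, b = theta
            p = 1.0 / (1.0 + np.exp(-(a + b * dose)))
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -np.sum(cases * np.log(p) + (n - cases) * np.log(1 - p))

        def fit(dose, n, cases):
            res = minimize(neg_loglik, x0=[-4.0, 1.0], args=(dose, n, cases),
                           method="Nelder-Mead")
            return res.x

        def bmd(theta, bmr=0.01):
            """Dose at which the extra risk over background equals the BMR."""
            a, b = theta
            p0 = 1.0 / (1.0 + np.exp(-a))
            def extra_risk(d):
                p = 1.0 / (1.0 + np.exp(-(a + b * d)))
                return (p - p0) / (1.0 - p0) - bmr
            return brentq(extra_risk, 1e-6, 10.0)

        theta_hat = fit(dose, n, cases)
        print("BMD (BMR = 1%):", bmd(theta_hat))

        # Parametric bootstrap lower bound (BMDL), roughly a one-sided 95% limit.
        rng = np.random.default_rng(0)
        a, b = theta_hat
        p_hat = 1.0 / (1.0 + np.exp(-(a + b * dose)))
        boot = []
        for _ in range(300):
            cases_b = rng.binomial(n, p_hat)
            try:
                boot.append(bmd(fit(dose, n, cases_b)))
            except ValueError:
                continue                      # skip the rare non-monotone resample
        print("BMDL (5th percentile):", np.percentile(boot, 5))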

  13. Benchmarking the stochastic time-dependent variational approach for excitation dynamics in molecular aggregates

    Science.gov (United States)

    Chorošajev, Vladimir; Gelzinis, Andrius; Valkunas, Leonas; Abramavicius, Darius

    2016-12-01

    The time-dependent variational approach is a convenient method to characterize the excitation dynamics in molecular aggregates for different strengths of the system-bath interaction, and it does not require any additional perturbative schemes. Until recently, however, this method was only applicable in the zero-temperature case. It has become possible to extend this method to finite temperatures with the introduction of the stochastic time-dependent variational approach. Here we present a comparison between this approach and the exact hierarchical equations of motion approach for describing excitation dynamics in a broad range of temperatures. We calculate electronic population evolution, absorption and auxiliary time-resolved fluorescence spectra in different regimes and find that the stochastic approach shows excellent agreement with the exact approach when the system-bath coupling is sufficiently large and temperatures are high. The differences between the two methods are larger when temperatures are lower or the system-bath coupling is small.

  14. Analysis Approach and Data Package for Mayak Public Doses

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Napier, Bruce A.

    2013-09-18

    Historical activities at facilities producing nuclear materials for weapons released radioactivity into the air and water. Past studies in the United States have evaluated the release, atmospheric transport and environmental accumulation of 131I from the nuclear facilities at Hanford in Washington State and the resulting dose to members of the public (Farris et al. 1994). A multi-year dose reconstruction effort (Mokrov et al. 2004) is also being conducted to produce representative dose estimates for members of the public living near Mayak, Russia, from atmospheric releases of 131I at the facilities of the Mayak Production Association. The approach to calculating individual doses to members of the public from historical releases of airborne 131I has the following general steps: • Construct estimates of releases of 131I to the air from production facilities. • Model the transport of 131I in the air and subsequent deposition on the ground and vegetation. • Model the accumulation of 131I in soil, water and food products (environmental media). • Calculate the dose for an individual by matching the appropriate lifestyle and consumption data for the individual to the concentrations of 131I in environmental media at their residence location. A number of computer codes were developed to facilitate the study of airborne 131I emissions at Hanford. Of particular interest is the DESCARTES code that modeled accumulation of 131I in environmental media (Miley et al. 1994). In addition, the CIDER computer code estimated annual doses to individuals (Eslinger et al. 1994) using the equations and parameters specific to Hanford (Snyder et al. 1994). Several of the computer codes developed to model 131I releases from Hanford are general enough to be used for other facilities. Additional codes have been developed, including the new individual dose code CiderF (Eslinger and Napier 2013), and applied to historical releases of 131I from Mayak. This document provides a data package that
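
    The general calculation chain listed in the steps above (release, air transport, deposition, food-chain accumulation, intake, dose) can be illustrated with a deliberately simple sketch. None of the numbers below come from DESCARTES, CIDER or CiderF; every parameter is an assumed placeholder chosen only to show the order of the calculation.

        # Minimal release-to-dose chain sketch. All parameter values are assumptions.
        release_rate_bq_per_day = 1.0e12     # 131I released to air (Bq/day), assumed
        dispersion_factor = 1.0e-7           # air concentration per unit release rate (s/m^3), assumed
        deposition_velocity = 0.002          # dry deposition velocity (m/s), assumed
        transfer_to_milk = 5.0e-3            # (Bq/L in milk) per (Bq/m^2 deposited per day), assumed
        milk_intake_l_per_day = 0.5          # daily milk consumption (L/day), assumed
        dose_coeff_sv_per_bq = 3.7e-7        # ingestion dose coefficient (Sv/Bq), assumed

        air_conc = release_rate_bq_per_day / 86400.0 * dispersion_factor   # Bq/m^3
        deposition = air_conc * deposition_velocity * 86400.0              # Bq/m^2 per day
        milk_conc = deposition * transfer_to_milk                          # Bq/L
        daily_intake = milk_conc * milk_intake_l_per_day                   # Bq/day
        daily_dose_sv = daily_intake * dose_coeff_sv_per_bq

        print(f"air concentration : {air_conc:.3e} Bq/m^3")
        print(f"milk concentration: {milk_conc:.3e} Bq/L")
        print(f"daily dose        : {daily_dose_sv:.3e} Sv/day")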

  15. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

    OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on

  17. Developmental toxicity of inhaled methanol in the CD-1 mouse, with quantitative dose-response modeling for estimation of benchmark doses

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, J.M.; Mole, M.L.; Chernoff, N.; Barbee, B.D.; Turner, C.I.

    1993-01-01

    Pregnant CD-1 mice were exposed to 1,000, 2,000, 5,000, 7,500, 10,000, or 15,000 ppm of methanol for 7 hr/day on days 6-15 of gestation. On day 17 of gestation, the remaining mice were weighed and killed, and the gravid uterus was removed. Numbers of implantation sites, live and dead fetuses and resorptions were counted, and fetuses were examined externally and weighed as a litter. Significant increases in the incidence of exencephaly and cleft palate were observed at 5,000 ppm and above, increased postimplantation mortality at 7,500 ppm and above (including an increasing incidence of full-litter resorption), and reduced fetal weight at 10,000 ppm and above. A dose-related increase in cervical ribs or ossification sites lateral to the seventh cervical vertebra was significant at 2,000 ppm and above. Thus, the NOAEL for developmental toxicity in this study is 1,000 ppm. The results of this study indicate that inhaled methanol is developmentally toxic in the mouse at exposure levels which were not maternally toxic. Litters of pregnant mice gavaged orally with 4 g methanol/kg displayed developmental toxic effects similar to those seen in the 10,000 ppm methanol exposure group. (Copyright (c) 1993 Wiley-Liss, Inc.)

  18. User-Centric Approach for Benchmark RDF Data Generator in Big Data Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Purohit, Sumit; Paulson, Patrick R.; Rodriguez, Luke R.

    2016-02-05

    This research focuses on a user-centric approach to building such tools and proposes a flexible, extensible, and easy-to-use framework to support performance analysis of Big Data systems. Finally, case studies from two different domains are presented to validate the framework.

  19. Benchmarking: applications to transfusion medicine.

    Science.gov (United States)

    Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M

    2012-10-01

    Benchmarking is a structured, continuous, collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking, focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institutional-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal.

  20. Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China: A Bootstrap-Data Envelopment Analysis Approach.

    Science.gov (United States)

    Li, Hao; Dong, Siping

    2015-01-01

    China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped narrow the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in further research to measure the relative efficiency and productivity of Chinese hospitals so as to better support efficiency improvement and related decision making. © The Author(s) 2015.
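
    A minimal sketch of the underlying mechanics: an input-oriented, constant-returns DEA efficiency score computed as a linear program, followed by a naive resampling bootstrap. This is not the smoothed Simar-Wilson bootstrap used for proper bias correction in the article; the hospital data are synthetic, and the naive bootstrap is shown only to indicate how resampled frontiers change the scores.

        import numpy as np
        from scipy.optimize import linprog

        def dea_ccr_input(x0, y0, Xref, Yref):
            """Input-oriented CCR efficiency of a unit (x0, y0) against a reference set.
            Variables are [theta, lambda_1..lambda_n]; minimise theta subject to
              sum_j lambda_j * x_ij <= theta * x0_i   (inputs)
              sum_j lambda_j * y_rj >= y0_r           (outputs), lambda_j >= 0."""
            n, m = Xref.shape
            s = Yref.shape[1]
            c = np.zeros(n + 1); c[0] = 1.0
            A_ub, b_ub = [], []
            for i in range(m):
                A_ub.append(np.concatenate(([-x0[i]], Xref[:, i]))); b_ub.append(0.0)
            for r in range(s):
                A_ub.append(np.concatenate(([0.0], -Yref[:, r]))); b_ub.append(-y0[r])
            bounds = [(None, None)] + [(0, None)] * n
            res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                          bounds=bounds, method="highs")
            return res.x[0]

        # Synthetic hospitals: inputs = (beds, staff), output = (treated patients).
        rng = np.random.default_rng(2)
        X = rng.uniform(50, 200, size=(15, 2))
        Y = (0.8 * X[:, :1] + 0.5 * X[:, 1:2]) * rng.uniform(0.6, 1.0, size=(15, 1))

        scores = np.array([dea_ccr_input(X[o], Y[o], X, Y) for o in range(len(X))])

        # Naive bootstrap: resample the reference set and re-evaluate every hospital.
        # Scores can exceed 1 when a unit lies outside the resampled frontier; the
        # Simar-Wilson smoothed bootstrap handles this consistently, this sketch does not.
        B = 200
        boot = np.zeros((B, len(X)))
        for b in range(B):
            idx = rng.integers(0, len(X), len(X))
            for o in range(len(X)):
                boot[b, o] = dea_ccr_input(X[o], Y[o], X[idx], Y[idx])

        print("point estimates:", np.round(scores, 3))
        print("bootstrap means:", np.round(boot.mean(axis=0), 3))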

  1. Comparison of Two Approaches for Nuclear Data Uncertainty Propagation in MCNPX for Selected Fast Spectrum Critical Benchmarks

    Science.gov (United States)

    Zhu, T.; Rochman, D.; Vasiliev, A.; Ferroukhi, H.; Wieselquist, W.; Pautz, A.

    2014-04-01

    Nuclear data uncertainty propagation based on stochastic sampling (SS) is becoming more attractive while leveraging modern computer power. Two variants of the SS approach are compared in this paper. The Total Monte Carlo (TMC) method by the Nuclear Research and Consultancy Group (NRG) generates perturbed ENDF-6-formatted nuclear data by varying nuclear reaction model parameters. At Paul Scherrer Institute (PSI) the Nuclear data Uncertainty Stochastic Sampling (NUSS) system generates perturbed ACE-formatted nuclear data files by applying multigroup nuclear data covariances onto pointwise ACE-formatted nuclear data. Uncertainties of 239Pu and 235U from ENDF/B-VII.1, ZZ-SCALE6/COVA-44G and TENDL covariance libraries are considered in NUSS and propagated in MCNPX calculations for well-studied Jezebel and Godiva fast spectrum critical benchmarks. The corresponding uncertainty results obtained by TMC are compared with NUSS results and the deterministic Sensitivity/Uncertainty method of TSUNAMI-3D from SCALE6 package is also applied to serve as a separate verification. The discrepancies in the propagated 239Pu and 235U uncertainties due to method and covariance differences are discussed.

  2. Interactive benchmarking

    DEFF Research Database (Denmark)

    Lawson, Lartey; Nielsen, Kurt

    2005-01-01

    We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional distance functions. The frontier is given by an explicit quantile, e.g. "the best 90 %". Using the explanatory model of the inefficiency, the user can adjust the frontiers by submitting state variables that influence the inefficiency. An efficiency study of Danish dairy farms is implemented in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence the technical efficiency.

  3. Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems

    Science.gov (United States)

    Demir, I.; Sermet, M. Y.; Sit, M. A.

    2016-12-01

    Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large-scale spatial data from various gauges and sensor networks. The collection of environmental data has increased demand for applications which are capable of managing and processing large-scale and high-resolution data sets. With the amount and resolution of data sets provided, one of the challenging tasks for organizing and customizing hydrological data sets is delineation of watersheds on demand. Watershed delineation is a process for creating a boundary that represents the contributing area for a specific control point or water outlet, with the intent of characterizing and analyzing portions of a study area. Although many GIS tools and software for watershed analysis are available on desktop systems, there is a need for web-based and client-side techniques for creating a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrated several watershed delineation techniques on the web with various techniques implemented on the client side using JavaScript and WebGL, and on the server side using Python and C++. We also developed a client-side GPGPU (General Purpose Graphical Processing Unit) algorithm to analyze high-resolution terrain data for watershed delineation which allows parallelization using the GPU. The web-based real-time analysis of watershed segmentation can be helpful for decision-makers and interested stakeholders while eliminating the need to install complex software packages and deal with large-scale data sets. Utilization of client-side hardware resources also eliminates the need for servers due to the crowdsourcing nature of the approach. Our goal for future work is to improve other hydrologic analysis methods, such as rain flow tracking, by adapting the presented approaches.
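
    A minimal sketch of the core delineation step, in the simple server-side style rather than the WebGL/GPGPU implementation described above: given a D8 flow-direction grid, the watershed of an outlet is found by walking the reversed flow graph. The D8 codes here follow the common ESRI convention, which is an assumption about the input data.

        from collections import deque

        # D8 flow-direction codes (ESRI convention) -> offset of the downstream neighbour.
        D8 = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
              16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

        def delineate_watershed(flow_dir, outlet):
            """Return the set of cells draining to `outlet` (row, col).

            Breadth-first search on the reversed flow graph: a neighbour belongs to
            the watershed if its flow direction points at a cell already collected."""
            rows, cols = len(flow_dir), len(flow_dir[0])
            watershed = {outlet}
            queue = deque([outlet])
            while queue:
                r, c = queue.popleft()
                for code, (dr, dc) in D8.items():
                    nr, nc = r - dr, c - dc            # candidate upstream neighbour
                    if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in watershed:
                        if flow_dir[nr][nc] == code:   # neighbour drains into (r, c)
                            watershed.add((nr, nc))
                            queue.append((nr, nc))
            return watershed

        # Tiny synthetic grid: flow goes east (code 1) along rows, then south (code 4)
        # down the last column towards the outlet at (3, 3); all 16 cells should drain to it.
        grid = [[1, 1, 1, 4],
                [1, 1, 1, 4],
                [1, 1, 1, 4],
                [1, 1, 1, 1]]
        print(sorted(delineate_watershed(grid, (3, 3))))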

  4. Evaluation of the applicability of the Benchmark approach to existing toxicological data. Framework: Chemical compounds in the working place

    NARCIS (Netherlands)

    Appel MJ; Bouman HGM; Pieters MN; Slob W; CSR

    2001-01-01

    Five substances encountered in the workplace for which risk evaluations were available were selected for analysis with the benchmark approach. The critical studies were analysed for each of these substances. The toxicological parameters examined comprised both continuous and ordinal data.

  5. Dose-response curve estimation: a semiparametric mixture approach.

    Science.gov (United States)

    Yuan, Ying; Yin, Guosheng

    2011-12-01

    In the estimation of a dose-response curve, parametric models are straightforward and efficient but subject to model misspecifications; nonparametric methods are robust but less efficient. As a compromise, we propose a semiparametric approach that combines the advantages of parametric and nonparametric curve estimates. In a mixture form, our estimator takes a weighted average of the parametric and nonparametric curve estimates, in which a higher weight is assigned to the estimate with a better model fit. When the parametric model assumption holds, the semiparametric curve estimate converges to the parametric estimate and thus achieves high efficiency; when the parametric model is misspecified, the semiparametric estimate converges to the nonparametric estimate and remains consistent. We also consider an adaptive weighting scheme to allow the weight to vary according to the local fit of the models. We conduct extensive simulation studies to investigate the performance of the proposed methods and illustrate them with two real examples.
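
    A minimal sketch of the mixture idea described above: a parametric fit (here an Emax model) and a nonparametric fit (a Nadaraya-Watson kernel smoother) are combined with a data-driven weight. The AIC-type weight and the assumed effective degrees of freedom for the kernel fit are stand-ins for the authors' more careful (and adaptive) weighting scheme; the data are synthetic.

        import numpy as np
        from scipy.optimize import curve_fit

        def emax(d, e0, emax_, ed50):
            """Parametric Emax dose-response model."""
            return e0 + emax_ * d / (ed50 + d)

        def kernel_fit(d_grid, dose, resp, h=0.5):
            """Nonparametric Nadaraya-Watson estimate with a Gaussian kernel."""
            w = np.exp(-0.5 * ((d_grid[:, None] - dose[None, :]) / h) ** 2)
            return (w * resp).sum(axis=1) / w.sum(axis=1)

        rng = np.random.default_rng(3)
        dose = np.repeat([0.0, 0.5, 1.0, 2.0, 4.0, 8.0], 10)
        resp = emax(dose, 1.0, 3.0, 1.5) + rng.normal(0, 0.4, dose.size)

        popt, _ = curve_fit(emax, dose, resp, p0=[1.0, 2.0, 1.0])
        rss_par = np.sum((resp - emax(dose, *popt)) ** 2)
        rss_np = np.sum((resp - kernel_fit(dose, dose, resp)) ** 2)

        # AIC-type weight (3 parameters for Emax; ~6 effective df assumed for the kernel fit).
        aic_par = dose.size * np.log(rss_par / dose.size) + 2 * 3
        aic_np = dose.size * np.log(rss_np / dose.size) + 2 * 6
        a_min = min(aic_par, aic_np)
        w_par = np.exp(-0.5 * (aic_par - a_min)) / (
            np.exp(-0.5 * (aic_par - a_min)) + np.exp(-0.5 * (aic_np - a_min)))

        d_grid = np.linspace(0, 8, 50)
        mixture = w_par * emax(d_grid, *popt) + (1 - w_par) * kernel_fit(d_grid, dose, resp)
        print("weight on parametric fit:", round(float(w_par), 3))

    When the parametric model is adequate it attracts most of the weight, reproducing the efficiency argument in the abstract; when it is misspecified the weight shifts towards the nonparametric curve.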

  6. New approach for food allergy management using low-dose oral food challenges and low-dose oral immunotherapies

    Directory of Open Access Journals (Sweden)

    Noriyuki Yanagida

    2016-04-01

    With food allergies, removing the need to eliminate a food that could be consumed in low doses could significantly improve quality of life. This review discusses the importance of an oral food challenge (OFC) and oral immunotherapy (OIT) that use low doses of causative foods as the target volumes. Utilizing an OFC or OIT with a low dose as the target volume could be a novel approach for accelerating the tolerance to causative foods.

  7. A mathematical approach to optimal selection of dose values in the additive dose method of ERP dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Hayes, R.B.; Haskell, E.H.; Kenner, G.H. [Utah Univ., Salt Lake City, UT (United States)

    1996-01-01

    Additive dose methods commonly used in electron paramagnetic resonance (EPR) dosimetry are time consuming and labor intensive. We have developed a mathematical approach for determining optimal spacing of applied doses and the number of spectra which should be taken at each dose level. Expected uncertainties in the data points are assumed to be normally distributed with a fixed standard deviation, and linearity of dose response is also assumed. The optimum spacing and number of points necessary for the minimal error can be estimated, as can the likely error in the resulting estimate. When low doses are being estimated for tooth enamel samples, the optimal spacing is shown to be a concentration of points near the zero dose value with fewer spectra taken at a single high dose value within the range of known linearity. Optimization of the analytical process results in increased accuracy and sample throughput.
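
    A small Monte Carlo check of the design claim in the abstract, not the authors' analytical optimisation: under a linear additive-dose response with fixed measurement noise, the spread of the back-extrapolated dose estimate is compared for an evenly spaced design and for a design concentrated near zero with one high dose. All numerical settings are assumptions.

        import numpy as np

        def simulate_design(added_doses, true_dose=0.5, slope=1.0, sigma=0.05,
                            n_rep=5000, rng=None):
            """Monte Carlo spread of the back-extrapolated dose for one design.

            Linear model: signal = slope * (true_dose + added_dose) + noise.
            The absorbed dose is estimated as intercept / slope from a straight-line fit."""
            if rng is None:
                rng = np.random.default_rng(0)
            x = np.asarray(added_doses, dtype=float)
            estimates = np.empty(n_rep)
            for k in range(n_rep):
                y = slope * (true_dose + x) + rng.normal(0, sigma, x.size)
                b, a = np.polyfit(x, y, 1)          # y = b*x + a
                estimates[k] = a / b                # back-extrapolated dose
            return estimates.std()

        rng = np.random.default_rng(4)
        n_points = 10
        even = np.linspace(0, 10, n_points)                      # evenly spaced added doses
        clustered = np.array([0.0] * (n_points - 1) + [10.0])    # repeats at zero + one high dose
        print("sd, even design     :", simulate_design(even, rng=rng))
        print("sd, clustered design:", simulate_design(clustered, rng=rng))

    With these settings the clustered design gives a visibly smaller standard deviation, consistent with the conclusion quoted above.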

  8. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce data sets for generating benchmark data sets.

  9. The Precautionary Principle and Statistical Approaches to Uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2005-01-01

    Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationships; environmental standards; exposure measurement uncertainty; Popper falsification

  10. The Precautionary Principle and statistical approaches to uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2003-01-01

    Bayesian model averaging; Benchmark approach to safety standards in toxicology; dose-response relationship; environmental standards; exposure measurement uncertainty; Popper falsification

  11. PNNL Information Technology Benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    DD Hostetler

    1999-09-08

    Benchmarking is a methodology for searching out industry best practices that lead to superior performance. It is exchanging information, not just with any organization, but with organizations known to be the best within PNNL, in industry, or in dissimilar industries with equivalent functions. It is used as a continuous improvement tool for business and technical processes, products, and services. Information technology--comprising all computer and electronic communication products and services--underpins the development and/or delivery of many PNNL products and services. This document describes the Pacific Northwest National Laboratory's (PNNL's) approach to information technology (IT) benchmarking. The purpose is to engage other organizations in the collaborative process of benchmarking in order to improve the value of IT services provided to customers. The document's intended audience consists of other US Department of Energy (DOE) national laboratories and their IT staff. Although the individual participants must define the scope of collaborative benchmarking, an outline of IT service areas for possible benchmarking is described.

  12. Occupational dose constraints in interventional cardiology procedures: the DIMOND approach

    Energy Technology Data Exchange (ETDEWEB)

    Tsapaki, Virginia [Medical Physics Department, Konstantopoulio Agia Olga Hospital, Athens (Greece); Kottou, Sophia [Medical Physics Department, Athens University, Medical School, Athens (Greece); Vano, Eliseo [Medical Physics Service and Radiology Department, San Carlos University Hospital and Complutense University, Madrid (Spain); Komppa, Tuomo [Stuk, Radiation and Nuclear Safety Authority, Helsinki (Finland); Padovani, Renato [Servizio di Fisica Medica, Ospedale S Maria della Misericordia, Udine (Italy); Dowling, Annita [Medical Physics and Bioengineering Department, St James' s Hospital and Haughton Institute, Dublin (Ireland); Molfetas, Michael [Medical Physics Department, ' Evangelismos' Hospital, Athens (Greece); Neofotistou, Vassiliki [Medical Physics Department, Regional Athens General Hospital ' G Gennimatas' , Athens (Greece)

    2004-03-21

    Radiation fields in angiographic suites are highly non-uniform, with intensity and gradient varying widely with projection geometry. The European Commission DIMOND III project addressed, among other issues, the optimization of staff doses, with an attempt to propose preliminary occupational dose constraints. Two thermoluminescent dosemeters (TLD) were used to assess operators' extremity doses (left shoulder and left foot) during 20 coronary angiographies (CAs) and 20 percutaneous transluminal coronary angioplasties (PTCAs) in five European centres. X-ray equipment, the radiation protection measures used and the dose delivered to the patient in terms of dose-area product (DAP) were recorded so as to subsequently associate them with the operator's dose. The range of staff doses noted for the same TLD position, centre and procedure type emphasizes the importance of protective measures and the technical characteristics of x-ray equipment. Correlation of the patient's DAP with the staff shoulder dose is moderate, whereas correlation of the patient's DAP with the staff foot dose is poor in both CA and PTCA. Therefore, it is difficult to predict the operator's dose from the patient's DAP, mainly due to the different use of protective measures. A preliminary occupational dose constraint value was defined by calculating cardiologists' annual effective dose and found to be 0.6 mSv.

  13. How Benchmarking and Higher Education Came Together

    Science.gov (United States)

    Levy, Gary D.; Ronco, Sharron L.

    2012-01-01

    This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…

  14. Confidence Level Based Approach to Total Dose Specification for Spacecraft Electronics

    Science.gov (United States)

    Xapsos, M. A.; Stauffer, C.; Phan, A.; McClure, S. S.; Ladbury, R. L.; Pellish, J. A.; Campola, M. J.; Label, K. A.

    2017-01-01

    A confidence level based approach to total dose radiation hardness assurance is presented for spacecraft electronics. It is applicable to both ionizing and displacement damage dose. Results are compared to the traditional approach that uses radiation design margin and advantages of the new approach are discussed.
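
    A minimal sketch of the contrast drawn in the abstract: a design dose read from a probability distribution at a chosen confidence level versus a best estimate multiplied by a fixed radiation design margin (RDM). The lognormal description of mission total ionizing dose and every numerical value below are assumed placeholders, not parameters from the paper.

        import numpy as np
        from scipy.stats import lognorm

        # Assumed lognormal description of mission TID behind shielding (placeholders).
        median_dose_krad = 10.0          # median mission dose, krad(Si)
        sigma_ln = 0.6                   # standard deviation of ln(dose), assumed

        dist = lognorm(s=sigma_ln, scale=median_dose_krad)

        for conf in (0.50, 0.90, 0.95, 0.99):
            print(f"dose not exceeded at {conf:.0%} confidence: {dist.ppf(conf):6.1f} krad(Si)")

        # Traditional alternative: best estimate times a fixed radiation design margin.
        best_estimate, rdm = 10.0, 2.0
        print(f"RDM-based design dose: {best_estimate * rdm:6.1f} krad(Si)")

        # A part with capability C passes at confidence level q if C >= dose at that level.
        part_capability = 30.0
        print("pass at 95% confidence:", part_capability >= dist.ppf(0.95))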

  15. Evaluation of various approaches for assessing dose indicators and patient organ doses resulting from radiotherapy cone-beam CT

    Energy Technology Data Exchange (ETDEWEB)

    Rampado, Osvaldo, E-mail: orampado@cittadellasalute.to.it; Giglioli, Francesca Romana; Rossetti, Veronica; Ropolo, Roberto [Struttura Complessa Fisica Sanitaria, Azienda Ospedaliero Universitaria Città della Salute e della Scienza, Corso Bramante 88, Torino 10126 (Italy); Fiandra, Christian; Ragona, Riccardo [Radiation Oncology Department, University of Turin, Torino 10126 (Italy)

    2016-05-15

    Purpose: The aim of this study was to evaluate various approaches for assessing patient organ doses resulting from radiotherapy cone-beam CT (CBCT), by the use of thermoluminescent dosimeter (TLD) measurements in anthropomorphic phantoms, a Monte Carlo based dose calculation software, and different dose indicators as presently defined. Methods: Dose evaluations were performed on a CBCT Elekta XVI (Elekta, Crawley, UK) for different protocols and anatomical regions. The first part of the study focuses on using PCXMC software (PCXMC 2.0, STUK, Helsinki, Finland) for calculating organ doses, adapting the input parameters to simulate the exposure geometry, and beam dose distribution in an appropriate way. The calculated doses were compared to readouts of TLDs placed in an anthropomorphic Rando phantom. After this validation, the software was used for analyzing organ dose variability associated with patients’ differences in size and gender. At the same time, various dose indicators were evaluated: kerma area product (KAP), cumulative air-kerma at the isocenter (K{sub air}), cone-beam dose index, and central cumulative dose. The latter was evaluated in a single phantom and in a stack of three adjacent computed tomography dose index phantoms. Based on the different dose indicators, a set of coefficients was calculated to estimate organ doses for a range of patient morphologies, using their equivalent diameters. Results: Maximum organ doses were about 1 mGy for head and neck and 25 mGy for chest and pelvis protocols. The differences between PCXMC and TLDs doses were generally below 10% for organs within the field of view and approximately 15% for organs at the boundaries of the radiation beam. When considering patient size and gender variability, differences in organ doses up to 40% were observed especially in the pelvic region; for the organs in the thorax, the maximum differences ranged between 20% and 30%. Phantom dose indexes provided better correlation with organ

  16. Estimating the Technical Improvement of Energy Efficiency in the Automotive Industry—Stochastic and Deterministic Frontier Benchmarking Approaches

    Directory of Open Access Journals (Sweden)

    Seog-Chan Oh

    2014-09-01

    Full Text Available The car manufacturing industry, one of the largest energy consuming industries, has been making a considerable effort to improve its energy intensity by implementing energy efficiency programs, in many cases supported by government research or financial programs. While many car manufacturers claim that they have made substantial progress in energy efficiency improvement over the past years through their energy efficiency programs, the objective measurement of energy efficiency improvement has not been studied due to the lack of suitable quantitative methods. This paper proposes stochastic and deterministic frontier benchmarking models such as the stochastic frontier analysis (SFA) model and the data envelopment analysis (DEA) model to measure the effectiveness of energy saving initiatives in terms of the technical improvement of energy efficiency for the automotive industry, particularly vehicle assembly plants. Illustrative examples of the application of the proposed models are presented and demonstrate the overall benchmarking process to determine best practice frontier lines and to measure technical improvement based on the magnitude of frontier line shifts over time. Log likelihood ratio and Spearman rank-order correlation coefficient tests are conducted to determine the significance of the SFA model and its consistency with the DEA model. ENERGY STAR® EPI (Energy Performance Index) values are also calculated.
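
    A simplified illustration of the frontier-shift idea described above, using a corrected ordinary least squares (COLS) frontier rather than the SFA and DEA models of the article: an energy-versus-production frontier is fitted for two periods and the technical improvement is read off as the downward shift of the best-practice line. Plant data and the 2010/2015 labels are synthetic, and comparing intercepts assumes the two frontiers share a similar slope.

        import numpy as np

        def cols_frontier(production, energy):
            """COLS frontier for energy use vs. production (lower energy = better).

            Fit ln(energy) = a + b*ln(production) by OLS, then shift the intercept
            down by the minimum residual so the line envelops the best plants."""
            x, y = np.log(production), np.log(energy)
            b, a = np.polyfit(x, y, 1)
            a_frontier = a + np.min(y - (a + b * x))
            return a_frontier, b

        rng = np.random.default_rng(5)
        production = rng.uniform(50, 300, 30)                  # synthetic plant outputs
        ineff_2010 = rng.uniform(0.0, 0.4, 30)                 # inefficiency terms
        ineff_2015 = rng.uniform(0.0, 0.4, 30)
        energy_2010 = np.exp(1.00 + 0.8 * np.log(production) + ineff_2010)
        energy_2015 = np.exp(0.85 + 0.8 * np.log(production) + ineff_2015)  # frontier moved down

        (a10, b10) = cols_frontier(production, energy_2010)
        (a15, b15) = cols_frontier(production, energy_2015)
        shift = a10 - a15                                      # downward intercept shift
        print(f"technical improvement of energy efficiency: {(1 - np.exp(-shift)) * 100:.1f}%")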

  17. Quantitative benchmark - Production companies

    DEFF Research Database (Denmark)

    Sørensen, Ole H.; Andersen, Vibeke

    Report with the results of the quantitative benchmark of the production companies in the VIPS project.

  18. Benchmarking in Student Affairs.

    Science.gov (United States)

    Mosier, Robert E.; Schwarzmueller, Gary J.

    2002-01-01

    Discusses the use of benchmarking in student affairs, focusing on issues related to student housing. Provides examples of how benchmarking has influenced administrative practice at many institutions. (EV)

  19. Radiography benchmark 2014

    Energy Technology Data Exchange (ETDEWEB)

    Jaenisch, G.-R., E-mail: Gerd-Ruediger.Jaenisch@bam.de; Deresch, A., E-mail: Gerd-Ruediger.Jaenisch@bam.de; Bellon, C., E-mail: Gerd-Ruediger.Jaenisch@bam.de [Federal Institute for Materials Research and Testing, Unter den Eichen 87, 12205 Berlin (Germany); Schumm, A.; Lucet-Sanchez, F.; Guerin, P. [EDF R and D, 1 avenue du Général de Gaulle, 92141 Clamart (France)

    2015-03-31

    The purpose of the 2014 WFNDEC RT benchmark study was to compare predictions of various models of radiographic techniques, in particular those that predict the contribution of scattered radiation. All calculations were carried out for homogenous materials and a mono-energetic X-ray point source in the energy range between 100 keV and 10 MeV. The calculations were to include the best physics approach available considering electron binding effects. Secondary effects like X-ray fluorescence and bremsstrahlung production were to be taken into account if possible. The problem to be considered had two parts. Part I examined the spectrum and the spatial distribution of radiation behind a single iron plate. Part II considered two equally sized plates, made of iron and aluminum respectively, only evaluating the spatial distribution. Here we present the results of the above benchmark study, comparing them to MCNP as the assumed reference model. The possible origins of the observed deviations are discussed.

  20. An approach to the precise dosing of fluids

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Axel; Gunkel, Michael; Kappler, Horst; Rolland, Thomas; Magnete, Thomas

    2010-07-01

    Automotive dosing pumps have been available on the market for 25 years now. Initially used for fuel-fired parking heaters in mobile systems (trucks and passenger cars, for example), this type of reasonable dosing unit is nowadays applied in many fields. Based on the experience of delivering fuels, the dosing pump was advanced to deliver and meter more or less any kind of liquid media. One of the most innovative operational areas of such compact metering units is fuel cell reformer technology, wherein a constant flow of a certain amount of fuel is desired. This type of pump combines the abilities of priming, delivering and metering liquids, thus helping to optimize existing systems. Thanks to the characteristics of the compact dosing unit, complex hydraulic systems can be avoided. In contrast to separated systems for delivering fluids and metering them subsequently, the extensive integration of functions leads to less complex, more robust systems. Some components may become dispensable, such as sensors, shut-off valves or injectors. Thus, the number of electrical and hydraulic interfaces may be reduced to a minimum, so that the total costs of the system become significantly lower. Dosing units deliver fuel in a balanced manner. As they are designed as electromagnetically driven piston pumps, the piston is moved one to several times a second. The dosing pumps are able to pump a certain, small volume per stroke. Hence, based on this accurate volume, the total flow rate is determined by the frequency of the piston's movement only, which is the basis for easy control. This advantage, i.e. precise metering, is paid for with the disadvantage of the pulsing flow which is due to the principle of a piston pump. Current investigations into the flow characteristics show the significant potential which lies in the combination of both principles: constant flow and precise metering. This effect can be achieved by designing the pump adequately or by using
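
    The abstract states that the total flow rate is fixed by the stroke volume and the stroke frequency alone; a tiny sketch of that relationship follows, with a purely assumed stroke volume.

        def required_frequency(target_flow_ml_per_min, stroke_volume_ml):
            """Stroke frequency (strokes/s) needed for a target flow at a fixed stroke volume."""
            return target_flow_ml_per_min / stroke_volume_ml / 60.0

        stroke_volume_ml = 0.025                 # assumed displaced volume per stroke
        for flow in (1.0, 5.0, 15.0):            # target flows in mL/min
            f = required_frequency(flow, stroke_volume_ml)
            print(f"{flow:5.1f} mL/min -> {f:5.2f} strokes/s")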

  1. Benchmarking of energy time series

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, M.A.

    1990-04-01

    Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.
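
    A minimal pro-rata benchmarking sketch: preliminary monthly values are rescaled so that each year sums to its more accurate annual benchmark. The report's recommended staged first-differencing approach (and movement-preserving methods such as Denton's) distribute the adjustment more smoothly across months, so this is only the simplest possible illustration; the series below is synthetic.

        import numpy as np

        def prorata_benchmark(monthly, annual_benchmarks, months_per_year=12):
            """Scale each year's monthly values so they sum to that year's benchmark."""
            monthly = np.asarray(monthly, dtype=float)
            adjusted = monthly.copy()
            for y, bench in enumerate(annual_benchmarks):
                sl = slice(y * months_per_year, (y + 1) * months_per_year)
                adjusted[sl] *= bench / monthly[sl].sum()
            return adjusted

        # Two years of preliminary monthly deliveries and more accurate annual totals.
        rng = np.random.default_rng(6)
        monthly = 100 + 30 * np.cos(np.linspace(0, 4 * np.pi, 24)) + rng.normal(0, 5, 24)
        annual = [monthly[:12].sum() * 1.04, monthly[12:].sum() * 0.98]

        adjusted = prorata_benchmark(monthly, annual)
        print("year sums after benchmarking:",
              adjusted[:12].sum().round(1), adjusted[12:].sum().round(1))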

  2. Comparison of radon doses based on different radon monitoring approaches.

    Science.gov (United States)

    Vaupotič, Janja; Smrekar, Nataša; Žunić, Zora S

    2017-04-01

    In 43 places (23 schools, 3 kindergartens, 16 offices and one dwelling), indoor radon has been monitored as an intercomparison experiment, using α-scintillation cells (SC - Jožef Stefan Institute, Slovenia), various kinds of solid state nuclear track detectors (KfK - Karlsruhe Institute of Technology, Germany; UFO - National Institute of Radiological Sciences, Chiba, Japan; RET - University College Dublin, Ireland) and active electronic devices (EQF, Sarad, Germany). At the same place, the radon levels and, consequently, the effective doses obtained with different radon devices differed substantially (by a factor of 2 or more), and no regularity was observed as regards which detector would show a higher or lower dose. Copyright © 2016 Elsevier Ltd. All rights reserved.
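
    The factor-of-two spread in effective dose reported above follows directly from the usual conversion chain, dose = concentration x equilibrium factor x occupancy time x dose conversion factor. The sketch below uses a commonly quoted dose conversion factor and occupancy, but both should be treated as assumptions rather than the coefficients used in the study.

        def radon_effective_dose_msv(concentration_bq_m3, hours, equilibrium_factor=0.4,
                                     dcf_msv_per_bqh_m3=9e-6):
            """Annual effective dose from indoor radon.

            dose = C (Bq/m^3) * F * t (h) * DCF, where DCF is the dose conversion
            factor per Bq h m^-3 of equilibrium-equivalent exposure (assumed value)."""
            eec_exposure = concentration_bq_m3 * equilibrium_factor * hours
            return eec_exposure * dcf_msv_per_bqh_m3

        # Same room measured by two detector types that disagree by a factor of two.
        for label, c in (("detector A", 120.0), ("detector B", 240.0)):
            dose = radon_effective_dose_msv(c, hours=2000)   # assumed occupancy of 2000 h/year
            print(f"{label}: {c:5.0f} Bq/m^3 -> {dose:.2f} mSv/year")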

  3. A Novel Approach for Evaluating Carbamate Mixtures for Dose Additivity

    Science.gov (United States)

    Two mathematical approaches were used to test the hypothesis of dose-addition for a binary and a seven-chemical mixture of N-methyl carbamates, toxicologically similar chemicals that inhibit cholinesterase (ChE). In the more novel approach, mixture data were not included in the ana...

  4. Effect of subchronic 2,3,7,8-tetrachlorodibenzo-p-dioxin exposure on immune system and target gene responses in mice: calculation of benchmark doses for CYP1A1 and CYP1A2 related enzyme activities

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, C. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Toxicology, Auf`m Hennekamp 50, D-40225 Duesseldorf (Germany); Donat, S. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Toxicology, Auf`m Hennekamp 50, D-40225 Duesseldorf (Germany); Doehr, O. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Toxicology, Auf`m Hennekamp 50, D-40225 Duesseldorf (Germany); Kremer, J. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Immunology Auf`m Hennekamp 50, D-40225 Duesseldorf (Germany); Esser, C. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Immunology Auf`m Hennekamp 50, D-40225 Duesseldorf (Germany); Roller, M. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Experimental Hygiene Auf`m Hennekamp 50, D-40225 Duesseldorf (Germany); Abel, J. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Toxicology, Auf`m Hennekamp 50, D-40225 Duesseldorf (Germany)

    1997-04-01

    The dose-effect relationships were analysed for several noncarcinogenic endpoints, such as immunological and biochemical responses at subchronic, low dose exposure of female C57BL/6 mice to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD). The animals were treated i.p. with TCDD according to the initial- and maintenance-dose principle for a period of 135 days. The initial doses were 1, 10 and 100 ng TCDD/kg, the weekly maintenance doses were 0.2, 2 and 20 ng TCDD/kg, respectively. At days 23, 79 and 135 of TCDD treatment 10 animals of each dose group were killed. As immunological parameters the number of thymocytes and the pattern of thymocyte subpopulations were determined. In liver, lung and thymus, mRNA expression of TGF-α, TGF-β1, TGF-β2, TGF-β3, TNF-α, IL-1β and different CYP1 isoforms (CYP1A1, CYP1A2, CYP1B1) was analysed. In the livers, activities of 7-ethoxyresorufin-O-deethylase (EROD) and 7-methoxyresorufin-O-demethylase (MROD) were measured. TCDD content in the liver was determined. The main results are summarized as follows: (1) The TCDD doses were not sufficient to elicit dose-dependent changes of pattern of thymocyte subpopulation. (2) TCDD failed to change the mRNA expression of TGF-α, TGF-β and TNF-α, but led to an increase of IL-1β mRNA expression in liver, lung and thymus. The results show that the TCDD induced IL-1β mRNA increase is at least as sensitive a marker as the induction of CYP1A isoforms. (3) The expression of CYP1B1 mRNA remained unchanged at the doses tested, while CYP1A1 and CYP1A2 mRNA expression was dose-dependently enhanced. EROD and MROD activities in the liver paralleled the increases of CYP1A1 and CYP1A2 mRNA expression. (4) Regression analysis of the data showed that most of the parameters tested fit a linear model. (5) From the data, a benchmark dose for EROD/MROD activities in the livers of female C57BL/6 mice of about 0.03 ng TCDD/kg per day was

  5. Bayesian penalized log-likelihood ratio approach for dose response clinical trial studies.

    Science.gov (United States)

    Tang, Yuanyuan; Cai, Chunyan; Sun, Liangrui; He, Jianghua

    2017-02-13

    In the literature, there are a few unified approaches to test proof of concept and estimate a target dose, including the multiple comparison procedure using a modeling approach, and the permutation approach proposed by Klingenberg. We discuss and compare the operating characteristics of these unified approaches and further develop an alternative approach in a Bayesian framework based on the posterior distribution of a penalized log-likelihood ratio test statistic. Our Bayesian approach is much more flexible in handling linear or nonlinear dose-response relationships and is more efficient than the permutation approach. The operating characteristics of our Bayesian approach are comparable to, and sometimes better than, both approaches in a wide range of dose-response relationships. It yields credible intervals as well as a predictive distribution for the response rate at a specific dose level for the target dose estimation. Our Bayesian approach can be easily extended to continuous, categorical, and time-to-event responses. We illustrate the performance of our proposed method with extensive simulations and Phase II clinical trial data examples.

  6. The OSIRIS Weight of Evidence approach: ITS for the endpoints repeated-dose toxicity (RepDose ITS).

    Science.gov (United States)

    Tluczkiewicz, Inga; Batke, Monika; Kroese, Dinant; Buist, Harrie; Aldenberg, Tom; Pauné, Eduard; Grimm, Helvi; Kühne, Ralph; Schüürmann, Gerrit; Mangelsdorf, Inge; Escher, Sylvia E

    2013-11-01

    In the FP6 European project OSIRIS, Integrated Testing Strategies (ITSs) for relevant toxicological endpoints were developed to avoid new animal testing and thus to reduce time and costs. The present paper describes the development of an ITS for repeated-dose toxicity, called RepDose ITS, which evaluates the conditions under which in vivo non-guideline studies are reliable. In a tiered approach, three aspects of these "non-guideline" studies are assessed: the documentation of the study (reliability), the quality of the study design (adequacy) and the scope of examination (validity). The reliability is addressed by the method "Knock-out criteria", which consists of four essential criteria for repeated-dose toxicity studies. A second tool, termed QUANTOS (Quality Assessment of Non-guideline Toxicity Studies), evaluates and weights the adequacy of the study by using intra-criterion and inter-criteria weighting. Finally, the Coverage approach calculates the probability that the detected Lowest-Observed-Effect-Level (LOEL) is similar to the LOEL of a guideline study, depending on the examined targets and organs of the non-guideline study. If the validity and adequacy of the non-guideline study are insufficient for risk assessment, the ITS proposes to apply a category approach or the Threshold of Toxicological Concern (TTC) concept, and only as a last resort new animal testing.

  7. Benchmarking in ICT

    OpenAIRE

    Blecher, Jan

    2009-01-01

    The aim of this paper is to describe the benefits of IT benchmarking in a wider context and the scope of benchmarking in general. I specify benchmarking as a process and mention basic rules and guidelines. Further, I define IT benchmarking domains and describe possibilities for their use. The best-known type of IT benchmark is the cost benchmark, which represents only a subset of benchmarking opportunities. In this paper, the cost benchmark is rather an imaginary first step towards benchmarking's contribution to the company. IT benchmark...

  8. Benchmarking of control strategies for ATAD technology: a first approach to the automatic control of sludge treatment systems.

    Science.gov (United States)

    Zambrano, J A; Gil-Martinez, M; Garcia-Sanz, M; Irizar, I

    2009-01-01

    Autothermal Thermophilic Aerobic Digestion (ATAD technology) is a promising alternative to conventional digestion systems. Aeration is a key factor in the performance of these kinds of reactors, in relation to effluent quality and operating costs. At present, the realisation of automatic control in ATADs is in its infancy. Additionally, the lack of robust sensors also makes the control of these processes difficult: only redox potential and temperature sensors are reliable for operation in full-scale plants. Based as it is on the existing simulation protocols for benchmarking of control strategies for wastewater treatment plants (WWTP), this paper presents the definition and implementation of a similar protocol but specifically adapted to the needs of ATAD technology. The implemented simulation protocol has been used to validate two different control strategies for aeration (ST1 and ST2). In comparison to an open-loop operation for the ATAD, simulation results showed that the ST1 strategy was able to save aeration costs of around 2-4%. Unlike ST1, ST2 achieved maximum sludge stabilisation but at the expense of higher aeration costs.

  9. DSP Platform Benchmarking : DSP Platform Benchmarking

    OpenAIRE

    Xinyuan, Luo

    2009-01-01

    This thesis benchmarks DSP kernel algorithms on a DSP processor used for teaching in the course TESA26 in the Department of Electrical Engineering. The benchmarking covers cycle count and memory usage. The goal of the thesis is to evaluate the quality of a single-MAC DSP instruction set and to provide suggestions for further improvement of the instruction set architecture accordingly. The scope of the thesis is limited to benchmarking the processor based on assembly coding only. The...

  10. Research on computer systems benchmarking

    Science.gov (United States)

    Smith, Alan Jay (Principal Investigator)

    1996-01-01

    This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance; the performance impact of optimization was examined in the context of our methodology for CPU performance characterization, based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the aforementioned accomplishments are summarized more specifically in this report, as are smaller efforts supported by this grant.

  11. Identification of the minimum effective dose for normally distributed data using a Bayesian variable selection approach.

    Science.gov (United States)

    Otava, Martin; Shkedy, Ziv; Hothorn, Ludwig A; Talloen, Willem; Gerhard, Daniel; Kasim, Adetayo

    2017-02-16

    The identification of the minimum effective dose is of high importance in the drug development process. In early-stage screening experiments, establishing the minimum effective dose can be translated into a model selection based on information criteria. The presented alternative, a Bayesian variable selection approach, allows for selection of the minimum effective dose while taking into account model uncertainty. The performance of Bayesian variable selection is compared with the generalized order-restricted information criterion on two dose-response experiments and through a simulation study. Which method performs better depends on the complexity of the underlying model and the effect size relative to noise.

  12. The analysis of dose-response curve from bioassays with quantal response: Deterministic or statistical approaches?

    Science.gov (United States)

    Mougabure-Cueto, G; Sfara, V

    2016-04-25

    Dose-response relations can be obtained from systems at any structural level of biological matter, from the molecular to the organismic level. There are two types of approaches for analyzing dose-response curves: a deterministic approach, based on the law of mass action, and a statistical approach, based on the assumed probability distribution of phenotypic characters. Models based on the law of mass action have been proposed to analyze dose-response relations across the entire range of biological systems. The purpose of this paper is to discuss the principles that determine dose-response relations. Dose-response curves of simple systems are the result of chemical interactions between reacting molecules, and therefore are supported by the law of mass action. In consequence, the shape of these curves is fully sustained by physicochemical features. However, dose-response curves of bioassays with quantal response are not explained by the simple collision of molecules but by phenotypic variations among individuals, and can be interpreted in terms of individual tolerances. The expression of tolerance is the result of many genetic and environmental factors and thus can be considered a random variable. In consequence, the shape of its associated dose-response curve has no physicochemical bearing; instead, it originates from random biological variation. Owing to the randomness of tolerance there is no reason to use deterministic equations for its analysis; on the contrary, statistical models are the appropriate tools for analyzing these dose-response relations.
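    Under this statistical view, a quantal bioassay is usually summarized by fitting a tolerance-distribution model such as the probit. A minimal sketch of such a fit follows; the doses, group sizes and response counts are hypothetical, and scipy is assumed to be available.

      # Illustrative probit fit to quantal dose-response data (not data from the paper).
      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      dose = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # mg/L, hypothetical
      n    = np.array([50, 50, 50, 50, 50])          # individuals tested per dose
      resp = np.array([4, 11, 26, 41, 48])           # responders per dose

      def neg_log_lik(params):
          a, b = params                              # intercept and slope on log10(dose)
          p = norm.cdf(a + b * np.log10(dose))       # probit link: tolerances assumed log-normal
          p = np.clip(p, 1e-9, 1 - 1e-9)
          return -np.sum(resp * np.log(p) + (n - resp) * np.log(1 - p))

      fit = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
      a, b = fit.x
      print("LD50 ~= %.2f mg/L" % 10 ** (-a / b))    # dose at which half the population responds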

  13. Case-mix adjustment approach to benchmarking prevalence rates of nosocomial infection in hospitals in Cyprus and Greece.

    Science.gov (United States)

    Kritsotakis, Evangelos I; Dimitriadis, Ioannis; Roumbelaki, Maria; Vounou, Emelia; Kontou, Maria; Papakyriakou, Panikos; Koliou-Mazeri, Maria; Varthalitis, Ioannis; Vrouchos, George; Troulakis, George; Gikas, Achilleas

    2008-08-01

    To examine the effect of heterogeneous case mix for a benchmarking analysis and interhospital comparison of the prevalence rates of nosocomial infection. Cross-sectional survey. Eleven hospitals located in Cyprus and in the region of Crete in Greece. The survey included all inpatients in the medical, surgical, pediatric, and gynecology-obstetrics wards, as well as those in intensive care units. Centers for Disease Control and Prevention criteria were used to define nosocomial infection. The information collected for all patients included demographic characteristics, primary admission diagnosis, Karnofsky functional status index, Charlson comorbidity index, McCabe-Jackson severity of illness classification, use of antibiotics, and prior exposures to medical and surgical risk factors. Outcome data were also recorded for all patients. Case mix-adjusted rates were calculated by using a multivariate logistic regression model for nosocomial infection risk and an indirect standardization method. The overall prevalence rate of nosocomial infection was 7.0% (95% confidence interval, 5.9%-8.3%) among 1,832 screened patients. Significant variation in nosocomial infection rates was observed across hospitals (range, 2.2%-9.6%). Logistic regression analysis indicated that the mean predicted risk of nosocomial infection across hospitals ranged from 3.7% to 10.3%, suggesting considerable variation in patient risk. Case mix-adjusted rates ranged from 2.6% to 12.4%, and the relative ranking of hospitals was affected by case-mix adjustment in 8 cases (72.8%). Nosocomial infection was significantly and independently associated with mortality (adjusted odds ratio, 3.6 [95% confidence interval, 2.1-6.1]). The first attempt to rank the risk of nosocomial infection in these regions demonstrated the importance of accounting for heterogeneous case mix before attempting interhospital comparisons.
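    The case-mix adjustment described here amounts to indirect standardization: each hospital's observed infection count is divided by the count expected from the patient-level risk model, and the ratio is scaled by the overall prevalence. A toy sketch, with hypothetical model-predicted risks rather than the study data:

      # Illustrative indirect standardization of hospital infection rates.
      import numpy as np

      overall_rate = 0.07                                  # pooled prevalence (e.g. 7%)
      predicted_risk = {                                   # per-patient risks from a fitted logistic model
          "hospital_A": np.array([0.03, 0.05, 0.12, 0.08]),
          "hospital_B": np.array([0.02, 0.04, 0.06, 0.03]),
      }
      observed_cases = {"hospital_A": 1, "hospital_B": 0}

      for hosp, risks in predicted_risk.items():
          expected = risks.sum()                           # expected infections given the case mix
          smr = observed_cases[hosp] / expected            # standardized morbidity ratio
          adjusted = smr * overall_rate                    # indirectly standardized rate
          print(hosp, round(adjusted, 3))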

  14. New approach for food allergy management using low-dose oral food challenges and low-dose oral immunotherapies.

    Science.gov (United States)

    Yanagida, Noriyuki; Okada, Yu; Sato, Sakura; Ebisawa, Motohiro

    2016-04-01

    A number of studies have suggested that a large subset of children (approximately 70%) who react to unheated milk or egg can tolerate extensively heated forms of these foods. A diet that includes baked milk or egg is well tolerated and appears to accelerate the development of regular milk or egg tolerance when compared with strict avoidance. However, the indications for an oral food challenge (OFC) using baked products are limited for patients with high specific IgE values or large skin prick test diameters. Oral immunotherapies (OITs) are becoming increasingly popular for the management of food allergies. However, the reported efficacy of OIT is not satisfactory, given the high frequency of symptoms and requirement for long-term therapy. With food allergies, removing the need to eliminate a food that could be consumed in low doses could significantly improve quality of life. This review discusses the importance of an OFC and OIT that use low doses of causative foods as the target volumes. Utilizing an OFC or OIT with a low dose as the target volume could be a novel approach for accelerating the tolerance to causative foods.

  15. Benchmarking the sustainability performance of the Brazilian non-GM and GM soybean meal chains: An indicator-based approach

    NARCIS (Netherlands)

    Gaitan Cremaschi, D.; Pashaei Kamali, F.; Evert, van F.K.; Meuwissen, M.P.M.; Oude Lansink, A.G.J.M.

    2015-01-01

    A commonly accepted approach for measuring the sustainability of agricultural products is the first step toward treating traded products differentially according to their sustainability. If we were able to measure sustainability, business stakeholders could optimize food production chains, consumers

  17. [Benchmarking in health care: conclusions and recommendations].

    Science.gov (United States)

    Geraedts, Max; Selbmann, Hans-Konrad

    2011-01-01

    The German Health Ministry funded 10 demonstration projects and accompanying research of benchmarking in health care. The accompanying research work aimed to infer generalisable findings and recommendations. We performed a meta-evaluation of the demonstration projects and analysed national and international approaches to benchmarking in health care. It was found that the typical benchmarking sequence is hardly ever realised. Most projects lack a detailed analysis of structures and processes of the best performers as a starting point for the process of learning from and adopting best practice. To tap the full potential of benchmarking in health care, participation in voluntary benchmarking projects should be promoted that have been demonstrated to follow all the typical steps of a benchmarking process.

  18. Aquatic Life Benchmarks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...

  19. Benchmarking a DSP processor

    OpenAIRE

    Lennartsson, Per; Nordlander, Lars

    2002-01-01

    This Master's thesis describes the benchmarking of a DSP processor. Benchmarking means measuring the performance in some way. In this report, we have focused on the number of instruction cycles needed to execute certain algorithms. The algorithms we have used in the benchmark are all very common in signal processing today. The results we have reached in this thesis have been compared to benchmarks for other processors, performed by Berkeley Design Technology, Inc. The algorithms were programm...

  20. Benchmarking Learning and Teaching: Developing a Method

    Science.gov (United States)

    Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah

    2006-01-01

    Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…

  1. Radiation Detection Computational Benchmark Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing the operational performance of radiation detection systems. This can, however, result in large and complex scenarios which are time-consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations was assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for

  2. Update on the Code Intercomparison and Benchmark for Muon Fluence and Absorbed Dose Induced by an 18 GeV Electron Beam After Massive Iron Shielding

    Energy Technology Data Exchange (ETDEWEB)

    Fasso, A. [SLAC]; Ferrari, A. [CERN]; Ferrari, A. [HZDR, Dresden]; Mokhov, N. V. [Fermilab]; Mueller, S. E. [HZDR, Dresden]; Nelson, W. R. [SLAC]; Roesler, S. [CERN]; Sanami, T.; Striganov, S. I. [Fermilab]; Versaci, R. [Unlisted, CZ]

    2016-12-01

    In 1974, Nelson, Kase and Svensson published an experimental investigation on muon shielding around SLAC high-energy electron accelerators [1]. They measured muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beamdump and attenuated in a thick steel shielding. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes, and with the SLAC data.

  3. Code intercomparison and benchmark for muon fluence and absorbed dose induced by an 18-GeV electron beam after massive iron shielding

    CERN Document Server

    Fasso, Alberto; Ferrari, Anna; Mokhov, Nikolai V; Mueller, Stefan E; Nelson, Walter Ralph; Roesler, Stefan; Sanami, Toshiya; Striganov, Sergei I; Versaci, Roberto

    2015-01-01

    In 1974, Nelson, Kase, and Svensson published an experimental investigation on muon shielding using the SLAC high-energy LINAC. They measured muon fluence and absorbed dose induced by an 18 GeV electron beam hitting a copper/water beam dump and attenuated in a thick steel shielding. In their paper, they compared the results with the theoretical models available at the time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results will then be compared between the codes, and with the SLAC data.

  4. The COST Benchmark

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Tiesyte, Dalia; Tradisauskas, Nerius

    2006-01-01

    …, and more are underway. As a result, there is an increasing need for an independent benchmark for spatio-temporal indexes. This paper characterizes the spatio-temporal indexing problem and proposes a benchmark for the performance evaluation and comparison of spatio-temporal indexes. Notably, the benchmark...

  5. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  6. Dynamic behaviour of a planar micro-beam loaded by a fluid-gap: Analytical and numerical approach in a high frequency range, benchmark solutions

    Science.gov (United States)

    Novak, A.; Honzik, P.; Bruneau, M.

    2017-08-01

    Miniaturized vibrating MEMS devices, active (receivers or emitters) or passive devices, and their use for either new applications (hearing, meta-materials, consumer devices,…) or metrological purposes under non-standard conditions, are involved today in several acoustic domains. More in-depth characterisations than the classical ones available until now are needed. In this context, the paper presents analytical and numerical approaches for describing the behaviour of three kinds of planar micro-beams of rectangular shape (suspended rigid or clamped elastic planar beams) loaded by a backing cavity or a fluid-gap, surrounded by very thin slits, and excited by an incident acoustic field. The analytical approach accounts for the coupling between the vibrating structure and the acoustic field in the backing cavity, the thermal and viscous diffusion processes in the boundary layers in the slits and the cavity, the modal behaviour of the vibrating structure, and the non-uniformity of the acoustic field in the backing cavity, which is modelled using an integral formulation with a suitable Green's function. Benchmark solutions are proposed in terms of beam motion (from which the sensitivity, input impedance, and pressure transfer function can be calculated). A numerical implementation (FEM) is handled against which the analytical results are tested.

  7. A four-step approach to evaluate mixtures for consistency with dose addition.

    Science.gov (United States)

    Hertzberg, Richard C; Pan, Yi; Li, Ruosha; Haber, Lynne T; Lyles, Robert H; Herr, David W; Moser, Virginia C; Simmons, Jane Ellen

    2013-11-16

    Mixture risk assessment is often hampered by the lack of dose-response information on the mixture being assessed, forcing reliance on component formulas such as dose addition. We present a four-step approach for evaluating chemical mixture data for consistency with dose addition for use in supporting a component based mixture risk assessment. Following the concepts in the U.S. EPA mixture risk guidance (U.S. EPA, 2000a,b), toxicological interaction for a defined mixture (all components known) is departure from a clearly articulated definition of component additivity. For the common approach of dose additivity, the EPA guidance identifies three desirable characteristics, foremost of which is that the component chemicals are toxicologically similar. The other two characteristics are empirical: the mixture components have toxic potencies that are fixed proportions of each other (throughout the dose range of interest), and the mixture dose term in the dose additive prediction formula, which we call the combined prediction model (CPM), can be represented by a linear combination of the component doses. A consequent property of the proportional toxic potencies is that the component chemicals must share a common dose-response model, where only the dose coefficients depend on the chemical components. A further consequence is that the mixture data must be described by the same mathematical function ("mixture model") as the components, but with a distinct coefficient for the total mixture dose. The mixture response is predicted from the component dose-response curves by using the dose additive CPM and the prediction is then compared with the observed mixture results. The four steps are to evaluate: (1) toxic proportionality by determining how well the CPM matches the single chemical models regarding mean and variance; (2) fit of the mixture model to the mixture data; (3) agreement between the mixture data and the CPM prediction; and (4) consistency between the CPM and the
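    Under dose addition with fixed relative potencies, the mixture response can be predicted by converting each component dose into equivalents of an index chemical and evaluating the shared dose-response model at the summed equivalent dose. The sketch below is a generic illustration of that combined prediction; the Hill-type model form, potencies and doses are invented and are not taken from the paper.

      # Illustrative dose-additive combined prediction for a two-chemical mixture.
      def shared_model(d, top=100.0, ed50=10.0, slope=1.5):
          """Common dose-response shape assumed for all similar-acting components."""
          return top * d**slope / (ed50**slope + d**slope)

      relative_potency = {"chem1": 1.0, "chem2": 0.25}   # chem2 assumed 4x less potent than the index chemical

      def mixture_prediction(doses):
          # express each component dose in index-chemical equivalents and sum them
          equivalent_dose = sum(relative_potency[c] * d for c, d in doses.items())
          return shared_model(equivalent_dose)

      # predicted mixture response for 5 units of chem1 plus 20 units of chem2
      print(mixture_prediction({"chem1": 5.0, "chem2": 20.0}))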

  8. Erosion of a confined stratified layer by a vertical jet – Detailed assessment of a CFD approach against the OECD/NEA PSI benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Kelm, S., E-mail: s.kelm@fz-juelich.de [Forschungszentrum Jülich GmbH, 52425 Jülich (Germany)]; Kapulla, R., E-mail: ralf.kapulla@psi.ch [Paul Scherrer Institute, 5232 Villigen PSI (Switzerland)]; Allelein, H.-J., E-mail: allelein@lrst.rwth-aachen.de [Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); RWTH Aachen University, 52080 Aachen (Germany)]

    2017-02-15

    Highlights: • Systematic assessment of a U-RANS approach capable of being applied at containment scale. • Validation against measured and derived point-wise and field data. • Validation by means of transported quantities (concentration) but also the underlying flow field and turbulent kinetic energy. • The U-RANS approach yields overall consistent and plausible results. • But an unexpected difference between SST and k–ε is identified for the free-stream flow. - Abstract: Recently, a blind CFD benchmark exercise was conducted by the OECD/NEA (2013–2014) based on an experiment in the PANDA facility at the Paul Scherrer Institute (PSI) in Switzerland, investigating the turbulent erosion of a stratified helium-rich layer in the upper region of the test vessel by means of a vertical air-helium jet impinging from below. In addition to the ‘classical’ pointwise measurements available for similar experiments conducted in the past, significant additional effort was spent on the experimental characterization of the underlying flow field and turbulent quantities by means of particle image velocimetry (PIV) for the benchmark. These data are well suited for a detailed assessment of the driving jet flow and its interaction with the stratified layer. Both are essential in order to avoid the cancellation of different errors, which is possible if validation is performed in a global manner. Different impacts on the simulation results, in particular on the jet profile and on the mixing progress, are discussed in this paper. A systematic validation is carried out based on measured and derived quantities. It is identified that e.g. the mesh resolution in the jet and mixing zone has only a minor impact, while small changes in the turbulence modeling strategy or the chosen model constants, like Sc{sub t}, significantly affect the simulation results. Finally, the chosen unsteady RANS model represents the mixing process consistently in the transient progression and instantaneous flow variables, while an unexpected

  9. Implementing an Accurate and Rapid Sparse Sampling Approach for Low-Dose Atomic Resolution STEM Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kovarik, Libor; Stevens, Andrew J.; Liyu, Andrey V.; Browning, Nigel D.

    2016-10-17

    Aberration correction for scanning transmission electron microscopes (STEM) has dramatically increased spatial image resolution for beam-stable materials, but it is the sample stability rather than the microscope that often limits the practical resolution of STEM images. To extract physical information from images of beam-sensitive materials it is becoming clear that there is a critical dose/dose-rate below which the images can be interpreted as representative of the pristine material, while above it the observation is dominated by beam effects. Here we describe an experimental approach for sparse sampling in the STEM and in-painting image reconstruction in order to reduce the electron dose/dose-rate to the sample during imaging. By characterizing the induction-limited rise time and hysteresis in the scan coils, we show that a sparse line-hopping approach to scan randomization can be implemented that optimizes both the speed of the scan and the amount of the sample that needs to be illuminated by the beam. The dose and acquisition time for the sparse sampling are shown to be effectively decreased by a factor of 5 relative to conventional acquisition, permitting imaging of beam-sensitive materials without changing the microscope operating parameters. The use of the sparse line-hopping scan to acquire STEM images is demonstrated with atomic resolution aberration-corrected Z-contrast images of CaCO3, a material that is traditionally difficult to image by TEM/STEM because of dose issues.
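    The dose saving comes from visiting only a fraction of the probe positions on each scan line and reconstructing the unvisited pixels afterwards. A toy sketch of such a sampling mask is given below; the frame size and sampling fraction are arbitrary choices, and the actual reconstruction in the paper relies on in-painting rather than the simple bookkeeping shown here.

      # Illustrative sparse "line-hopping" sampling mask covering ~20% of the pixels.
      import numpy as np

      rng = np.random.default_rng(1)
      ny, nx, fraction = 256, 256, 0.2
      mask = np.zeros((ny, nx), dtype=bool)
      for row in range(ny):
          # on each scan line, dwell only on a random subset of positions
          cols = rng.choice(nx, size=int(fraction * nx), replace=False)
          mask[row, cols] = True

      specimen = rng.random((ny, nx))              # stand-in for the true specimen signal
      measured = np.where(mask, specimen, np.nan)  # unvisited pixels left for in-painting
      print("dose reduction factor ~= %.1fx" % (1.0 / fraction))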

  10. Transcriptional profiling of the dose response: a more powerful approach for characterizing drug activities.

    Directory of Open Access Journals (Sweden)

    Rui-Ru Ji

    2009-09-01

    The dose response curve is the gold standard for measuring the effect of a drug treatment, but is rarely used in genomic-scale transcriptional profiling due to perceived obstacles of cost and analysis. One barrier to examining transcriptional dose responses is that existing methods for microarray data analysis can identify patterns, but provide no quantitative pharmacological information. We developed analytical methods that identify transcripts responsive to dose, calculate classical pharmacological parameters such as the EC50, and enable an in-depth analysis of coordinated dose-dependent treatment effects. The approach was applied to a transcriptional profiling study that evaluated four kinase inhibitors (imatinib, nilotinib, dasatinib and PD0325901) across a six-logarithm dose range, using 12 arrays per compound. The transcript responses proved a powerful means to characterize and compare the compounds: the distribution of EC50 values for the transcriptome was linked to specific targets, dose-dependent effects on cellular processes were identified using automated pathway analysis, and a connection was seen between EC50s in standard cellular assays and transcriptional EC50s. Our approach greatly enriches the information that can be obtained from standard transcriptional profiling technology. Moreover, these methods are automated, robust to non-optimized assays, and could be applied to other sources of quantitative data.
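    Per-transcript EC50s of this kind are typically obtained by fitting a four-parameter logistic curve to expression versus log dose. A minimal sketch with simulated data follows; the parameter values are invented and scipy is assumed to be available.

      # Illustrative four-parameter logistic fit yielding a transcriptional EC50.
      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(log_dose, bottom, top, log_ec50, hill):
          return bottom + (top - bottom) / (1.0 + 10 ** ((log_ec50 - log_dose) * hill))

      log_dose = np.linspace(-9, -3, 12)              # molar, spanning a six-log dose range
      signal = four_pl(log_dose, 1.0, 8.0, -6.0, 1.2)
      signal += np.random.default_rng(0).normal(0, 0.2, signal.size)   # assay noise

      params, _ = curve_fit(four_pl, log_dose, signal, p0=[1.0, 8.0, -6.0, 1.0])
      print("EC50 ~= %.2e M" % 10 ** params[2])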

  11. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  12. A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics.

    Directory of Open Access Journals (Sweden)

    Jorge Duconge

    This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in the ideal dose, as compared with only 29% when using the clinical non-genetic algorithm (p<0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Results supported our rationale to incorporate individuals' genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. ClinicalTrials.gov NCT01318057.
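    Dosing models of this type are usually ordinary least-squares regressions of stable dose on genotype indicators, ancestry proportions and clinical covariates. The sketch below is purely schematic: the predictor set, coefficients and dose values are invented and do not reproduce the published algorithm.

      # Illustrative genotype-plus-admixture dosing regression (hypothetical data).
      import numpy as np

      # columns: intercept, age, CYP2C9 variant alleles, VKORC1 -1639 A alleles, African ancestry fraction
      X = np.array([[1, 65, 0, 1, 0.20],
                    [1, 48, 1, 2, 0.05],
                    [1, 72, 0, 0, 0.35],
                    [1, 55, 2, 1, 0.10]], dtype=float)
      y = np.array([5.0, 2.5, 7.0, 3.0])            # stable daily doses in mg (invented)

      beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least-squares fit
      new_patient = np.array([1, 60, 1, 1, 0.15])
      print("predicted dose ~= %.1f mg/day" % (new_patient @ beta))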

  13. A mathematical approach to optimizing the radiation dose distribution in heterogeneous tumours

    Energy Technology Data Exchange (ETDEWEB)

    Stavreva, N.A. [Waikato Univ., Hamilton (New Zealand). Dept. of Physics]; Stavrev, P.V. [Waikato Univ., Hamilton (New Zealand). Dept. of Physics]; Round, W.H. [Waikato Univ., Hamilton (New Zealand). Dept. of Physics]

    1996-12-31

    This paper offers a general mathematical approach to dose distribution optimization which allows tumours with different degrees of complexity to be considered. Two different biological criteria are studied: (A) keeping the control probability of the different parts of the tumour (local tumour control probability) uniform throughout the tumour, and (B) minimizing the mean dose delivered to the tumour. For both criteria we impose the requirement that the whole-tumour control probability be kept at a certain desired level. It is proved that the adoption of the first criterion requires a dose distribution logarithmic in the cell density and proportional to the inverse of the cell radiosensitivity, while the adoption of the second criterion necessitates a homogeneous dose distribution when the cell radiosensitivity is constant. The corresponding formula for the dose distribution in the case of heterogeneous cell radiosensitivity is also given. The two criteria are compared in terms of local tumour control probability and mean dose delivered to the tumour. It is concluded that maintaining constant local tumour control probability (criterion A) may be of greater clinical importance than minimizing the mean dose (criterion B). (orig.).
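    The logarithmic form under criterion A can be motivated with a simple Poisson cell-kill argument; the derivation below is a sketch assuming single-hit exponential cell survival with radiosensitivity \alpha(x) and clonogen density n(x), which is not necessarily the exact model used in the paper.

      P_{\mathrm{loc}}(x) = \exp\!\left[-\,n(x)\, e^{-\alpha(x) D(x)}\right] \quad \text{(local control in the voxel at } x\text{)}

      Requiring P_{\mathrm{loc}}(x) = P_0 for all x gives n(x)\, e^{-\alpha(x) D(x)} = -\ln P_0, and hence

      D(x) = \frac{1}{\alpha(x)} \ln\!\left(\frac{n(x)}{-\ln P_0}\right),

      i.e. a dose logarithmic in the cell density and inversely proportional to the radiosensitivity, as stated above.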

  14. Ab initio molecular dynamics with noisy forces: Validating the quantum Monte Carlo approach with benchmark calculations of molecular vibrational properties

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Ye, E-mail: xw111luoye@gmail.com; Sorella, Sandro, E-mail: sorella@sissa.it [International School for Advanced Studies (SISSA), and CRS Democritos, CNR-INFM, Via Bonomea 265, I-34136 Trieste (Italy)]; Zen, Andrea, E-mail: zen.andrea.x@gmail.com [Dipartimento di Fisica, Università di Roma “La Sapienza,” Piazzale Aldo Moro 2, I-00185 Rome (Italy)]

    2014-11-21

    We present a systematic study of a recently developed ab initio simulation scheme based on molecular dynamics and quantum Monte Carlo. In this approach, a damped Langevin molecular dynamics is employed by using a statistical evaluation of the forces acting on each atom by means of quantum Monte Carlo. This allows the use of a highly correlated wave function parametrized by several variational parameters and describing quite accurately the Born-Oppenheimer energy surface, as long as these parameters are determined at the minimum energy condition. However, in a statistical method both the minimization method and the evaluation of the atomic forces are affected by the statistical noise. In this work, we study systematically the accuracy and reliability of this scheme by targeting the vibrational frequencies of simple molecules such as the water monomer, hydrogen sulfide, sulfur dioxide, ammonia, and phosphine. We show that all sources of systematic errors can be controlled and reliable frequencies can be obtained with a reasonable computational effort. This work provides convincing evidence that this molecular dynamics scheme can be safely applied also to realistic systems containing several atoms.

  15. Taking Stock of Corporate Benchmarking Practices: Panacea or Pandora's Box?

    Science.gov (United States)

    Fleisher, Craig S.; Burton, Sara

    1995-01-01

    Discusses why corporate communications/public relations (cc/pr) should be benchmarked (an approach used by cc/pr managers to demonstrate the value of their activities to skeptical organizational executives). Discusses myths about cc/pr benchmarking; types, targets, and focus of cc/pr benchmarking; a process model; and critical decisions about…

  16. Is there a uniform approach to the management of diffuse parenchymal lung disease (DPLD) in the UK? A national benchmarking exercise

    Directory of Open Access Journals (Sweden)

    Partridge Martyn R

    2007-03-01

    Background: Benchmarking is the comparison of a process to the work or results of others. We conducted a national benchmarking exercise to determine how UK pulmonologists manage common clinical scenarios in diffuse parenchymal lung disease (DPLD), and to determine current use and availability of investigative resources. We compared management decisions to existing international guidelines. Methods: Consultant members of the British Thoracic Society were mailed a questionnaire seeking their views on the management of three common scenarios in DPLD. They were asked to choose from various management options for each case. Information was also obtained from the respondents on time served as a consultant, type of institution in which they worked and the availability of a local radiologist and histopathologist with an interest/expertise in thoracic medicine. Results: 370 out of 689 consultants replied (54% response rate). There were many differences in the approach to the management of all three cases. Given a scenario of relapsing pulmonary sarcoidosis in a lady with multiple co-morbidities, half of respondents would institute treatment with a variety of immunosuppressants while the other half would simply observe. 42% would refer a 57-year-old lady with new-onset DPLD for a surgical lung biopsy, while a similar number would not. 80% would have referred her for transplantation, but a fifth would not. 50% of consultants from district general hospitals would have opted for a surgical biopsy compared to 24% from cardiothoracic centres: this may reflect greater availability of a radiologist with a special interest in thoracic imaging in cardiothoracic centres, obviating the need for tissue diagnosis. Faced with an elderly male with high-resolution CT thorax (HRCT) evidence of usual interstitial pneumonia (UIP), three-quarters would observe, while a quarter would start immunosuppressants. 11% would refer for a surgical biopsy. 14% of UK pulmonologists responding

  17. Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators

    OpenAIRE

    Zaharchenko Lolita A.; Kolesnyk Oksana A.

    2013-01-01

    The article analyses evolution of development and possibilities of application of benchmarking in the telecommunication sphere. It studies essence of benchmarking on the basis of generalisation of approaches of different scientists to definition of this notion. In order to improve activity of telecommunication operators, the article identifies the benchmarking technology and main factors, that determine success of the operator in the modern market economy, and the mechanism of benchmarking an...

  18. The Conic Benchmark Format

    DEFF Research Database (Denmark)

    Friberg, Henrik A.

    This document constitutes the technical reference manual of the Conic Benchmark Format with file extension .cbf or .CBF. It unifies linear, second-order cone (also known as conic quadratic) and semidefinite optimization with mixed-integer variables. The format has been designed with benchmark libraries ... in mind, and therefore focuses on compact and easily parsable representations. The problem structure is separated from the problem data, and the format moreover facilitates benchmarking of hotstart capability through sequences of changes....

  19. Benchmarking monthly homogenization algorithms

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets, modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real-world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve
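    Of the metrics listed, the centered root mean square error compares anomalies after removing each series' own mean, so a constant offset between the homogenized and the true series is not penalized. A minimal sketch of that calculation with arbitrary example values follows.

      # Illustrative centered RMSE between a homogenized series and the true series.
      import numpy as np

      truth = np.array([10.1, 10.3, 9.8, 10.6, 10.2, 9.9])    # true homogeneous values
      homog = np.array([10.4, 10.5, 10.0, 10.9, 10.4, 10.1])  # homogenized estimate

      centered_diff = (homog - homog.mean()) - (truth - truth.mean())
      crmse = np.sqrt(np.mean(centered_diff ** 2))
      print("centered RMSE = %.3f" % crmse)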

  20. An adaptive model switching approach for phase I dose-finding trials.

    Science.gov (United States)

    Daimon, Takashi; Zohar, Sarah

    2013-01-01

    Model-based phase I dose-finding designs rely on a single model throughout the study for estimating the maximum tolerated dose (MTD). Thus, one major concern is the choice of the most suitable model to be used. This is important because the dose allocation process and the MTD estimation depend on whether or not the model is reliable, or whether or not it gives a better fit to toxicity data. The aim of our work was to propose a method that would remove the need for a model choice prior to the trial onset and instead allow it sequentially at each patient's inclusion. In this paper, we described a model-checking approach based on the posterior predictive check and a model-comparison approach based on the deviance information criterion, in order to identify a more reliable or better model during the course of a trial and to support clinical decision making. Further, we presented two model-switching designs for a phase I cancer trial that were based on the aforementioned approaches, and performed a comparison between designs with or without model switching, through a simulation study. The results showed that the proposed designs had the advantage of decreasing certain risks, such as those of poor dose allocation and failure to find the MTD, which could occur if the model is misspecified. Copyright © 2013 John Wiley & Sons, Ltd.

  1. Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators

    Directory of Open Access Journals (Sweden)

    Zaharchenko Lolita A.

    2013-12-01

    The article analyses the evolution and possibilities of application of benchmarking in the telecommunication sphere. It studies the essence of benchmarking on the basis of a generalisation of different scientists' approaches to the definition of this notion. In order to improve the activity of telecommunication operators, the article identifies the benchmarking technology, the main factors that determine the success of the operator in the modern market economy, and the mechanism of benchmarking and the component stages of carrying out benchmarking by a telecommunication operator. It analyses the telecommunication market and identifies the dynamics of its development and tendencies of change in the composition of telecommunication operators and providers. Having generalised the existing experience of benchmarking application, the article identifies the main types of benchmarking of telecommunication operators by the following features: by the level of conduct (branch, inter-branch and international benchmarking); by relation to participation in the conduct (competitive and joint); and with respect to the enterprise environment (internal and external).

  2. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...

  3. Thermal Performance Benchmarking (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Moreno, G.

    2014-11-01

    This project will benchmark the thermal characteristics of automotive power electronics and electric motor thermal management systems. Recent vehicle systems will be benchmarked to establish baseline metrics, evaluate advantages and disadvantages of different thermal management systems, and identify areas of improvement to advance the state-of-the-art.

  4. Handleiding benchmark VO

    NARCIS (Netherlands)

    Blank, J.L.T.

    2008-01-01

    Handleiding benchmark VO (Guide to the secondary education benchmark), 25 November 2008, by IPSE Studies, J.L.T. Blank. A guide to reading the...

  5. Benchmark af erhvervsuddannelserne

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Wittrup, Jesper

    In this working paper we discuss how the Danish vocational schools can be benchmarked, and we present the results of a number of calculation models. It is conceptually complicated to benchmark the vocational schools. The schools offer a wide range of different programmes, which makes it difficult...

  6. Benchmarking af kommunernes sagsbehandling

    DEFF Research Database (Denmark)

    Amilon, Anna

    From 2007, Ankestyrelsen (the Danish National Social Appeals Board) is to carry out benchmarking of the quality of the municipalities' casework. The purpose of the benchmarking is to develop the design of the practice reviews with a view to better follow-up, and to improve the municipalities' casework. This working paper discusses methods for benchmarking...

  7. How Activists Use Benchmarks

    DEFF Research Database (Denmark)

    Seabrooke, Leonard; Wigan, Duncan

    2015-01-01

    Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views...... are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested...... interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained....

  8. A novel approach to pharmacodynamic assessment of antimicrobial agents: new insights to dosing regimen design.

    Directory of Open Access Journals (Sweden)

    Vincent H Tam

    Pharmacodynamic modeling has been increasingly used as a decision support tool to guide dosing regimen selection, both in drug development and in clinical settings. Killing by antimicrobial agents has traditionally been classified categorically as concentration-dependent (which would favor less fractionated regimens) or time-dependent (for which more frequent dosing is preferred). While intuitive and useful to explain empiric data, a more informative approach is necessary to provide a robust assessment of pharmacodynamic profiles in situations other than the extremes of the spectrum (e.g., agents which exhibit partial concentration-dependent killing). A quantitative approach to describe the interaction of an antimicrobial agent and a pathogen is proposed to fill this unmet need. A hypothetical antimicrobial agent with linear pharmacokinetics is used for illustrative purposes. A non-linear functional form (sigmoid Emax) of killing consisting of 3 parameters is used. Using different parameter values in conjunction with the relative growth rate of the pathogen and antimicrobial agent concentration ranges, various conventional pharmacodynamic surrogate indices (e.g., AUC/MIC, Cmax/MIC, %T>MIC) could be satisfactorily linked to outcomes. In addition, the dosing intensity, represented by the average kill rate of a dosing regimen, can be derived and used for quantitative comparison. The relevance of our approach is further supported by experimental data from our previous investigations using a variety of gram-negative bacteria and antimicrobial agents (moxifloxacin, levofloxacin, gentamicin, amikacin and meropenem). The pharmacodynamic profiles of a wide range of antimicrobial agents can be assessed by a more flexible computational tool to support dosing selection.
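    The dosing-intensity idea can be illustrated by averaging a sigmoid-Emax kill rate over the concentration-time profile of a regimen. The sketch below assumes one-compartment linear pharmacokinetics at steady state and invented PD parameters; it is not the model published in the paper.

      # Illustrative average kill rate of a q8h regimen under a sigmoid-Emax PD model.
      import numpy as np

      def concentration(t, dose=500.0, tau=8.0, vd=20.0, ke=0.3):
          """Steady-state one-compartment concentration (mg/L); tau is the dosing interval in h."""
          return (dose / vd) * np.exp(-ke * (t % tau)) / (1.0 - np.exp(-ke * tau))

      def kill_rate(c, kmax=2.0, ec50=4.0, hill=1.5):
          """Sigmoid-Emax kill rate (1/h) as a function of drug concentration."""
          return kmax * c**hill / (ec50**hill + c**hill)

      t = np.linspace(0.0, 24.0, 2401)                       # uniform grid over one day
      avg_kill = kill_rate(concentration(t)).mean()          # time-averaged kill rate
      print("average kill rate over 24 h ~= %.2f 1/h" % avg_kill)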

  9. IMRT dose fractionation for head and neck cancer: Variation in current approaches will make standardisation difficult

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Kean F. (Academic Dept. of Radiation Oncology, Univ. of Manchester, Manchester (United Kingdom)); Fowler, Jack F. (Dept. of Human Oncology and Medical Physics, Univ. of Wisconsin, Wisconsin (United States)); Sykes, Andrew J.; Yap, Beng K.; Lee, Lip W.; Slevin, Nick J. (Dept. of Clinical Oncology, Christie Hospital NHS Foundation Trust, Manchester (United Kingdom))

    2009-04-15

    Introduction. Altered fractionation has demonstrated clinical benefits compared to the conventional 2 Gy/day standard of 70 Gy. When using synchronous chemotherapy, there is uncertainty about optimum fractionation. IMRT with its potential for Simultaneous Integrated Boost (SIB) adds further to this uncertainty. This survey will examine international practice of IMRT fractionation and suggest possible reasons for diversity in approach. Material and methods. Fourteen international cancer centres were surveyed for IMRT dose/fractionation practised in each centre. Results. Twelve different types of dose fractionation were reported. Conventional 70-72 Gy (daily 2 Gy/fraction) was used in 3/14 centres with concurrent chemotherapy while 11/14 centres used altered fractionation. Two centres used >1 schedule. Reported schedules and number of centres included 6 fractions/week DAHANCA regime (3), modest hypofractionation (≤2.2 Gy/fraction) (3), dose-escalated hypofractionation (≥2.3 Gy/fraction) (4), hyperfractionation (1), continuous acceleration (1) and concomitant boost (1). Reasons for dose fractionation variability include (i) dose escalation; (ii) total irradiated volume; (iii) number of target volumes; (iv) synchronous systemic treatment; (v) shorter overall treatment time; (vi) resources availability; (vii) longer time on treatment couch; (viii) variable GTV margins; (ix) confidence in treatment setup; (x) late tissue toxicity and (xi) use of lower neck anterior fields. Conclusions. This variability in IMRT fractionation makes any meaningful comparison of treatment results difficult. Some standardization is needed particularly for design of multi-centre randomized clinical trials.
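    When schedules with different doses per fraction are compared, the biologically effective dose (BED) provides a common scale. The worked example below is illustrative only, assuming an α/β ratio of 10 Gy for tumour and ignoring overall-treatment-time effects.

      \mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right)

      70\ \mathrm{Gy\ in\ 35\ fractions\ of\ 2\ Gy:}\quad \mathrm{BED} = 35 \times 2 \times (1 + 2/10) = 84\ \mathrm{Gy_{10}}

      66\ \mathrm{Gy\ in\ 30\ fractions\ of\ 2.2\ Gy:}\quad \mathrm{BED} = 30 \times 2.2 \times (1 + 2.2/10) \approx 80.5\ \mathrm{Gy_{10}}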

  10. Experimental validation of the filtering approach for dose monitoring in proton therapy at low energy.

    Science.gov (United States)

    Attanasi, F; Belcari, N; Camarda, M; Del Guerra, A; Moehrs, S; Rosso, V; Vecchio, S; Lanconelli, N; Cirrone, G A P; Di Rosa, F; Russo, G

    2008-06-01

    The higher physical selectivity of proton therapy demands higher accuracy in monitoring of the delivered dose, especially when the target volume is located next to critical organs and a fractionated therapy is applied. A method to verify a treatment plan and to ensure the high quality of hadrontherapy is to use Positron Emission Tomography (PET), which takes advantage of the nuclear reactions between protons and nuclei in the tissue during irradiation producing beta(+)-emitting isotopes. Unfortunately, the PET image is not directly proportional to the delivered radiation dose distribution; this is the reason why, at the present time, the verification of depth dose profiles with PET techniques is limited to a comparison between the measured activity and the one predicted for the planned treatment by a Monte Carlo model. In this paper we test the feasibility of a different scheme, which permits reconstruction of the expected PET signal from the planned radiation dose distribution along the beam direction in a simpler and more direct way. The considered filter model, based on the description of the PET image as a convolution of the dose distribution with a filter function, has already demonstrated its potential applicability to beam energies above 70 MeV. Our experimental investigation provides support to the possibility of extending the same approach to the lower energy range ([40, 70] MeV), with a view to its clinical application in eye proton therapy.
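    The filtering idea itself is simple to sketch: the expected depth profile of β+ activity is the depth-dose curve convolved with a precomputed filter kernel. The one-dimensional example below uses a toy dose curve and a Gaussian kernel purely for illustration; the real kernels are energy- and tissue-dependent and are not reproduced here.

      # Illustrative prediction of a depth PET-activity profile as dose (*) filter.
      import numpy as np

      z = np.arange(0.0, 40.0, 0.1)                          # depth in mm
      dose = np.exp(-((z - 25.0) / 6.0) ** 2) + 0.2          # toy depth-dose curve (not a real Bragg peak)

      kz = np.arange(-10.0, 10.1, 0.1)
      kernel = np.exp(-0.5 * (kz / 3.0) ** 2)                # stand-in filter function
      kernel /= kernel.sum()

      expected_pet = np.convolve(dose, kernel, mode="same")  # predicted activity profile
      print("peak predicted activity: %.2f (arbitrary units)" % expected_pet.max())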

  11. Ultra-low-dose oral contraceptive pill: a new approach to a conventional requirement

    Directory of Open Access Journals (Sweden)

    Meenakshi Ahuja

    2017-01-01

    Combined oral contraceptives (COCs) offer a convenient, safe, effective, and reversible method of contraception. However, their use is limited by side effects. Several strategies have been suggested to make COC use more acceptable among women. Reduction in the dose of estrogen is a commonly accepted approach to reduce the side effects of COC. Use of a newer generation of progestins, such as gestodene, reduces the androgenic side effects generally associated with progestogens. Furthermore, reduction in the hormone-free interval, as in a 24/4 regimen, can reduce the risk of escape ovulation (hence preventing contraceptive failure) and breakthrough bleeding. It also reduces hormonal fluctuations, thereby reducing the withdrawal symptoms. A COC with gestodene 60 µg and ethinylestradiol (EE) 15 µg offers the lowest hormonal dose in a 24/4 treatment regimen. This regimen has been shown to offer good contraceptive efficacy and cycle control. With the progress of treatment cycles, the incidence of breakthrough bleeding reduces. The gestodene/EE low-dose 24/4 regimen was associated with a lower incidence of estrogen-related adverse events, such as headache, breast tenderness, and nausea. Furthermore, COCs containing a low dose of estrogen have not been associated with any adverse effect on haemostasis in healthy women. Ultra-low-dose COCs can be considered in women who are at risk of developing estrogen-related side effects.

  12. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  13. Ultra-low-dose oral contraceptive pill: a new approach to a conventional requirement

    OpenAIRE

    Meenakshi Ahuja; Pramod Pujari

    2017-01-01

    Combined oral contraceptives (COCs) offer a convenient, safe, effective, and reversible method of contraception. However, their use is limited by side effects. Several strategies have been suggested to make COC use more acceptable among women. Reduction in the dose of estrogen is a commonly accepted approach to reduce the side effects of COC. Use of newer generation of progestins, such as gestodene, reduces the androgenic side effects generally associated with progestogens. Furthermore, reduc...

  14. Benchmark dose of saliva fluoride concentration in adolescents and its relationship to the prevalence of dental fluorosis%儿童唾液氟基准剂量及与氟斑牙相关关系的研究

    Institute of Scientific and Technical Information of China (English)

    于阳阳; 王连芳; 赵伟; 邹冬荣; 郭蕊

    2016-01-01

    Objective To study the benchmark dose (BMD) of fluoride concentration in saliva, and to evaluate the significance of saliva fluoride in the control and prevention of endemic fluorosis. Methods In September 2014, middle school students in endemic fluorosis areas and non-endemic fluorosis areas of North China Petroleum were selected as subjects. The contents of fluoride in water, urine and saliva were determined. The correlations between fluoride content in water, urine fluoride and fluoride concentration in saliva were analyzed. According to the levels of saliva fluoride concentration, the children were divided into 11 groups: < 1.00, 1.00-, 2.00-, 3.00-, 4.00-, 5.00-, 6.00-, 7.00-, 8.00-, 9.00- and ≥ 10.00 mg/L. The prevalence of dental fluorosis and defective dental fluorosis was investigated and the benchmark dose of saliva fluoride concentration was calculated with the BenchMark Dose Software. Results The fluoride contents in water, urine and saliva in endemic areas [(2.13 ± 0.13), (1.29 ± 0.73), (4.01 ± 3.61) mg/L] were higher than those in non-endemic areas [(0.67 ± 0.13), (0.38 ± 0.08), (0.75 ± 0.12) mg/L; t = 158.730, 24.780, 18.114, all P < 0.01]. The fluoride concentration in saliva was positively correlated with the fluoride content in water and urine in endemic areas (r = 0.626, 0.945, all P < 0.01). The BMDs and benchmark dose lower bounds (BMDLs) were 0.91, 0.54, 3.72 and 3.32 mg/L, respectively, as calculated with the BenchMark Dose Software. With the increase of fluoride concentration in saliva, the prevalence of dental fluorosis and defective dental fluorosis also increased, especially when the fluoride content in saliva was more than 4 mg/L. There were significant dose-response relationships between the urine fluoride and the prevalence of dental fluorosis and defective dental fluorosis. Conclusion The fluoride concentration in saliva can be used as one of the evaluation indexes of fluorosis, and the BMD of saliva fluoride concentration in endemic fluorosis areas is suggested as 0.91 mg/L.
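
    To make the benchmark-dose calculation concrete, the sketch below fits a two-parameter log-logistic model to quantal (affected/not affected) dental-fluorosis data and solves for the dose giving 10% extra risk; a parametric bootstrap stands in for the profile-likelihood BMDL. The dose groups and counts are hypothetical, and the code is a generic illustration rather than the BenchMark Dose Software actually used in the study.

```python
# Hedged sketch: BMD/BMDL for quantal dose-response data under a log-logistic
# model with a 10% extra-risk benchmark response (BMR). All numbers are
# hypothetical; this is not the BMDS implementation used in the study.
import numpy as np
from scipy.optimize import minimize

doses   = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])  # saliva fluoride, mg/L (made up)
n_exam  = np.array([60,  55,  50,  48,  45,  40,  35])    # children examined per group
n_cases = np.array([3,   6,   10,  16,  20,  22,  24])    # dental fluorosis cases
BMR = 0.10

def prob(params, d):
    a, b = params                                   # intercept and slope on log-dose
    return 1.0 / (1.0 + np.exp(-(a + b * np.log(d))))

def negloglik(params, cases):
    p = np.clip(prob(params, doses), 1e-9, 1 - 1e-9)
    return -np.sum(cases * np.log(p) + (n_exam - cases) * np.log(1 - p))

fit = minimize(negloglik, x0=[-2.0, 1.0], args=(n_cases,), method="Nelder-Mead")

def bmd_from(params):
    # Dose at which extra risk over background reaches the BMR; the background
    # is taken at the lowest observed dose because the model uses log-dose.
    p0 = prob(params, doses.min())
    target = p0 + BMR * (1.0 - p0)
    a, b = params
    return np.exp((np.log(target / (1.0 - target)) - a) / b)

bmd = bmd_from(fit.x)

# Parametric bootstrap as a simple stand-in for the profile-likelihood BMDL.
rng = np.random.default_rng(0)
p_hat = prob(fit.x, doses)
boot = []
for _ in range(500):
    y = rng.binomial(n_exam, p_hat)
    refit = minimize(negloglik, x0=fit.x, args=(y,), method="Nelder-Mead")
    boot.append(bmd_from(refit.x))
bmdl = np.percentile(boot, 5)                       # one-sided lower 95% bound

print(f"BMD = {bmd:.2f} mg/L, BMDL = {bmdl:.2f} mg/L")
```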

  15. The OSIRIS Weight of Evidence approach: ITS for the endpoints repeated-dose toxicity (RepDose ITS)

    NARCIS (Netherlands)

    Tluczkiewicz, I.; Batke, M.; Kroese, D.; Buist, H.; Aldenberg, T.; Pauné, E.; Grimm, H.; Kühne, R.; Schüürmann, G.; Mangelsdorf, I.; Escher, S.E.

    2013-01-01

    In the FP6 European project OSIRIS, Integrated Testing Strategies (ITSs) for relevant toxicological endpoints were developed to avoid new animal testing and thus to reduce time and costs. The present paper describes the development of an ITS for repeated-dose toxicity called RepDose ITS which evalua

  16. Developing integrated benchmarks for DOE performance measurement

    Energy Technology Data Exchange (ETDEWEB)

    Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.

    1992-09-30

    The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome factors in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which could then become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.

  17. Manganese induced toxic effects on heart detected by benchmark dose method%应用基准剂量法探讨锰所致心脏的毒性作用

    Institute of Scientific and Technical Information of China (English)

    江兰

    2012-01-01

    [Objective] To discuss the application of the benchmark dose (BMD) of manganese in the prevention and treatment of manganese-induced cardiac harm. [Methods] Welders and electricians at a factory in Guangdong Province were selected as study subjects and divided into 8 groups according to their urine manganese content. The person-time and rate of ECG abnormality were calculated for each group. [Results] The rate of ECG abnormality increased along with the increase in urinary manganese content among the workers. The BMD and BMDL, calculated from the dose-response relationship between urine manganese content and ECG abnormality rate, were 1.420 and 0.510 μmol/L, respectively. [Conclusion] The BMD results show that the rate of ECG abnormality is not a sensitive biological indicator for determining urine manganese biological exposure limits.

  18. Benchmarking expert system tools

    Science.gov (United States)

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.

  19. Methodology for Benchmarking IPsec Gateways

    Directory of Open Access Journals (Sweden)

    Adam Tisovský

    2012-08-01

    The paper analyses the forwarding performance of an IPsec gateway over the range of offered loads. It focuses on the forwarding rate and packet loss, particularly at the gateway's performance peak and in the state of gateway overload. It explains possible performance degradation when the gateway is overloaded by excessive offered load. The paper further evaluates different approaches for obtaining forwarding performance parameters – the widely used throughput described in RFC 1242, the maximum forwarding rate with zero packet loss, and our proposed equilibrium throughput. According to our observations, equilibrium throughput might be the most universal parameter for benchmarking security gateways, as the others may depend on the duration of test trials. Employing equilibrium throughput would also greatly shorten the time required for benchmarking. Lastly, the paper presents a methodology and a hybrid step/binary search algorithm for obtaining the value of equilibrium throughput.
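
    The hybrid step/binary search idea can be sketched as follows; the measurement function is a hypothetical stand-in for driving a traffic generator against the gateway, and the loss margin and step sizes are illustrative choices, not the parameters from the paper.

```python
# Sketch of a hybrid step/binary search for a gateway's equilibrium throughput:
# step the offered load upward in coarse increments until the forwarded rate
# stops following the offered load, then binary-search inside the last interval.
def measure_forwarding_rate(offered_mbps: float) -> float:
    # Placeholder only: a real setup would run a timed trial on the tester and
    # return the rate actually forwarded by the IPsec gateway under test.
    capacity = 430.0                     # pretend performance peak, illustration only
    return min(offered_mbps,
               capacity * (1.0 - 0.0004 * max(0.0, offered_mbps - capacity)))

def equilibrium_throughput(start=50.0, step=50.0, max_load=2000.0,
                           tolerance=1.0, loss_margin=0.01):
    # Phase 1: coarse stepping until the forwarded rate falls short of the offered load.
    low, high = start, start
    while high <= max_load:
        if measure_forwarding_rate(high) < high * (1.0 - loss_margin):
            break
        low, high = high, high + step
    # Phase 2: binary search between the last "good" and first "bad" offered load.
    while high - low > tolerance:
        mid = (low + high) / 2.0
        if measure_forwarding_rate(mid) >= mid * (1.0 - loss_margin):
            low = mid
        else:
            high = mid
    return low

if __name__ == "__main__":
    print(f"Estimated equilibrium throughput: {equilibrium_throughput():.1f} Mbit/s")
```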

  20. iDrug-Target: predicting the interactions between drug compounds and target proteins in cellular networking via benchmark dataset optimization approach.

    Science.gov (United States)

    Xiao, Xuan; Min, Jian-Liang; Lin, Wei-Zhong; Liu, Zi; Cheng, Xiang; Chou, Kuo-Chen

    2015-01-01

    Information about the interactions of drug compounds with proteins in cellular networking is very important for drug development. Unfortunately, all the existing predictors for identifying drug-protein interactions were trained by a skewed benchmark data-set where the number of non-interactive drug-protein pairs is overwhelmingly larger than that of the interactive ones. Using this kind of highly unbalanced benchmark data-set to train predictors would lead to the outcome that many interactive drug-protein pairs might be mispredicted as non-interactive. Since the minority interactive pairs often contain the most important information for drug design, it is necessary to minimize this kind of misprediction. In this study, we adopted the neighborhood cleaning rule and synthetic minority over-sampling technique to treat the skewed benchmark datasets and balance the positive and negative subsets. The new benchmark datasets thus obtained are called the optimized benchmark datasets, based on which a new predictor called iDrug-Target was developed that contains four sub-predictors: iDrug-GPCR, iDrug-Chl, iDrug-Ezy, and iDrug-NR, specialized for identifying the interactions of drug compounds with GPCRs (G-protein-coupled receptors), ion channels, enzymes, and NR (nuclear receptors), respectively. Rigorous cross-validations on a set of experiment-confirmed datasets have indicated that these new predictors remarkably outperformed the existing ones for the same purpose. To maximize users' convenience, a public accessible Web server for iDrug-Target has been established at http://www.jci-bioinfo.cn/iDrug-Target/ , by which users can easily get their desired results. It has not escaped our notice that the aforementioned strategy can be widely used in many other areas as well.
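
    The dataset-balancing step described above (neighborhood cleaning followed by minority over-sampling) can be sketched with the third-party imbalanced-learn package, assuming it and scikit-learn are available; the feature matrix below is synthetic and is not the drug-target benchmark data.

```python
# Illustrative rebalancing of a skewed benchmark set with the neighbourhood
# cleaning rule followed by SMOTE, mirroring the strategy described above.
import numpy as np
from sklearn.datasets import make_classification
from imblearn.under_sampling import NeighbourhoodCleaningRule
from imblearn.over_sampling import SMOTE

# 1 = interactive drug-protein pair (minority), 0 = non-interactive (majority)
X, y = make_classification(n_samples=5000, n_features=40,
                           weights=[0.95, 0.05], random_state=0)

X_clean, y_clean = NeighbourhoodCleaningRule().fit_resample(X, y)    # remove noisy majority samples
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_clean, y_clean)  # oversample the minority class

print("class counts before:", np.bincount(y), "after:", np.bincount(y_bal))
```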

  1. Financial Integrity Benchmarks

    Data.gov (United States)

    City of Jackson, Mississippi — This dataset compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measures the City's debt ratio and bond ratings...

  2. GeodeticBenchmark_GEOMON

    Data.gov (United States)

    Vermont Center for Geographic Information — The GeodeticBenchmark_GEOMON data layer consists of geodetic control monuments (points) that have a known position or spatial reference. The locations of these...

  3. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  4. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  5. Benchmarking in Foodservice Operations.

    Science.gov (United States)

    2007-11-02

    Benchmarking studies lasted from nine to twelve months and could extend beyond that time for numerous reasons. Benchmarking was not industrial tourism, not simply data comparison, a fad, a means for reducing resources, or a quick-fix program; it was a complete process.

  6. A novel model-based approach for dose determination of glycopyrronium bromide in COPD

    Directory of Open Access Journals (Sweden)

    Arievich Helen

    2012-12-01

    Abstract Background Glycopyrronium bromide (NVA237) is an inhaled long-acting muscarinic antagonist in development for the treatment of COPD. This study compared the efficacy and safety of once-daily (OD) and twice-daily (BID) glycopyrronium bromide regimens, using a novel model-based approach, in patients with moderate-to-severe COPD. Methods Double-blind, randomized, dose-finding trial with an eight-treatment, two-period, balanced incomplete block design. Patients (smoking history ≥10 pack-years, post-bronchodilator FEV1 ≥30% of predicted and reduced FEV1/FVC) were randomized; the primary endpoint was trough FEV1 at Day 28. Results 385 patients (mean age 61.2 years; mean post-bronchodilator FEV1 53% predicted) were randomized; 88.6% completed. All OD and BID dosing regimens produced dose-dependent bronchodilation; at Day 28, increases in mean trough FEV1 versus placebo were statistically significant for all regimens, ranging from 51 mL (glycopyrronium bromide 12.5 μg OD) to 160 mL (glycopyrronium bromide 50 μg BID). Pharmacodynamic steady state was reached by Day 7. There was a small separation (≤37 mL) between BID and OD dose-response curves for mean trough FEV1 at steady state in favour of BID dosing. Over 24 hours, separation between OD and BID regimens was even smaller (FEV1 AUC0-24h maximum difference for equivalent daily dose regimens: 8 mL). Dose-response results for FEV1 at 12 hours, FEV1 AUC0-12h and FEV1 AUC0-4h at steady state showed OD regimens provided greater improvement over placebo than BID regimens for total daily doses of 25 μg, 50 μg and 100 μg, while the reverse was true for OD versus BID regimens from 12-24 hours. The 12.5 μg BID dose produced a marginally higher improvement in trough FEV1 versus placebo than 50 μg OD; however, the response at 12 hours over placebo was suboptimal (74 mL). Glycopyrronium bromide was safe and well tolerated at all doses. Conclusions Glycopyrronium bromide 50 μg OD provides significant bronchodilation over a 24 hour period

  7. Benchmarking File System Benchmarking: It *IS* Rocket Science

    OpenAIRE

    Seltzer, Margo I.; Tarasov, Vasily; Bhanage, Saumitra; Zadok, Erez

    2011-01-01

    The quality of file system benchmarking has not improved in over a decade of intense research spanning hundreds of publications. Researchers repeatedly use a wide range of poorly designed benchmarks, and in most cases, develop their own ad-hoc benchmarks. Our community lacks a definition of what we want to benchmark in a file system. We propose several dimensions of file system benchmarking and review the wide range of tools and techniques in widespread use. We experimentally show that even t...

  8. The KMAT: Benchmarking Knowledge Management.

    Science.gov (United States)

    de Jager, Martha

    Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…

  9. NEW METHODICAL APPROACH FOR CALCULATION OF THE INDIVIDUALIZED INTERNAL DOSES OF PERSONS AFFECTED DUE TO THE CHERNOBYL ACCIDENT

    Directory of Open Access Journals (Sweden)

    E. A. Drozd

    2014-01-01

    The basis of the methodical approach for calculating individualized internal doses is the confirmed original scientific hypothesis that every group of individuals homogeneous in demographic characteristics (gender and age) has a determined, time-constant location on the dose-distribution curve constructed from the data of individual measurements of Cs137 in the human body (whole-body measurements); that is, the percentiles of the dose distribution corresponding to the average internal dose of every age group of men and women occupy a certain location on the curve that is stable in time. Keywords: individualized internal dose, percentile of dose distribution, stability.

  10. Comparison of linear and nonlinear programming approaches for "worst case dose" and "minmax" robust optimization of intensity-modulated proton therapy dose distributions.

    Science.gov (United States)

    Zaghian, Maryam; Cao, Wenhua; Liu, Wei; Kardar, Laleh; Randeniya, Sharmalee; Mohan, Radhe; Lim, Gino

    2017-03-01

    Robust optimization of intensity-modulated proton therapy (IMPT) takes uncertainties into account during spot weight optimization and leads to dose distributions that are resilient to uncertainties. Previous studies demonstrated benefits of linear programming (LP) for IMPT in terms of delivery efficiency by considerably reducing the number of spots required for the same quality of plans. However, a reduction in the number of spots may lead to loss of robustness. The purpose of this study was to evaluate and compare the performance in terms of plan quality and robustness of two robust optimization approaches using LP and nonlinear programming (NLP) models. The so-called "worst case dose" and "minmax" robust optimization approaches and conventional planning target volume (PTV)-based optimization approach were applied to designing IMPT plans for five patients: two with prostate cancer, one with skull-based cancer, and two with head and neck cancer. For each approach, both LP and NLP models were used. Thus, for each case, six sets of IMPT plans were generated and assessed: LP-PTV-based, NLP-PTV-based, LP-worst case dose, NLP-worst case dose, LP-minmax, and NLP-minmax. The four robust optimization methods behaved differently from patient to patient, and no method emerged as superior to the others in terms of nominal plan quality and robustness against uncertainties. The plans generated using LP-based robust optimization were more robust regarding patient setup and range uncertainties than were those generated using NLP-based robust optimization for the prostate cancer patients. However, the robustness of plans generated using NLP-based methods was superior for the skull-based and head and neck cancer patients. Overall, LP-based methods were suitable for the less challenging cancer cases in which all uncertainty scenarios were able to satisfy tight dose constraints, while NLP performed better in more difficult cases in which most uncertainty scenarios were hard to meet
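
    The worst-case ("minmax") formulation lends itself to a compact linear program: choose nonnegative spot weights that minimize the largest deviation of target-voxel dose from the prescription across all uncertainty scenarios. The sketch below uses toy, randomly generated dose-influence matrices and only illustrates the LP structure, not the clinical planning systems compared in the study.

```python
# Toy "minmax / worst-case dose" robust optimization as a linear program:
# minimize the largest per-voxel deviation from the prescription over all
# uncertainty scenarios, subject to nonnegative spot weights. The influence
# matrices are made up; a clinical system would supply per-scenario matrices.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n_vox, n_spots, n_scen = 8, 5, 3
prescription = np.full(n_vox, 2.0)                         # Gy per fraction, toy value
D = [rng.uniform(0.1, 0.6, size=(n_vox, n_spots)) for _ in range(n_scen)]

# Decision vector x = [w_1 .. w_n_spots, t]; objective: minimize t.
c = np.zeros(n_spots + 1)
c[-1] = 1.0

A_ub, b_ub = [], []
for Ds in D:
    # Ds @ w - prescription <= t   and   prescription - Ds @ w <= t
    A_ub.append(np.hstack([Ds, -np.ones((n_vox, 1))]));  b_ub.append(prescription)
    A_ub.append(np.hstack([-Ds, -np.ones((n_vox, 1))])); b_ub.append(-prescription)
A_ub, b_ub = np.vstack(A_ub), np.concatenate(b_ub)

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * (n_spots + 1))
weights, worst_dev = res.x[:-1], res.x[-1]
print("spot weights:", np.round(weights, 3),
      "| worst-case deviation (Gy):", round(worst_dev, 3))
```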

  11. Passive dosing: an approach to control mutagen exposure in the Ames fluctuation test.

    Science.gov (United States)

    Bougeard, Cynthia; Gallampois, Christine; Brack, Werner

    2011-04-01

    One of the major challenges for mutagenicity assessment of environmental samples and individual compounds for example in the Ames fluctuation test (AFT) is the establishment and control of a well defined exposure concentration. Thus, a combination of passive dosing with silicone O-rings (SRs) together with an analytical confirmation of the freely dissolved concentration (FDC) is presented. FDCs are often determined with a combination of solid phase micro-extraction (SPME) with gas chromatography (GC). For compounds with poor performance in GC, a high performance liquid chromatography-mass spectrometry (HPLC-MS) analysis of bi-distilled water dosed with identically loaded SRs is suggested to avoid interference of the bacterial culture. The approach was tested for six amino-, nitro-, and keto-substituted polycyclic aromatic compounds with a logK(OW) range of 2.5-5.1 without metabolic activation. The method provided reliable concentration-effect relationships and freely dissolved 50% effect concentrations (DEC(50)) 3-33 times lower than nominal effect concentrations (NEC(50)) derived in parallel solvent-dosed AFT. Partition coefficients and NEC(50)/DEC(50) ratios were well correlated with lipophilicity.

  12. The notion of hormesis and the dose-response theory: a unified approach.

    Science.gov (United States)

    Murado, M A; Vázquez, J A

    2007-02-07

    According to an opinion which has been vigorously and insistently defended for approximately one decade, hormesis (the response of a biological entity to an effector, with stimulatory results at low doses and inhibitory results at high doses) radically calls into question the classic theory of dose-response (DR) relationships and demands a profound revision of environmental protection policies. Herein we show that DR theory, with the modifications which we propose, allows the modelling of various kinds of biphasic responses which are phenomenologically similar to hormetic ones and of well-defined origin, as well as responses which have been treated as genuinely hormetic. Our descriptive approach may also represent a useful resource for experimental design, directed towards identifying some of the potentially heterogeneous mechanisms which underlie the hormetic phenomenon. Finally, it also allows us to discuss some factors which prevent the use of the notion of hormesis (perhaps useful in a clinical context, under strictly controlled conditions) to make decisions on environmental protection measures.
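
    One common way to describe such biphasic responses quantitatively is the Brain-Cousens modification of the log-logistic model, in which an extra linear term produces low-dose stimulation that decays into inhibition at higher doses. The fit below uses synthetic data and is an illustrative assumption, not the specific formulation proposed by the authors.

```python
# Minimal sketch: fitting a biphasic (hormetic) dose-response curve with the
# Brain-Cousens modification of the log-logistic model. Data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def brain_cousens(x, b, c, d, e, f):
    # d = control response, c = lower limit, e = inflection dose, b = slope;
    # f > 0 adds the low-dose stimulation that vanishes at higher doses.
    return c + (d - c + f * x) / (1.0 + np.exp(b * (np.log(x) - np.log(e))))

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
resp = np.array([1.00, 1.08, 1.15, 1.12, 0.95, 0.60, 0.25, 0.08])   # synthetic response

popt, _ = curve_fit(brain_cousens, dose, resp,
                    p0=[2.0, 0.0, 1.0, 2.0, 0.3], maxfev=20000)
print(dict(zip("b c d e f".split(), np.round(popt, 3))))
```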

  13. Benchmarking in Mobarakeh Steel Company

    OpenAIRE

    Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati

    2008-01-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...

  14. Benchmarking in Mobarakeh Steel Company

    Directory of Open Access Journals (Sweden)

    Sasan Ghasemi

    2008-05-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.

  15. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems Using a Bottom-Up Approach and Installer Survey

    Energy Technology Data Exchange (ETDEWEB)

    Ardani, Kristen [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Feldman, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, Sean [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-11-01

    This report presents results from the first U.S. Department of Energy (DOE) sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs—often referred to as “business process” or “soft” costs—for residential and commercial photovoltaic (PV) systems. Annual expenditure and labor-hour-productivity data are analyzed to benchmark 2010 soft costs related to the DOE priority areas of (1) customer acquisition; (2) permitting, inspection, and interconnection; (3) installation labor; and (4) installer labor for arranging third-party financing. Annual expenditure and labor-hour data were collected from 87 PV installers. After eliminating outliers, the survey sample consists of 75 installers, representing approximately 13% of all residential PV installations and 4% of all commercial installations added in 2010. Including assumed permitting fees, in 2010 the average soft costs benchmarked in this analysis total $1.50/W for residential systems (ranging from $0.66/W to $1.66/W between the 20th and 80th percentiles). For commercial systems, the median 2010 benchmarked soft costs (including assumed permitting fees) are $0.99/W for systems smaller than 250 kW (ranging from $0.51/W to $1.45/W between the 20th and 80th percentiles) and $0.25/W for systems larger than 250 kW (ranging from $0.17/W to $0.78/W between the 20th and 80th percentiles). Additional soft costs not benchmarked in the present analysis (e.g., installer profit, overhead, financing, and contracting) are significant and would add to these figures. The survey results provide a benchmark for measuring—and helping to accelerate—progress over the next decade toward achieving the DOE SunShot Initiative’s soft-cost-reduction targets. We conclude that the selected non-hardware business processes add considerable cost to U.S. PV systems, constituting 23% of residential PV system price, 17% of small commercial system price, and 5% of large commercial system price (in 2010

  16. A first-principles approach to total-dose hardness assurance

    Energy Technology Data Exchange (ETDEWEB)

    Fleetwood, D.M. [Sandia National Labs., Albuquerque, NM (United States). Radiation Technology and Assurance Dept.

    1995-11-01

    A first-principles approach to radiation hardness assurance was described that provides the technical background to the present US and European total-dose radiation hardness assurance test methods for MOS technologies, TM 1019.4 and BS 22900. These test methods could not have been developed otherwise, as their existence depends not on a wealth of empirical comparisons of IC data from ground and space testing, but on a fundamental understanding of MOS defect growth and annealing processes. Rebound testing should become less of a problem for advanced MOS small-signal electronics technologies for systems with total dose requirements below 50--100 krad(SiO{sub 2}) because of trends toward much thinner gate oxides. For older technologies with thicker gate oxides and for power devices, rebound testing is unavoidable without detailed characterization studies to assess the impact of interface traps on device response in space. The QML approach is promising for future hardened technologies. A sufficient understanding of process effects on radiation hardness has been developed that should make it possible to reduce testing costs for hardened parts in the future. Finally, it is hoped that the above discussions have demonstrated that the foundation for cost-effective hardness assurance tests is laid with studies of the basic mechanisms of radiation effects. Without a diligent assessment of new radiation effects mechanisms in future technologies, one cannot be assured that the present generation of radiation test standards will continue to apply.

  17. Benchmarking Pthreads performance

    Energy Technology Data Exchange (ETDEWEB)

    May, J M; de Supinski, B R

    1999-04-27

    The importance of the performance of threads libraries is growing as clusters of shared memory machines become more popular. POSIX threads, or Pthreads, is an industry threads library standard. We have implemented the first Pthreads benchmark suite. In addition to measuring basic thread functions, such as thread creation, we apply the LogP model to standard Pthreads communication mechanisms. We present the results of our tests for several hardware platforms. These results demonstrate that the performance of existing Pthreads implementations varies widely; parts of nearly all of these implementations could be further optimized. Since hardware differences do not fully explain these performance variations, optimizations could improve the implementations. The report also discusses incorporating the threads benchmarks into SKaMPI, an MPI benchmark suite that provides a general framework for performance analysis [7]; SKaMPI does not exhaustively test the MPI standard.

  18. Deviating From the Benchmarks

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela

    survival? The analysis is based on a matched employer-employee dataset and covers about 17,500 startups in manufacturing and services. We adopt a new procedure to estimate individual benchmarks for the quantity and quality of initial human resources, acknowledging correlations between hiring decisions... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be a close to irreversible...

  19. Benchmarking for Best Practice

    CERN Document Server

    Zairi, Mohamed

    1998-01-01

    Benchmarking for Best Practice uses up-to-the-minute case-studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. It is also an ideal textbook on the applications of TQM since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area l

  20. Benchmark duration of work hours for development of fatigue symptoms in Japanese workers with adjustment for job-related stress.

    Science.gov (United States)

    Suwazono, Yasushi; Dochi, Mirei; Kobayashi, Etsuko; Oishi, Mitsuhiro; Okubo, Yasushi; Tanaka, Kumihiko; Sakata, Kouichi

    2008-12-01

    The objective of this study was to calculate benchmark durations and lower 95% confidence limits for benchmark durations of working hours associated with subjective fatigue symptoms, by applying the benchmark dose approach while adjusting for job-related stress using multiple logistic regression analyses. A self-administered questionnaire was completed by 3,069 male and 412 female daytime workers (age 18-67 years) in a Japanese steel company. The eight dependent variables in the Cumulative Fatigue Symptoms Index were decreased vitality, general fatigue, physical disorders, irritability, decreased willingness to work, anxiety, depressive feelings, and chronic tiredness. Independent variables were daily working hours, four subscales (job demand, job control, interpersonal relationship, and job suitability) of the Brief Job Stress Questionnaire, and other potential covariates. Using significant parameters for working hours and those for other covariates, the benchmark durations of working hours were calculated for the corresponding Index property. The benchmark response was set at 5% or 10%. Assuming a condition of worst job stress, the benchmark duration/lower 95% confidence limit for benchmark duration of working hours per day with a benchmark response of 5% or 10% were 10.0/9.4 or 11.7/10.7 (irritability) and 9.2/8.9 or 10.4/9.8 (chronic tiredness) in men, and 8.9/8.4 or 9.8/8.9 (chronic tiredness) in women. The threshold amounts of working hours for fatigue symptoms under the worst job-related stress were very close to the standard daily working hours in Japan. The results strongly suggest that special attention should be paid to employees whose working hours exceed threshold amounts based on individual levels of job-related stress.
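
    The benchmark-duration calculation can be sketched as follows: fit a logistic model of a fatigue symptom on daily working hours plus a job-stress score, then solve for the hours at which the extra risk over background reaches the benchmark response with stress fixed at its worst observed level. The data are simulated and the coefficients are illustrative, not those estimated in the study.

```python
# Sketch of a benchmark-duration calculation in the spirit described above.
# Everything here is simulated; it is not the study's model or data.
import numpy as np
from scipy.optimize import brentq
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 3000
hours  = rng.uniform(7, 13, n)                      # daily working hours
stress = rng.uniform(1, 4, n)                       # job-demand score (higher = worse)
logit  = -9.0 + 0.55 * hours + 0.60 * stress        # "true" model used only to simulate
symptom = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = LogisticRegression(C=1e6).fit(np.column_stack([hours, stress]), symptom)

def risk(h, s):
    return model.predict_proba([[h, s]])[0, 1]

worst_stress = stress.max()
background   = risk(hours.min(), worst_stress)       # risk at the shortest observed hours
BMR = 0.05                                           # 5% extra risk

benchmark_duration = brentq(
    lambda h: (risk(h, worst_stress) - background) / (1.0 - background) - BMR,
    hours.min(), hours.max())
print(f"Benchmark duration ~ {benchmark_duration:.1f} h/day at BMR = {BMR:.0%}")
```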

  1. HPCS HPCchallenge Benchmark Suite

    Science.gov (United States)

    2007-11-02

    Measured HPCchallenge Benchmark performance on various HPC architectures — from Cray X1s to Beowulf clusters — is reported in the presentation and paper, using the updated results at http://icl.cs.utk.edu/hpcc/hpcc_results.cgi. Even a small percentage of random

  2. Benchmarking Danish Industries

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette

    2003-01-01

    compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless...

  3. Benchmarks: WICHE Region 2012

    Science.gov (United States)

    Western Interstate Commission for Higher Education, 2013

    2013-01-01

    Benchmarks: WICHE Region 2012 presents information on the West's progress in improving access to, success in, and financing of higher education. The information is updated annually to monitor change over time and encourage its use as a tool for informed discussion in policy and education communities. To establish a general context for the…

  4. Surveys and Benchmarks

    Science.gov (United States)

    Bers, Trudy

    2012-01-01

    Surveys and benchmarks continue to grow in importance for community colleges in response to several factors. One is the press for accountability, that is, for colleges to report the outcomes of their programs and services to demonstrate their quality and prudent use of resources, primarily to external constituents and governing boards at the state…

  5. Benchmarking and Performance Management

    Directory of Open Access Journals (Sweden)

    Adrian TANTAU

    2010-12-01

    The relevance of the chosen topic is explained by the meaning of the firm efficiency concept: firm efficiency means the revealed performance (how well the firm performs in the actual market environment) given the basic characteristics of the firms and their markets that are expected to drive their profitability (firm size, market power, etc.). This complex and relative performance could be due to such things as product innovation, management quality and work organization; other factors can be a cause even if they are not directly observed by the researcher. The critical need for management to continuously improve their firm's efficiency and effectiveness, and the need for managers to know the success factors and the competitiveness determinants, consequently determine what performance measures are most critical in assessing their firm's overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking firm-level performance are critical interdependent activities. Firm-level variables, used to infer performance, are often interdependent due to operational reasons. Hence, managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other types of profitability measures. It uses econometric models to describe performance and then proposes a method to forecast and benchmark it.

  6. Brainstorming as a Tool for the Benchmarking For Achieving Results in the Service-Oriented-Businesses (A Online Survey: Study Approach

    Directory of Open Access Journals (Sweden)

    R. Surya Kiran, D. Shiva Sai Kumar, D. Sateesh Kumar, V. Dilip Kumar, Vikas Kumar Singh

    2013-08-01

    How to benchmark is the problem, and this paper produces an outline of a typical research methodology using the brainstorming technique in order to come to effective conclusions. With the commencement of the STEP (Socio-Cultural, Technical, Economical and Political) reforms in previous years, business environments are in a state of dynamic change and the change process is still continuing. There has been a tremendous acceleration from the traditional and inward-looking regime to a progressive and outward-looking regime of the policy framework. With the L.P.G. (Liberalization, Privatization and Globalization) in almost all the sectors of STEM (Science, Technology, Engineering and Medicine), the roles of the different sectors are undergoing fundamental/conceptual changes, opening up new sets for analyzing the SWOT (Strength, Weakness, Opportunity and Threat) for the business sectors. The main aim of the Six Sigma concept is to make the results right the first time, every time. So benchmarking is to be done for the profitability and the revenue growth of the organizations. Brainstorming results could be well interpreted with the superposition matrix considering the ABC and the VED analysis, as the same has been tested in the design of inventory control.

  7. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems, Using a Bottom-Up Approach and Installer Survey - Second Edition

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, B.; Ardani, K.; Feldman, D.; Citron, R.; Margolis, R.; Zuboy, J.

    2013-10-01

    This report presents results from the second U.S. Department of Energy (DOE) sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs -- often referred to as 'business process' or 'soft' costs -- for U.S. residential and commercial photovoltaic (PV) systems. In service to DOE's SunShot Initiative, annual expenditure and labor-hour-productivity data are analyzed to benchmark 2012 soft costs related to (1) customer acquisition and system design (2) permitting, inspection, and interconnection (PII). We also include an in-depth analysis of costs related to financing, overhead, and profit. Soft costs are both a major challenge and a major opportunity for reducing PV system prices and stimulating SunShot-level PV deployment in the United States. The data and analysis in this series of benchmarking reports are a step toward the more detailed understanding of PV soft costs required to track and accelerate these price reductions.

  8. A Markov decision process approach to temporal modulation of dose fractions in radiation therapy planning.

    Science.gov (United States)

    Kim, M; Ghate, A; Phillips, M H

    2009-07-21

    The current state of the art in cancer treatment by radiation optimizes beam intensity spatially such that tumors receive high dose radiation whereas damage to nearby healthy tissues is minimized. It is common practice to deliver the radiation over several weeks, where the daily dose is a small constant fraction of the total planned. Such a 'fractionation schedule' is based on traditional models of radiobiological response where normal tissue cells possess the ability to repair sublethal damage done by radiation. This capability is significantly less prominent in tumors. Recent advances in quantitative functional imaging and biological markers are providing new opportunities to measure patient response to radiation over the treatment course. This opens the door for designing fractionation schedules that take into account the patient's cumulative response to radiation up to a particular treatment day in determining the fraction on that day. We propose a novel approach that, for the first time, mathematically explores the benefits of such fractionation schemes. This is achieved by building a stylistic Markov decision process (MDP) model, which incorporates some key features of the problem through intuitive choices of state and action spaces, as well as transition probability and reward functions. The structure of optimal policies for this MDP model is explored through several simple numerical examples.
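
    A minimal version of such a model can be solved by backward induction. In the sketch below the state is a discretized cumulative biological effect, the actions are candidate daily fractions, and the transition noise, rewards and horizon are toy choices for illustration only, not the MDP formulated in the paper.

```python
# A minimal finite-horizon MDP solved by backward induction, in the spirit of
# adapting the daily dose fraction to the cumulative response observed so far.
import numpy as np

n_days = 5
doses  = np.array([1.0, 2.0, 3.0])        # candidate daily fractions (arbitrary units)
states = np.arange(0, 16)                 # discretized cumulative biological effect
target = 10                               # desired cumulative effect after the last fraction

def transition_probs(s, dose):
    """Distribution over next states: effect grows by ~dose, +/- one unit of noise."""
    p = np.zeros(len(states))
    for shift, w in ((-1, 0.2), (0, 0.6), (1, 0.2)):
        nxt = int(np.clip(s + round(dose) + shift, 0, len(states) - 1))
        p[nxt] += w
    return p

V = np.zeros((n_days + 1, len(states)))
V[n_days] = -np.abs(states - target).astype(float)   # terminal penalty for missing the target
policy = np.zeros((n_days, len(states)), dtype=int)

for t in range(n_days - 1, -1, -1):                  # backward induction over treatment days
    for s in states:
        # immediate convex normal-tissue penalty plus expected future value
        q = [-0.1 * d**2 + transition_probs(s, d) @ V[t + 1] for d in doses]
        policy[t, s] = int(np.argmax(q))
        V[t, s] = max(q)

print("Recommended day-1 fraction for each cumulative-effect state:")
print(dict(zip(states.tolist(), doses[policy[0]].tolist())))
```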

  9. Assessing doses to terrestrial wildlife at a radioactive waste disposal site: Inter-comparison of modelling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Johansen, M.P., E-mail: mathew.johansen@ansto.gov.au [Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC, NSW, 2232 (Australia); Barnett, C.L., E-mail: clb@ceh.ac.uk [Centre for Ecology and Hydrology, Lancaster (United Kingdom); Beresford, N.A., E-mail: nab@ceh.ac.uk [Centre for Ecology and Hydrology, Lancaster (United Kingdom); Brown, J.E., E-mail: justin.brown@nrpa.no [Norwegian Radiation Protection Authority, Oesteraas (Norway); Cerne, M., E-mail: marko.cerne@ijs.si [Jozef Stefan Institute, Ljubljana (Slovenia); Howard, B.J., E-mail: bjho@ceh.ac.uk [Centre for Ecology and Hydrology, Lancaster (United Kingdom); Kamboj, S., E-mail: skamboj@anl.gov [Argonne National Laboratory, IL (United States); Keum, D.-K., E-mail: dkkeum@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Smodis, B. [Jozef Stefan Institute, Ljubljana (Slovenia); Twining, J.R., E-mail: jrt@ansto.gov.au [Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC, NSW, 2232 (Australia); Vandenhove, H., E-mail: hvandenh@sckcen.be [Belgian Nuclear Research Centre, Mol (Belgium); Vives i Batlle, J., E-mail: jvbatll@sckcen.be [Belgian Nuclear Research Centre, Mol (Belgium); Wood, M.D., E-mail: m.d.wood@salford.ac.uk [University of Salford, Manchester (United Kingdom); Yu, C., E-mail: cyu@anl.gov [Argonne National Laboratory, IL (United States)

    2012-06-15

    Radiological doses to terrestrial wildlife were examined in this model inter-comparison study that emphasised factors causing variability in dose estimation. The study participants used varying modelling approaches and information sources to estimate dose rates and tissue concentrations for a range of biota types exposed to soil contamination at a shallow radionuclide waste burial site in Australia. Results indicated that the dominant factor causing variation in dose rate estimates (up to three orders of magnitude on mean total dose rates) was the soil-to-organism transfer of radionuclides that included variation in transfer parameter values as well as transfer calculation methods. Additional variation was associated with other modelling factors including: how participants conceptualised and modelled the exposure configurations (two orders of magnitude); which progeny to include with the parent radionuclide (typically less than one order of magnitude); and dose calculation parameters, including radiation weighting factors and dose conversion coefficients (typically less than one order of magnitude). Probabilistic approaches to model parameterisation were used to encompass and describe variable model parameters and outcomes. The study confirms the need for continued evaluation of the underlying mechanisms governing soil-to-organism transfer of radionuclides to improve estimation of dose rates to terrestrial wildlife. The exposure pathways and configurations available in most current codes are limited when considering instances where organisms access subsurface contamination through rooting, burrowing, or using different localised waste areas as part of their habitual routines. - Highlights: ► Assessment of modelled dose rates to terrestrial biota from radionuclides. ► The substantial variation among current approaches is quantifiable. ► The dominant variable was soil

  10. Microfluidic approach for fast labeling optimization and dose-on-demand implementation

    Energy Technology Data Exchange (ETDEWEB)

    Pascali, Giancarlo, E-mail: pascali@ifc.cnr.i [Radiopharmacy Department, Institute of Clinical Physiology, Via Moruzzi 1, 56124 Pisa (Italy); Mazzone, Grazia [Radiopharmacy Department, Institute of Clinical Physiology, Via Moruzzi 1, 56124 Pisa (Italy); IUSS, Piazza Ghislieri, 27100 Pavia (Italy); Saccomanni, Giuseppe; Manera, Clementina [Dipartimento di Scienze Farmaceutiche, Universita di Pisa, Via Bonanno 6, 56126 Pisa (Italy); Salvadori, Piero A. [Radiopharmacy Department, Institute of Clinical Physiology, Via Moruzzi 1, 56124 Pisa (Italy)

    2010-07-15

    Introduction: The diffusion of PET as a pivotal molecular imaging modality has emphasized the need for new positron-emitting radiotracers to be used in diagnostic applications and research. Microfluidic represents an innovative approach, owing to its potential to increase radiochemical productivity in terms of yields, time reduction, precursor consumption and flexible experimental planning. Methods: We focused on fluorine-18 labeling and used a microfluidic platform to perform sequential reactions, by using the same batch of {sup 18}F-labeling solution on one or more substrates, during the same experimental session. A solid-phase extraction (SPE) workup procedure was also implemented in the system to provide a repeatable purification step. Results: We were able to quickly optimize the conditions for labeling of ethyl and propyl ditosylate and of a new cannabinoid type 2 (CB2) receptor agonist, CB41. In all substrates, we obtained good incorporation yields (60% to 85%) in short (<90 s) reaction times. Single dosages of the CB2 ligand were sequentially prepared, upon request, in satisfactory quantities and purity for small animal PET scanning. Conclusion: This work demonstrates the usefulness of a microfluidic-based system for a rapid optimization of temperature, flow rate of reactants and their relative ratio in the labeling of different precursors by using the same {sup 18}F-fluoride batch. This approach was used to obtain in sequence several injectable doses of a novel CB2 ligand, thus providing the proof of principle that microfluidic systems permit a dose-on-demand production of new radiotracers.

  11. An approach to using conventional brachytherapy software for clinical treatment planning of complex, Monte Carlo-based brachytherapy dose distributions

    Energy Technology Data Exchange (ETDEWEB)

    Rivard, Mark J.; Melhus, Christopher S.; Granero, Domingo; Perez-Calatayud, Jose; Ballester, Facundo [Department of Radiation Oncology, Tufts University School of Medicine, Boston, Massachusetts 02111 (United States); Radiation Oncology Department, Physics Section, ' ' La Fe' ' University Hospital, Avenida Campanar 21, E-46009 Valencia (Spain); Department of Atomic, Molecular, and Nuclear Physics, University of Valencia, C/Dr. Moliner 50, E-46100 Burjassot, Spain and IFIC (University of Valencia-CSIC), C/Dr. Moliner 50, E-46100 Burjassot (Spain)

    2009-06-15

    Certain brachytherapy dose distributions, such as those for LDR prostate implants, are readily modeled by treatment planning systems (TPS) that use the superposition principle of individual seed dose distributions to calculate the total dose distribution. However, dose distributions for brachytherapy treatments using high-Z shields or having significant material heterogeneities are not currently well modeled using conventional TPS. The purpose of this study is to establish a new treatment planning technique (Tufts technique) that could be applied in some clinical situations where the conventional approach is not acceptable and dose distributions present cylindrical symmetry. Dose distributions from complex brachytherapy source configurations determined with Monte Carlo methods were used as input data. These source distributions included the 2 and 3 cm diameter Valencia skin applicators from Nucletron, 4-8 cm diameter AccuBoost peripheral breast brachytherapy applicators from Advanced Radiation Therapy, and a 16 mm COMS-based eye plaque using {sup 103}Pd, {sup 125}I, and {sup 131}Cs seeds. Radial dose functions and 2D anisotropy functions were obtained by positioning the coordinate system origin along the dose distribution cylindrical axis of symmetry. Origin:tissue distance and active length were chosen to minimize TPS interpolation errors. Dosimetry parameters were entered into the PINNACLE TPS, and dose distributions were subsequently calculated and compared to the original Monte Carlo-derived dose distributions. The new planning technique was able to reproduce brachytherapy dose distributions for all three applicator types, producing dosimetric agreement typically within 2% when compared with Monte Carlo-derived dose distributions. Agreement between Monte Carlo-derived and planned dose distributions improved as the spatial resolution of the fitted dosimetry parameters improved. For agreement within 5% throughout the clinical volume, spatial resolution of

  12. Benchmarking Peer Production Mechanisms, Processes & Practices

    Science.gov (United States)

    Fischer, Thomas; Kretschmer, Thomas

    2008-01-01

    This deliverable identifies key approaches for quality management in peer production by benchmarking peer production practices and processes in other areas. (Contains 29 footnotes, 13 figures and 2 tables.)[This report has been authored with contributions of: Kaisa Honkonen-Ratinen, Matti Auvinen, David Riley, Jose Pinzon, Thomas Fischer, Thomas…

  13. Benchmarking i den offentlige sektor

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels

    2008-01-01

    In this article we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then describe in more detail what benchmarking is, based on four different applications of benchmarking. The regulation of utility companies is addressed, after which...

  14. Benchmarking of LSTM Networks

    OpenAIRE

    Breuel, Thomas M.

    2015-01-01

    LSTM (Long Short-Term Memory) recurrent neural networks have been highly successful in a number of application areas. This technical report describes the use of the MNIST and UW3 databases for benchmarking LSTM networks and explores the effect of different architectural and hyperparameter choices on performance. Significant findings include: (1) LSTM performance depends smoothly on learning rates, (2) batching and momentum has no significant effect on performance, (3) softmax training outperf...

  15. Massive dose vitamin A programme in India--need for a targeted approach.

    Science.gov (United States)

    Kapil, Umesh; Sachdev, H P S

    2013-09-01

    The National Prophylaxis Programme against Nutritional Blindness due to vitamin A deficiency (NPPNB due to VAD) was started in 1970 with the specific aim of preventing nutritional blindness due to keratomalacia. The Programme was launched as an urgent remedial measure to combat the unacceptably high magnitude of xerophthalmic blindness in the country seen in the 1950s and 1960s. Clinical VAD has declined drastically during the last 40 years. Also, indicators of child health have shown substantial gains in different States in the country. The prevalence of severe undernutrition has come down significantly. Immunization coverage for measles and other vaccine-preventable diseases has improved from 5-7 per cent in the early seventies to currently 60-90 per cent in different States. Similarly, there has been a significant improvement in the overall dietary intake of young children. There has been a virtual disappearance of keratomalacia, and a sharp decline in the prevalence of Bitot spots. Prophylactic mega dose administration of vitamin A is primarily advocated because of the claim of a 23 per cent reduction in childhood mortality. However, benefits on this scale have been found only in areas with rudimentary health care facilities where clinical deficiency is common, and there is substantial heterogeneity, especially with inclusion of all trials. There is an urgent need for adopting targeted rather than universal prophylactic mega dose vitamin A supplementation in preschool children. This approach is justified on the basis of currently available evidence documenting a substantial decline in VAD prevalence, substantial heterogeneity and uncertainty about mortality effects in the present era of improved health care, and resource constraints with competing priorities.

  16. The FLUKA Monte Carlo code coupled with the NIRS approach for clinical dose calculations in carbon ion therapy

    Science.gov (United States)

    Magro, G.; Dahle, T. J.; Molinelli, S.; Ciocca, M.; Fossati, P.; Ferrari, A.; Inaniwa, T.; Matsufuji, N.; Ytre-Hauge, K. S.; Mairani, A.

    2017-05-01

    Particle therapy facilities often require Monte Carlo (MC) simulations to overcome intrinsic limitations of analytical treatment planning systems (TPS) related to the description of the mixed radiation field and beam interaction with tissue inhomogeneities. Some of these uncertainties may affect the computation of effective dose distributions; therefore, particle therapy dedicated MC codes should provide both absorbed and biological doses. Two biophysical models are currently applied clinically in particle therapy: the local effect model (LEM) and the microdosimetric kinetic model (MKM). In this paper, we describe the coupling of the NIRS (National Institute for Radiological Sciences, Japan) clinical dose to the FLUKA MC code. We moved from the implementation of the model itself to its application in clinical cases, according to the NIRS approach, where a scaling factor is introduced to rescale the (carbon-equivalent) biological dose to a clinical dose level. A high level of agreement was found with published data by exploring a range of values for the MKM input parameters, while some differences were registered in forward recalculations of NIRS patient plans, mainly attributable to differences with the analytical TPS dose engine (taken as reference) in describing the mixed radiation field (lateral spread and fragmentation). We presented a tool which is being used at the Italian National Center for Oncological Hadrontherapy to support the comparison study between the NIRS clinical dose level and the LEM dose specification.

  17. Within-animal variation as an indication of the minimal magnitude of the critical effect size for continuous toxicological parameters applicable in the benchmark dose approach

    NARCIS (Netherlands)

    Dekkers, S.; Telman, J.; Rennen, M.A.J.; Appel, M.J.; Heer, C.de

    2006-01-01

    In this study, the within-animal variation in routinely studied continuous toxicological parameters was estimated from temporal fluctuations in individual healthy nonexposed animals. Assuming that these fluctuations are nonadverse, this within-animal variation may be indicative of the minimal magnit

  18. Comparison of an assumption-free Bayesian approach with Optimal Sampling Schedule to a maximum a posteriori Approach for Personalizing Cyclophosphamide Dosing.

    Science.gov (United States)

    Laínez, José M; Orcun, Seza; Pekny, Joseph F; Reklaitis, Gintaras V; Suvannasankha, Attaya; Fausel, Christopher; Anaissie, Elias J; Blau, Gary E

    2014-01-01

    Variable metabolism, dose-dependent efficacy, and a narrow therapeutic target of cyclophosphamide (CY) suggest that dosing based on individual pharmacokinetics (PK) will improve efficacy and minimize toxicity. Real-time individualized CY dose adjustment was previously explored using a maximum a posteriori (MAP) approach based on a five serum-PK sampling in patients with hematologic malignancy undergoing stem cell transplantation. The MAP approach resulted in an improved toxicity profile without sacrificing efficacy. However, extensive PK sampling is costly and not generally applicable in the clinic. We hypothesize that the assumption-free Bayesian approach (AFBA) can reduce sampling requirements, while improving the accuracy of results. Retrospective analysis of previously published CY PK data from 20 patients undergoing stem cell transplantation. In that study, Bayesian estimation based on the MAP approach of individual PK parameters was accomplished to predict individualized day-2 doses of CY. Based on these data, we used the AFBA to select the optimal sampling schedule and compare the projected probability of achieving the therapeutic end points. By optimizing the sampling schedule with the AFBA, an effective individualized PK characterization can be obtained with only two blood draws at 4 and 16 hours after administration on day 1. The second-day doses selected with the AFBA were significantly different than the MAP approach and averaged 37% higher probability of attaining the therapeutic targets. The AFBA, based on cutting-edge statistical and mathematical tools, allows an accurate individualized dosing of CY, with simplified PK sampling. This highly accessible approach holds great promise for improving efficacy, reducing toxicities, and lowering treatment costs. © 2013 Pharmacotherapy Publications, Inc.

  19. [Results of the evaluation of German benchmarking networks funded by the Ministry of Health].

    Science.gov (United States)

    de Cruppé, Werner; Blumenstock, Gunnar; Fischer, Imma; Selbmann, Hans-Konrad; Geraedts, Max

    2011-01-01

    Nine out of ten demonstration projects on clinical benchmarking funded by the German Ministry of Health were evaluated. Project reports and interviews were uniformly analysed using a list of criteria and a scheme to categorize the realized benchmarking approach. At the end of the funding period four benchmarking networks had implemented all benchmarking steps, and six were continued after funding had expired. The improvement of outcome quality cannot yet be assessed. Factors promoting the introduction of benchmarking networks with regard to organisational and process aspects of benchmarking implementation were derived.

  20. Tissue-based environmental quality benchmarks and standards.

    Science.gov (United States)

    Meador, James P; Warne, Michael St J; Chapman, Peter M; Chan, King Ming; Yu, Shen; Leung, Kenneth M Y

    2014-01-01

    Although the use of tissue concentrations (residues) of chemical contaminants as the dose metric to characterize chemical toxicity to aquatic organisms has been gaining acceptance over the past 20 years, tissue concentrations are less commonly used in water quality management and have yet to be formally adopted as benchmarks or environmental quality standards (EQS). This synthesis paper addresses advantages and disadvantages for the development and application of tissue-based EQS as an alternative and supplement to exposure-based EQS determined with water and sediment concentration data. Tissue-based EQS can be readily developed in parallel with conventional toxicity tests, and achieved by quantification of chemical concentrations in tissue alongside traditional concentration-response toxicity testing. Tissue-residue toxicity metrics can be used as benchmarks for screening and monitoring water and sediment quality, to derive equivalent water or sediment EQS, and for ecological risk assessments and weight of evidence approaches for assessing ecosystem impairment. Tissue-based toxicity metrics and associated EQS provide several advantages; however, there are some limitations to consider and key knowledge gaps to fill.

  1. A rational approach to dose reduction in CT: individualized scan protocols

    Energy Technology Data Exchange (ETDEWEB)

    Wilting, J.E.; Zwartkruis, A.; Timmer, J. [Philips Medical Systems Nederland BV, Best (Netherlands); Leeuwen, M.S. van; Feldberg, M. [University Medical Center, Utrecht (Netherlands); Kamphuis, A.G.A. [Rode Kruis Ziekenhuis, Beverwijk (Netherlands)

    2001-12-01

    The aim of this study was to demonstrate that dose reduction and constant image quality can be achieved by adjusting X-ray dose to patient size. To establish the relation between patient size, image quality and dose we scanned 19 patients with reduced dose. Image noise was measured. Four radiologists scored image quality subjectively, whereby a higher score indicated lower image quality. A reference patient diameter was determined for which the dose was just sufficient. Then 22 patients were scanned with the X-ray dose adjusted to their size. Again, image noise was measured and subjective image quality was scored. The dose reduction compared with the standard protocol was calculated. In the first group the measured noise was correlated with the patient diameter (ρ=0.78). This correlation was lost in the second group (ρ=-0.13). The correlation between patient diameter and subjective image quality scores changed from ρ=0.60 (group 1) to ρ=-0.69 (group 2). Compared with the standard protocol, the dose was reduced (mean 28%, range 0-76%) in 19 of 22 patients (86%). Dose reduction and constant noise can be achieved when the X-ray dose is adjusted to the patient diameter. With constant image noise the subjective image quality increases with larger patients. (orig.)
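
    The relationship the authors exploit, adjusting tube output to patient size so that noise stays roughly constant, can be sketched as follows. The exponential water-attenuation model and all numbers are assumptions made for illustration; they are not the protocol used in the study.

      import numpy as np

      MU_WATER = 0.019   # approx. linear attenuation per mm at CT energies (assumed)
      D_REF = 280.0      # reference patient diameter (mm) for which dose is just sufficient (assumed)
      MAS_REF = 120.0    # tube current-time product used at the reference diameter (assumed)

      def mas_for_constant_noise(diameter_mm):
          """Scale mAs so that image noise stays roughly constant across patient sizes.

          Assumes quantum noise ~ exp(mu*d/2) / sqrt(mAs) for a water-equivalent
          patient of diameter d, so mAs must grow as exp(mu*d) to compensate.
          """
          return MAS_REF * np.exp(MU_WATER * (diameter_mm - D_REF))

      for d in (220, 280, 340):
          print(f"diameter {d} mm -> {mas_for_constant_noise(d):.0f} mAs")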

  2. Benchmarking and Regulation

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques. In this paper, we review the modern foundations for frontier-based regulation and we discuss its actual use in several jurisdictions.
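
    A minimal sketch of the DEA calculation that underlies such regulatory benchmarking: the input-oriented efficiency of each unit is the optimum of a small linear program. The data and the single-input, single-output setup are hypothetical; real regulatory models use many more variables plus the validation and outlier-detection steps discussed in the paper.

      import numpy as np
      from scipy.optimize import linprog

      # Hypothetical data: one input (e.g. cost) and one output (e.g. energy delivered)
      # for 5 distribution system operators; rows = units, columns = variables.
      X = np.array([[100.0], [120.0], [90.0], [150.0], [110.0]])   # inputs
      Y = np.array([[80.0], [100.0], [75.0], [110.0], [95.0]])     # outputs

      def dea_efficiency(o, X, Y):
          """Input-oriented CCR efficiency of unit o: minimise theta such that some
          peer combination lambda uses at most theta*x_o input and produces at least y_o."""
          n = X.shape[0]
          c = np.r_[1.0, np.zeros(n)]                    # minimise theta
          # input constraints:  sum_j lambda_j x_j - theta * x_o <= 0
          A_in = np.c_[-X[o].reshape(-1, 1), X.T]
          b_in = np.zeros(X.shape[1])
          # output constraints: -sum_j lambda_j y_j <= -y_o
          A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
          b_out = -Y[o]
          res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                        bounds=[(0, None)] * (n + 1))
          return res.fun

      for o in range(len(X)):
          print(f"unit {o}: efficiency = {dea_efficiency(o, X, Y):.3f}")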

  3. 2001 benchmarking guide.

    Science.gov (United States)

    Hoppszallern, S

    2001-01-01

    Our fifth annual guide to benchmarking under managed care presents data that is a study in market dynamics and adaptation. New this year are financial indicators on HMOs exiting the market and those remaining. Hospital financial ratios and details on department performance are included. The physician group practice numbers show why physicians are scrutinizing capitated payments. Overall, hospitals in markets with high managed care penetration are more successful in managing labor costs and show productivity gains in imaging services, physical therapy and materials management.

  4. Benchmarking Query Execution Robustness

    Science.gov (United States)

    Wiener, Janet L.; Kuno, Harumi; Graefe, Goetz

    Benchmarks that focus on running queries on a well-tuned database system ignore a long-standing problem: adverse runtime conditions can cause database system performance to vary widely and unexpectedly. When the query execution engine does not exhibit resilience to these adverse conditions, addressing the resultant performance problems can contribute significantly to the total cost of ownership for a database system in over-provisioning, lost efficiency, and increased human administrative costs. For example, focused human effort may be needed to manually invoke workload management actions or fine-tune the optimization of specific queries.

  5. A Benchmarking System for Domestic Water Use

    Directory of Open Access Journals (Sweden)

    Dexter V. L. Hunt

    2014-05-01

    The national demand for water in the UK is predicted to increase, exacerbated by a growing UK population, and home-grown demands for energy and food. When set against the context of overstretched existing supply sources vulnerable to droughts, particularly in increasingly dense city centres, the delicate balance of matching minimal demands with resource secure supplies becomes critical. When making changes to "internal" demands the role of technological efficiency and user behaviour cannot be ignored, yet existing benchmarking systems traditionally do not consider the latter. This paper investigates the practicalities of adopting a domestic benchmarking system (using a band rating) that allows individual users to assess their current water use performance against what is possible. The benchmarking system allows users to achieve higher benchmarks through any approach that reduces water consumption. The sensitivity of water use benchmarks is investigated by making changes to user behaviour and technology. The impact of adopting localised supplies (i.e., rainwater harvesting (RWH) and grey water (GW)) and including "external" gardening demands is investigated. This includes the impacts (in isolation and in combination) of the following: occupancy rates (1 to 4); roof size (12.5 m2 to 100 m2); garden size (25 m2 to 100 m2); and geographical location (North West, Midlands and South East, UK) with yearly temporal effects (i.e., rainfall and temperature). Lessons learnt from analysis of the proposed benchmarking system are made throughout this paper, in particular its compatibility with the existing Code for Sustainable Homes (CSH) accreditation system. Conclusions are subsequently drawn for the robustness of the proposed system.
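
    A band rating of the kind described can be sketched as a simple lookup from per-capita consumption to a letter band. The thresholds below are hypothetical placeholders, not the paper's calibration against the Code for Sustainable Homes.

      def water_band(litres_per_person_per_day):
          """Assign an illustrative benchmark band from per-capita consumption.

          The band thresholds here are hypothetical; the actual scheme in the paper
          is aligned with the Code for Sustainable Homes and is not reproduced.
          """
          bands = [(80, "A"), (100, "B"), (120, "C"), (140, "D"), (160, "E")]
          for threshold, band in bands:
              if litres_per_person_per_day <= threshold:
                  return band
          return "F"

      print(water_band(95))   # -> "B"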

  6. Patent asset index: a novel approach to benchmark patent portfolios

    Institute of Scientific and Technical Information of China (English)

    钟华; 单连慧; 李海存

    2014-01-01

    The limitations of existing patent ranking methods are analyzed, followed by a description of the patent asset index, a novel approach to benchmarking patent portfolios. The method evaluates a company's patents using three indicators: portfolio size, market coverage and technological importance, and thus provides a more accurate tool for assessing the value of a company's intellectual property rights.

  7. Characterization of the bronchodilatory dose response to indacaterol in patients with chronic obstructive pulmonary disease using model-based approaches.

    Science.gov (United States)

    Renard, Didier; Looby, Michael; Kramer, Benjamin; Lawrence, David; Morris, David; Stanski, Donald R

    2011-04-26

    Indacaterol is a once-daily long-acting inhaled β2-agonist indicated for maintenance treatment of moderate-to-severe chronic obstructive pulmonary disease (COPD). The large inter-patient and inter-study variability in forced expiratory volume in 1 second (FEV1) with bronchodilators makes determination of optimal doses difficult in conventional dose-ranging studies. We considered alternative methods of analysis. We utilized a novel modelling approach to provide a robust analysis of the bronchodilatory dose response to indacaterol. This involved pooled analysis of study-level data to characterize the bronchodilatory dose response, and nonlinear mixed-effects analysis of patient-level data to characterize the impact of baseline covariates. The study-level analysis pooled summary statistics for each steady-state visit in 11 placebo-controlled studies. These study-level summaries encompassed data from 7476 patients at indacaterol doses of 18.75-600 μg once daily, and showed that doses of 75 μg and above achieved clinically important improvements in predicted trough FEV1 response. Indacaterol 75 μg achieved 74% of the maximum effect on trough FEV1, and exceeded the midpoint of the 100-140 mL range that represents the minimal clinically important difference (MCID; ≥120 mL vs placebo), with a 90% probability that the mean improvement vs placebo exceeded the MCID. Indacaterol 150 μg achieved 85% of the model-predicted maximum effect on trough FEV1 and was numerically superior to all comparators (99.9% probability of exceeding MCID). Indacaterol 300 μg was the lowest dose that achieved the model-predicted maximum trough response.The patient-level analysis included data from 1835 patients from two dose-ranging studies of indacaterol 18.75-600 μg once daily. This analysis provided a characterization of dose response consistent with the study-level analysis, and demonstrated that disease severity, as captured by baseline FEV1, significantly affects the dose response
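
    Statements such as "75 μg achieved 74% of the maximum effect" are characteristic of an Emax dose-response model; the sketch below fits such a model to hypothetical trough FEV1 improvements with scipy. The response values, the fitted ED50 and all other numbers are illustrative and are not taken from the pooled trial data.

      import numpy as np
      from scipy.optimize import curve_fit

      def emax_model(dose, e0, emax, ed50):
          """Emax dose-response: effect = E0 + Emax * dose / (ED50 + dose)."""
          return e0 + emax * dose / (ed50 + dose)

      # Hypothetical mean trough FEV1 improvements vs placebo (mL) by daily dose (ug)
      dose = np.array([18.75, 37.5, 75.0, 150.0, 300.0, 600.0])
      effect = np.array([60.0, 95.0, 130.0, 150.0, 165.0, 170.0])

      (e0, emax, ed50), _ = curve_fit(emax_model, dose, effect, p0=[0.0, 180.0, 50.0])
      for d in (75.0, 150.0, 300.0):
          pred = emax_model(d, e0, emax, ed50)
          print(f"{d:.0f} ug: predicted {pred:.0f} mL "
                f"({100 * (pred - e0) / emax:.0f}% of the model-predicted maximum)")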

  8. Characterization of the bronchodilatory dose response to indacaterol in patients with chronic obstructive pulmonary disease using model-based approaches

    Directory of Open Access Journals (Sweden)

    Lawrence David

    2011-04-01

    Abstract Background Indacaterol is a once-daily long-acting inhaled β2-agonist indicated for maintenance treatment of moderate-to-severe chronic obstructive pulmonary disease (COPD). The large inter-patient and inter-study variability in forced expiratory volume in 1 second (FEV1) with bronchodilators makes determination of optimal doses difficult in conventional dose-ranging studies. We considered alternative methods of analysis. Methods We utilized a novel modelling approach to provide a robust analysis of the bronchodilatory dose response to indacaterol. This involved pooled analysis of study-level data to characterize the bronchodilatory dose response, and nonlinear mixed-effects analysis of patient-level data to characterize the impact of baseline covariates. Results The study-level analysis pooled summary statistics for each steady-state visit in 11 placebo-controlled studies. These study-level summaries encompassed data from 7476 patients at indacaterol doses of 18.75-600 μg once daily, and showed that doses of 75 μg and above achieved clinically important improvements in predicted trough FEV1 response. Indacaterol 75 μg achieved 74% of the maximum effect on trough FEV1, and exceeded the midpoint of the 100-140 mL range that represents the minimal clinically important difference (MCID; ≥120 mL vs placebo), with a 90% probability that the mean improvement vs placebo exceeded the MCID. Indacaterol 150 μg achieved 85% of the model-predicted maximum effect on trough FEV1 and was numerically superior to all comparators (99.9% probability of exceeding MCID). Indacaterol 300 μg was the lowest dose that achieved the model-predicted maximum trough response. The patient-level analysis included data from 1835 patients from two dose-ranging studies of indacaterol 18.75-600 μg once daily. This analysis provided a characterization of dose response consistent with the study-level analysis, and demonstrated that disease severity, as captured by baseline FEV1, significantly affects the dose response.

  9. Patient-specific Monte Carlo-based dose-kernel approach for inverse planning in afterloading brachytherapy.

    Science.gov (United States)

    D'Amours, Michel; Pouliot, Jean; Dagnault, Anne; Verhaegen, Frank; Beaulieu, Luc

    2011-12-01

    Brachytherapy planning software relies on the Task Group report 43 dosimetry formalism. This formalism, based on a water approximation, neglects various heterogeneous materials present during treatment. Various studies have suggested that these heterogeneities should be taken into account to improve the treatment quality. The present study sought to demonstrate the feasibility of incorporating Monte Carlo (MC) dosimetry within an inverse planning algorithm to improve the dose conformity and increase the treatment quality. The method was based on precalculated dose kernels in full patient geometries, representing the dose distribution of a brachytherapy source at a single dwell position using MC simulations and the Geant4 toolkit. These dose kernels are used by the inverse planning by simulated annealing tool to produce a fast MC-based plan. A test was performed for an interstitial brachytherapy breast treatment using two different high-dose-rate brachytherapy sources: the microSelectron iridium-192 source and the electronic brachytherapy source Axxent operating at 50 kVp. A research version of the inverse planning by simulated annealing algorithm was combined with MC to provide a method to fully account for the heterogeneities in dose optimization, using the MC method. The effect of the water approximation was found to depend on photon energy, with greater dose attenuation for the lower energies of the Axxent source compared with iridium-192. For the latter, an underdosage of 5.1% for the dose received by 90% of the clinical target volume was found. A new method to optimize afterloading brachytherapy plans that uses MC dosimetric information was developed. Including computed tomography-based information in MC dosimetry in the inverse planning process was shown to take into account the full range of scatter and heterogeneity conditions. This led to significant dose differences compared with the Task Group report 43 approach for the Axxent source. Copyright © 2011
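
    The two ingredients described, a total dose obtained by superposing precalculated per-dwell-position kernels weighted by dwell time and an inverse optimization by simulated annealing, can be sketched as below. The kernels, target doses and objective are synthetic stand-ins, not the inverse planning by simulated annealing implementation or MC data used in the study.

      import numpy as np

      rng = np.random.default_rng(0)
      n_dwell, n_vox = 10, 500
      kernels = rng.random((n_dwell, n_vox))       # dose per unit dwell time per voxel (synthetic)
      target = rng.uniform(0.8, 1.2, n_vox)        # prescribed voxel doses (synthetic)

      def total_dose(dwell_times):
          """Superpose the precalculated kernels weighted by the dwell times."""
          return dwell_times @ kernels

      def objective(dwell_times):
          return np.sum((total_dose(dwell_times) - target) ** 2)

      # Plain simulated annealing over non-negative dwell times
      current = np.full(n_dwell, 0.5)
      cur_cost = objective(current)
      best, best_cost, temp = current.copy(), cur_cost, 1.0
      for step in range(5000):
          cand = np.clip(current + rng.normal(scale=0.05, size=n_dwell), 0.0, None)
          cost = objective(cand)
          if cost < cur_cost or rng.random() < np.exp((cur_cost - cost) / temp):
              current, cur_cost = cand, cost
              if cur_cost < best_cost:
                  best, best_cost = current.copy(), cur_cost
          temp *= 0.999
      print(f"initial objective {objective(np.full(n_dwell, 0.5)):.1f}, optimized {best_cost:.1f}")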

  10. Patient-Specific Monte Carlo-Based Dose-Kernel Approach for Inverse Planning in Afterloading Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    D' Amours, Michel [Departement de Radio-Oncologie et Centre de Recherche en Cancerologie de l' Universite Laval, Hotel-Dieu de Quebec, Quebec, QC (Canada); Department of Physics, Physics Engineering, and Optics, Universite Laval, Quebec, QC (Canada); Pouliot, Jean [Department of Radiation Oncology, University of California, San Francisco, School of Medicine, San Francisco, CA (United States); Dagnault, Anne [Departement de Radio-Oncologie et Centre de Recherche en Cancerologie de l' Universite Laval, Hotel-Dieu de Quebec, Quebec, QC (Canada); Verhaegen, Frank [Department of Radiation Oncology, Maastro Clinic, GROW Research Institute, Maastricht University Medical Centre, Maastricht (Netherlands); Department of Oncology, McGill University, Montreal, QC (Canada); Beaulieu, Luc, E-mail: beaulieu@phy.ulaval.ca [Departement de Radio-Oncologie et Centre de Recherche en Cancerologie de l' Universite Laval, Hotel-Dieu de Quebec, Quebec, QC (Canada); Department of Physics, Physics Engineering, and Optics, Universite Laval, Quebec, QC (Canada)

    2011-12-01

    Purpose: Brachytherapy planning software relies on the Task Group report 43 dosimetry formalism. This formalism, based on a water approximation, neglects various heterogeneous materials present during treatment. Various studies have suggested that these heterogeneities should be taken into account to improve the treatment quality. The present study sought to demonstrate the feasibility of incorporating Monte Carlo (MC) dosimetry within an inverse planning algorithm to improve the dose conformity and increase the treatment quality. Methods and Materials: The method was based on precalculated dose kernels in full patient geometries, representing the dose distribution of a brachytherapy source at a single dwell position using MC simulations and the Geant4 toolkit. These dose kernels are used by the inverse planning by simulated annealing tool to produce a fast MC-based plan. A test was performed for an interstitial brachytherapy breast treatment using two different high-dose-rate brachytherapy sources: the microSelectron iridium-192 source and the electronic brachytherapy source Axxent operating at 50 kVp. Results: A research version of the inverse planning by simulated annealing algorithm was combined with MC to provide a method to fully account for the heterogeneities in dose optimization, using the MC method. The effect of the water approximation was found to depend on photon energy, with greater dose attenuation for the lower energies of the Axxent source compared with iridium-192. For the latter, an underdosage of 5.1% for the dose received by 90% of the clinical target volume was found. Conclusion: A new method to optimize afterloading brachytherapy plans that uses MC dosimetric information was developed. Including computed tomography-based information in MC dosimetry in the inverse planning process was shown to take into account the full range of scatter and heterogeneity conditions. This led to significant dose differences compared with the Task Group report

  11. A Technical Approach to Expedited Processing of NTPR Radiation Dose Assessments

    Science.gov (United States)

    2011-10-01

    Dose assessments of the same magnitude are produced for the same scenario of exposure. Uniformity in the treatment of upper-bound doses is maintained. Approved NTPR standard methods are applied.

  12. Benchmarking concentrating photovoltaic systems

    Science.gov (United States)

    Duerr, Fabian; Muthirayan, Buvaneshwari; Meuret, Youri; Thienpont, Hugo

    2010-08-01

    Integral to photovoltaics is the need to provide improved economic viability. To achieve this goal, photovoltaic technology has to be able to harness more light at less cost. A large variety of concentrating photovoltaic concepts has provided cause for pursuit. To obtain a detailed profitability analysis, a flexible evaluation is crucial for benchmarking the cost-performance of this variety of concentrating photovoltaic concepts. To save time and capital, a way to estimate the cost-performance of a complete solar energy system is to use computer aided modeling. In this work a benchmark tool is introduced based on a modular programming concept. The overall implementation is done in MATLAB whereas Advanced Systems Analysis Program (ASAP) is used for ray tracing calculations. This allows for a flexible and extendable structuring of all important modules, namely advanced source modeling including time and location dependence, and advanced optical system analysis of various optical designs to evaluate the figure of merit. One important figure of merit, the energy yield of a given photovoltaic system at a geographical position over a specific period, can then be calculated.
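
    The energy-yield figure of merit reduces to summing, over a period, the product of direct normal irradiance, aperture area and the optical and cell efficiencies. The irradiance series and efficiency values below are placeholder assumptions, not outputs of the MATLAB/ASAP tool.

      import numpy as np

      # Hypothetical hourly direct normal irradiance for one day (W/m^2)
      dni = np.array([0, 0, 0, 0, 0, 50, 200, 400, 600, 750, 850, 900,
                      900, 850, 750, 600, 400, 200, 50, 0, 0, 0, 0, 0], dtype=float)

      APERTURE_M2 = 10.0      # collector aperture area (assumed)
      ETA_OPTICS = 0.80       # optical efficiency of the concentrator (assumed)
      ETA_CELL = 0.35         # cell efficiency under concentration (assumed)

      def energy_yield_kwh(dni_series_w_m2, dt_hours=1.0):
          """Energy yield = sum over time of DNI * aperture * optical and cell efficiency."""
          power_w = dni_series_w_m2 * APERTURE_M2 * ETA_OPTICS * ETA_CELL
          return np.sum(power_w * dt_hours) / 1000.0

      print(f"daily yield: {energy_yield_kwh(dni):.1f} kWh")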

  13. Benchmarking: A Method for Continuous Quality Improvement in Health

    OpenAIRE

    Ettorchi-Tardy, Amina; Levif, Marie; Michel, Philippe

    2012-01-01

    Benchmarking, a management approach for implementing best practices at best cost, is a recent concept in the healthcare system. The objectives of this paper are to better understand the concept and its evolution in the healthcare sector, to propose an operational definition, and to describe some French and international experiences of benchmarking in the healthcare sector. To this end, we reviewed the literature on this approach's emergence in the industrial sector, its evolution, its fields ...

  14. Entropy-based benchmarking methods

    OpenAIRE

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth preservation method of Causey and Trager (1981) may violate this principle, while its requirements are explicitly taken into account in the proposed entropy-based benchmarking methods. Our illustrati...

  15. Standard Guide for Benchmark Testing of Light Water Reactor Calculations

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This guide covers general approaches for benchmarking neutron transport calculations in light water reactor systems. A companion guide (Guide E2005) covers use of benchmark fields for testing neutron transport calculations and cross sections in well controlled environments. This guide covers experimental benchmarking of neutron fluence calculations (or calculations of other exposure parameters such as dpa) in more complex geometries relevant to reactor surveillance. Particular sections of the guide discuss: the use of well-characterized benchmark neutron fields to provide an indication of the accuracy of the calculational methods and nuclear data when applied to typical cases; and the use of plant specific measurements to indicate bias in individual plant calculations. Use of these two benchmark techniques will serve to limit plant-specific calculational uncertainty, and, when combined with analytical uncertainty estimates for the calculations, will provide uncertainty estimates for reactor fluences with ...

  16. Intra and inter-organizational learning from benchmarking IS services

    DEFF Research Database (Denmark)

    Mengiste, Shegaw Anagaw; Kræmmergaard, Pernille; Hansen, Bettina

    2016-01-01

    This paper reports a case study of benchmarking IS services in Danish municipalities. Drawing on Holmqvist’s (2004) organizational learning model of exploration and exploitation, the paper explores intra- and inter-organizational learning dynamics among Danish municipalities that have been involved in benchmarking their IS services and functions since 2006. In particular, this research tackles existing IS benchmarking approaches and methods by turning to a learning-oriented perspective and by empirically exploring the dynamic process of intra- and inter-organizational learning from benchmarking IS/IT services. The paper also makes a contribution by emphasizing the importance of informal cross-municipality consortiums to facilitate learning and experience sharing across municipalities. The findings of the case study demonstrate that the IS benchmarking scheme is relatively successful in sharing good practices.

  17. HPC Benchmark Suite NMx Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  18. Evaluating secondary neutron doses of a refined shielded design for a medical cyclotron using the TLD approach

    Science.gov (United States)

    Lin, Jye-Bin; Tseng, Hsien-Chun; Liu, Wen-Shan; Lin, Ding-Bang; Hsieh, Teng-San; Chen, Chien-Yi

    2013-11-01

    An increasing number of cyclotrons at medical centers in Taiwan have been installed to generate radiopharmaceutical products. An operating cyclotron generates immense amounts of secondary neutrons from reactions such as the 18O(p, n)18F reaction used in the production of FDG. This intense radiation can be hazardous to public health, particularly to medical personnel. To increase the yield of 18F-FDG from 4200 GBq in 2005 to 48,600 GBq in 2011, Chung Shan Medical University Hospital (CSMUH) has prolonged irradiation time without changing the target or target current to meet requirements regarding the production of 18F. The CSMUH has redesigned the CTI Radioisotope Delivery System shield. Data on the possible secondary neutron doses in such newly designed cyclotron rooms are lacking. This work aims to evaluate secondary neutron doses at a CTI cyclotron center using a thermoluminescent dosimeter (TLD-600). Two-dimensional neutron doses were mapped and indicated that neutron doses were high where neutrons leaked through self-shielded blocks and through the L-shaped concrete shield in vault rooms. These neutron doses varied markedly among locations close to the H218O target. The Monte Carlo simulation and minimum detectable dose are also discussed and demonstrate the reliability of the TLD-600 approach. Findings can be adopted by medical centers to identify radioactive hot spots and develop radiation protection.

  19. A Systems Genetic Approach to Identify Low Dose Radiation-Induced Lymphoma Susceptibility/DOE2013FinalReport

    Energy Technology Data Exchange (ETDEWEB)

    Balmain, Allan [University of California, San Francisco]; Song, Ihn Young [University of California, San Francisco]

    2013-05-15

    The ultimate goal of this project is to identify the combinations of genetic variants that confer an individual's susceptibility to the effects of low dose (0.1 Gy) gamma-radiation, in particular with regard to tumor development. In contrast to the known effects of high dose radiation in cancer induction, the responses to low dose radiation (defined as 0.1 Gy or less) are much less well understood, and have been proposed to involve a protective anti-tumor effect in some in vivo scientific models. These conflicting results confound attempts to develop predictive models of the risk of exposure to low dose radiation, particularly when combined with the strong effects of inherited genetic variants on both radiation effects and cancer susceptibility. We have used a Systems Genetics approach in mice that combines genetic background analysis with responses to low and high dose radiation, in order to develop insights that will allow us to reconcile these disparate observations. Using this comprehensive approach we have analyzed normal tissue gene expression (in this case the skin and thymus), together with the changes that take place in this gene expression architecture a) in response to low or high- dose radiation and b) during tumor development. Additionally, we have demonstrated that using our expression analysis approach in our genetically heterogeneous/defined radiation-induced tumor mouse models can uniquely identify genes and pathways relevant to human T-ALL, and uncover interactions between common genetic variants of genes which may lead to tumor susceptibility.

  20. Determination of damage in bone metabolism caused by co-exposure to fluoride and arsenic using benchmark dose method in Chinese population

    Institute of Scientific and Technical Information of China (English)

    曾奇兵; 刘云; 洪峰; 杨鋆; 喻仙

    2012-01-01

    Objective: To explore the biological exposure limits for bone metabolism injury using the benchmark dose method, in order to assess the potential risk associated with chronic co-exposure to fluoride and arsenic from coal burning in a Chinese population. Methods: The benchmark dose (BMD) and its lower confidence limit (BMDL) for urinary fluoride and urinary arsenic in the exposed population were calculated using BMDS Version 2.1.2. Results: For bone metabolism injury caused by combined fluoride and arsenic exposure, the BMD and BMDL of urinary fluoride were 1.96 mg/g creatinine and 1.32 mg/g creatinine, and the BMD and BMDL of urinary arsenic were 120.11 μg/g creatinine and 94.83 μg/g creatinine. Conclusion: The recommended biological exposure limits of urinary fluoride and urinary arsenic for chronic co-exposure to fluoride and arsenic are 1.32 mg/g creatinine and 94.83 μg/g creatinine, respectively.
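
    A minimal sketch of the benchmark dose calculation this record relies on: fit a continuous dose-response model, take the BMD as the dose producing a predetermined benchmark response (here a 10% change from the background level), and estimate the BMDL as a lower confidence bound (here by bootstrap). The Hill model, the data and the benchmark response are illustrative; the study itself used EPA's BMDS software.

      import numpy as np
      from scipy.optimize import curve_fit, brentq

      rng = np.random.default_rng(1)

      def hill(dose, background, vmax, k):
          """Continuous Hill-type dose-response model (one of several BMDS-like forms)."""
          return background + vmax * dose / (k + dose)

      # Hypothetical biomarker values (e.g. a bone-metabolism marker) in 4 dose groups
      doses = np.repeat([0.0, 0.5, 1.5, 3.0], 10)
      response = hill(doses, 100.0, 40.0, 2.0) + rng.normal(scale=8.0, size=doses.size)

      BMR = 0.10        # benchmark response: 10% change from background (assumed)

      def fit_bmd(d, y):
          (b, v, k), _ = curve_fit(hill, d, y, p0=[100.0, 30.0, 1.0], maxfev=10000)
          target = b * (1.0 + BMR)
          return brentq(lambda x: hill(x, b, v, k) - target, 1e-6, 1e3)

      bmd = fit_bmd(doses, response)
      boot = []
      for _ in range(200):                       # nonparametric bootstrap for the BMDL
          idx = rng.integers(0, doses.size, doses.size)
          try:
              boot.append(fit_bmd(doses[idx], response[idx]))
          except (RuntimeError, ValueError):
              continue
      bmdl = np.percentile(boot, 5)              # one-sided 95% lower confidence bound
      print(f"BMD = {bmd:.2f}, BMDL = {bmdl:.2f} (dose units)")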

  1. TPSPET—A TPS-based approach for in vivo dose verification with PET in proton therapy

    Science.gov (United States)

    Frey, K.; Bauer, J.; Unholtz, D.; Kurz, C.; Krämer, M.; Bortfeld, T.; Parodi, K.

    2014-01-01

    Since the interest in ion-irradiation for tumour therapy has significantly increased over the last few decades, intensive investigations are performed to improve the accuracy of this form of patient treatment. One major goal is the development of methods for in vivo dose verification. In proton therapy, a PET (positron emission tomography)-based approach measuring the irradiation-induced tissue activation inside the patient has been already clinically implemented. The acquired PET images can be compared to an expectation, derived under the assumption of a correct treatment application, to validate the particle range and the lateral field position in vivo. In the context of this work, TPSPET is introduced as a new approach to predict proton-irradiation induced three-dimensional positron emitter distributions by means of the same algorithms of the clinical treatment planning system (TPS). In order to perform additional activity calculations, reaction-channel-dependent input positron emitter depth distributions are necessary, which are determined from the application of a modified filtering approach to the TPS reference depth dose profiles in water. This paper presents the implementation of TPSPET on the basis of the research treatment planning software treatment planning for particles. The results are validated in phantom and patient studies against Monte Carlo simulations, and compared to β+-emitter distributions obtained from a slightly modified version of the originally proposed one-dimensional filtering approach applied to three-dimensional dose distributions. In contrast to previously introduced methods, TPSPET provides a faster implementation, the results show no sensitivity to lateral field extension and the predicted β+-emitter densities are fully consistent to the planned treatment dose as they are calculated by the same pencil beam algorithms. These findings suggest a large potential of the application of TPSPET for in vivo dose verification in the daily
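
    The core mechanic, obtaining a positron-emitter depth profile by filtering the TPS depth-dose profile, can be sketched as a one-dimensional convolution. The Bragg-curve shape and the shifted-Gaussian kernel below are placeholders; the actual reaction-channel-dependent filters are derived in the paper and are not reproduced here.

      import numpy as np

      depth = np.arange(0.0, 200.0, 1.0)                 # depth in mm
      # Toy depth-dose profile with a Bragg-peak-like shape (placeholder, not a real TPS curve)
      dose = 0.3 + 0.7 * np.exp(-0.5 * ((depth - 150.0) / 6.0) ** 2)
      dose[depth > 160.0] = 0.0

      def emitter_profile(dose_profile, kernel_width_mm=8.0, shift_mm=-5.0):
          """Predict a beta+ emitter depth profile by filtering the depth-dose profile.

          A shifted Gaussian stands in for the reaction-channel-dependent filter;
          in TPSPET the filters are derived per reaction channel from reference data.
          """
          x = np.arange(-30.0, 30.0, 1.0)
          kernel = np.exp(-0.5 * ((x - shift_mm) / kernel_width_mm) ** 2)
          kernel /= kernel.sum()
          return np.convolve(dose_profile, kernel, mode="same")

      activity = emitter_profile(dose)
      print(f"activity peak at {depth[np.argmax(activity)]:.0f} mm, "
            f"dose peak at {depth[np.argmax(dose)]:.0f} mm")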

  2. Toward a unified approach to dose-response modeling in ecotoxicology.

    Science.gov (United States)

    Ritz, Christian

    2010-01-01

    This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
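
    The log-logistic model discussed in the review is commonly written with four parameters (lower and upper limits, slope and ED50). A minimal sketch of this parameterization, fitted to hypothetical survival data:

      import numpy as np
      from scipy.optimize import curve_fit

      def log_logistic4(dose, b, c, d, e):
          """Four-parameter log-logistic: c = lower limit, d = upper limit,
          e = ED50, b = slope; with b > 0 the response decreases with dose."""
          return c + (d - c) / (1.0 + np.exp(b * (np.log(dose) - np.log(e))))

      # Hypothetical survival fractions at increasing toxicant concentrations
      dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
      resp = np.array([0.98, 0.95, 0.84, 0.52, 0.18, 0.05])

      params, _ = curve_fit(log_logistic4, dose, resp, p0=[1.0, 0.0, 1.0, 3.0])
      print("slope, lower, upper, ED50 =", np.round(params, 2))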

  3. Benchmarking foreign electronics technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bostian, C.W.; Hodges, D.A.; Leachman, R.C.; Sheridan, T.B.; Tsang, W.T.; White, R.M.

    1994-12-01

    This report has been drafted in response to a request from the Japanese Technology Evaluation Center's (JTEC) Panel on Benchmarking Select Technologies. Since April 1991, the Competitive Semiconductor Manufacturing (CSM) Program at the University of California at Berkeley has been engaged in a detailed study of quality, productivity, and competitiveness in semiconductor manufacturing worldwide. The program is a joint activity of the College of Engineering, the Haas School of Business, and the Berkeley Roundtable on the International Economy, under sponsorship of the Alfred P. Sloan Foundation, and with the cooperation of semiconductor producers from Asia, Europe and the United States. Professors David A. Hodges and Robert C. Leachman are the project's Co-Directors. The present report for JTEC is primarily based on data and analysis drawn from that continuing program. The CSM program is being conducted by faculty, graduate students and research staff from UC Berkeley's Schools of Engineering and Business, and Department of Economics. Many of the participating firms are represented on the program's Industry Advisory Board. The Board played an important role in defining the research agenda. A pilot study was conducted in 1991 with the cooperation of three semiconductor plants. The research plan and survey documents were thereby refined. The main phase of the CSM benchmarking study began in mid-1992 and will continue at least through 1997. Reports are presented on the manufacture of integrated circuits; data storage; wireless technology; human-machine interfaces; and optoelectronics. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  4. Using dose-surface maps to predict radiation-induced rectal bleeding: a neural network approach.

    Science.gov (United States)

    Buettner, Florian; Gulliford, Sarah L; Webb, Steve; Partridge, Mike

    2009-09-07

    The incidence of late-toxicities after radiotherapy can be modelled based on the dose delivered to the organ under consideration. Most predictive models reduce the dose distribution to a set of dose-volume parameters and do not take the spatial distribution of the dose into account. The aim of this study was to develop a classifier predicting radiation-induced rectal bleeding using all available information on the dose to the rectal wall. The dose was projected on a two-dimensional dose-surface map (DSM) by virtual rectum-unfolding. These DSMs were used as inputs for a classification method based on locally connected neural networks. In contrast to fully connected conventional neural nets, locally connected nets take the topology of the input into account. In order to train the nets, data from 329 patients from the RT01 trial (ISRCTN 47772397) were split into ten roughly equal parts. By using nine of these parts as a training set and the remaining part as an independent test set, a ten-fold cross-validation was performed. Ensemble learning was used and 250 nets were built from randomly selected patients from the training set. Out of these 250 nets, an ensemble of expert nets was chosen. The performances of the full ensemble and of the expert ensemble were quantified by using receiver-operator-characteristic (ROC) curves. In order to quantify the predictive power of the shape, ensembles of fully connected conventional neural nets based on dose-surface histograms (DSHs) were generated and their performances were quantified. The expert ensembles performed better than or equally as well as the full ensembles. The area under the ROC curve for the DSM-based expert ensemble was 0.64. The area under the ROC curve for the DSH-based expert ensemble equalled 0.59. This difference in performance indicates that not only volumetric, but also morphological aspects of the dose distribution are correlated to rectal bleeding after radiotherapy. Thus, the shape of the dose
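
    A minimal sketch of the classification setup: a small convolutional network trained on synthetic dose-surface maps and scored by ROC AUC. A convolutional layer is used here as a stand-in, since weight-unshared locally connected layers are not part of core PyTorch, and the ensemble construction and ten-fold cross-validation of the study are omitted; the data, labels and architecture are illustrative only.

      import numpy as np
      import torch
      import torch.nn as nn
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      # Synthetic DSMs: 200 patients, 1-channel 32x32 unfolded rectal-wall dose maps
      X = rng.random((200, 1, 32, 32)).astype(np.float32)
      # Toy label: "bleeding" more likely when one region of the map receives high dose
      y = (X[:, 0, :8, :].mean(axis=(1, 2)) + 0.1 * rng.standard_normal(200) > 0.55).astype(np.float32)

      model = nn.Sequential(
          nn.Conv2d(1, 4, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
          nn.Conv2d(4, 8, kernel_size=3), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
          nn.Flatten(), nn.Linear(8, 1),
      )
      opt = torch.optim.Adam(model.parameters(), lr=1e-2)
      loss_fn = nn.BCEWithLogitsLoss()

      xb, yb = torch.from_numpy(X), torch.from_numpy(y)
      for epoch in range(50):                  # tiny training loop, no held-out split
          opt.zero_grad()
          loss = loss_fn(model(xb).squeeze(1), yb)
          loss.backward()
          opt.step()

      with torch.no_grad():
          scores = torch.sigmoid(model(xb).squeeze(1)).numpy()
      print("training ROC AUC:", round(roc_auc_score(y, scores), 3))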

  5. Benchmark job – Watch out!

    CERN Multimedia

    Staff Association

    2017-01-01

    On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...

  6. Benchmark for Strategic Performance Improvement.

    Science.gov (United States)

    Gohlke, Annette

    1997-01-01

    Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)

  7. Internal Benchmarking for Institutional Effectiveness

    Science.gov (United States)

    Ronco, Sharron L.

    2012-01-01

    Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multicampuses or a…

  8. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth preservation method of Causey and Trager (1981) may violate this principle, while its requirements are explicitly taken into account in the proposed entropy-based benchmarking methods.

  9. Validation of a novel approach for dose individualization in pharmacotherapy using gabapentin in a proof of principles study.

    Science.gov (United States)

    Blau, Gary E; Orcun, Seza; Laínez, José M; Reklaitis, Gintaras V; Suvannasankha, Attaya; Fausel, Chris; Anaissie, Elias J

    2013-07-01

    To demonstrate the premise of individualized dosing charts (IDCs) as a clinical-bedside decision-support tool to individualize dosage regimens for drugs in which the interpatient variability is controlled by the pharmacokinetic (PK) behavior of the patient, and to calculate the optimal sampling schedule (OSS), which minimizes the number of blood samples per patient. The approach is illustrated with available PK data for gabapentin. Retrospective proof-of-principles study using gabapentin PK data from a published clinical trial. Nineteen subjects in a trial designed to uncover the importance of the genetic contributions to variability in gabapentin absorption, renal elimination, and transport; subjects were monitored for 36 hours after administration of a single dose of gabapentin 400 mg, and plasma concentrations were determined at 14 time points. When the PK profiles differed between subjects, the IDCs were dramatically different from each other and from the IDC for an "average" patient representing the patient population. The dose amount and dosing interval must be adjusted to maximize the probability of staying within the target concentration range. An optimal sampling methodology based on the assumption-free Bayesian approach is used to distinguish the PK profile of an individual patient from the patient population. In the case of gabapentin, only two optimally selected test blood samples, at 1.5 and 6 hours after administration of a single dose, were necessary. The average sensitivity and the average specificity of the OSS were 99% and 96%, respectively. IDCs display the risk of a patient violating the target concentration range for any dosage regimen. They can be used as a clinical-bedside decision-support tool in a patient-physician partnership to decide on a dose amount and dosing interval that are medically acceptable while practical and convenient to ensure compliance. By using the assumption-free Bayesian approach and the OSS, the number of samples
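
    The probability of staying within the target concentration range that an IDC displays can be sketched by propagating uncertainty in the individual's PK parameters through a concentration-time model. The one-compartment oral model, the parameter distributions, the target range and the candidate regimens below are all hypothetical, not the gabapentin analysis itself.

      import numpy as np

      rng = np.random.default_rng(2)

      def conc_profile(t, dose_mg, ka, cl, v, tau_h):
          """Repeated oral dosing of a one-compartment model (superposition of doses)."""
          ke = cl / v
          c = np.zeros_like(t)
          n_doses = int(t.max() // tau_h) + 1
          for i in range(n_doses):
              dt = t - i * tau_h
              m = dt > 0
              c[m] += (dose_mg * ka / (v * (ka - ke))
                       * (np.exp(-ke * dt[m]) - np.exp(-ka * dt[m])))
          return c

      # Posterior-like samples of one patient's PK parameters (hypothetical spread)
      n = 2000
      ka = rng.lognormal(np.log(1.0), 0.2, n)      # absorption rate, 1/h
      cl = rng.lognormal(np.log(7.0), 0.25, n)     # clearance, L/h
      v = rng.lognormal(np.log(60.0), 0.15, n)     # volume, L

      t = np.linspace(24.0, 48.0, 97)              # evaluate over the second day
      lo, hi = 2.0, 8.0                            # hypothetical target range, mg/L

      def prob_in_range(dose_mg, tau_h):
          ok = 0
          for i in range(n):
              c = conc_profile(t, dose_mg, ka[i], cl[i], v[i], tau_h)
              ok += (c.min() >= lo) and (c.max() <= hi)
          return ok / n

      for dose_mg, tau_h in [(400, 8), (600, 8), (400, 6)]:
          print(f"{dose_mg} mg every {tau_h} h: P(in range) = {prob_in_range(dose_mg, tau_h):.2f}")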

  10. How to Advance TPC Benchmarks with Dependability Aspects

    Science.gov (United States)

    Almeida, Raquel; Poess, Meikel; Nambiar, Raghunath; Patil, Indira; Vieira, Marco

    Transactional systems are the core of the information systems of most organizations. Although there is general acknowledgement that failures in these systems often entail significant impact both on the proceeds and reputation of companies, the benchmarks developed and managed by the Transaction Processing Performance Council (TPC) still maintain their focus on reporting bare performance. Each TPC benchmark has to pass a list of dependability-related tests (to verify ACID properties), but not all benchmarks require measuring their performances. While TPC-E measures the recovery time of some system failures, TPC-H and TPC-C only require functional correctness of such recovery. Consequently, systems used in TPC benchmarks are tuned mostly for performance. In this paper we argue that nowadays systems should be tuned for a more comprehensive suite of dependability tests, and that a dependability metric should be part of TPC benchmark publications. The paper discusses WHY and HOW this can be achieved. Two approaches are introduced and discussed: augmenting each TPC benchmark in a customized way, by extending each specification individually; and pursuing a more unified approach, defining a generic specification that could be adjoined to any TPC benchmark.

  11. Applications of Integral Benchmark Data

    Energy Technology Data Exchange (ETDEWEB)

    Giuseppe Palmiotti; Teruhiko Kugo; Fitz Trumble; Albert C. (Skip) Kahler; Dale Lancaster

    2014-10-09

    The International Reactor Physics Experiment Evaluation Project (IRPhEP) and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) provide evaluated integral benchmark data that may be used for validation of reactor physics / nuclear criticality safety analytical methods and data, nuclear data testing, advanced modeling and simulation, and safety analysis licensing activities. The handbooks produced by these programs are used in over 30 countries. Five example applications are presented in this paper: (1) Use of IRPhEP Data in Uncertainty Analyses and Cross Section Adjustment, (2) Uncertainty Evaluation Methods for Reactor Core Design at JAEA Using Reactor Physics Experimental Data, (3) Application of Benchmarking Data to a Broad Range of Criticality Safety Problems, (4) Cross Section Data Testing with ICSBEP Benchmarks, and (5) Use of the International Handbook of Evaluated Reactor Physics Benchmark Experiments to Support the Power Industry.

  12. Benchmarking & European Sustainable Transport Policies

    DEFF Research Database (Denmark)

    Gudmundsson, H.

    2003-01-01

    Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts to support Sustainable European Transport Policies. The key message is that transport benchmarking has not yet been developed to cope with the challenges of this task. Rather than backing down completely, the paper suggests some critical conditions for applying and adopting benchmarking for this purpose. One way forward is to ensure a higher level of environmental integration in transport policy benchmarking. To this effect the paper will discuss the possible role of the so-called Transport and Environment Reporting Mechanism developed by the European Environment Agency. The paper provides an independent...

  13. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

    … in order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly ‘sustainable transport’ … is generally not advised. Several other ways in which benchmarking and policy can support one another are identified in the analysis. This leads to a range of recommended initiatives to exploit the benefits of benchmarking in transport while avoiding some of the lurking pitfalls and dead ends.

  14. Effective dose delivery in atmospheric pressure plasma jets for plasma medicine: a model predictive control approach

    Science.gov (United States)

    Gidon, Dogan; Graves, David B.; Mesbah, Ali

    2017-08-01

    Atmospheric pressure plasma jets (APPJs) have been identified as a promising tool for plasma medicine. This paper aims to demonstrate the importance of using model-based feedback control strategies for safe, reproducible, and therapeutically effective application of APPJs for dose delivery to a target substrate. Key challenges in model-based control of APPJs arise from: (i) the multivariable, nonlinear nature of system dynamics, (ii) the need for constraining the system operation within an operating region that ensures safe plasma treatment, and (iii) the cumulative, nondecreasing nature of dose metrics. To systematically address these challenges, we propose a model predictive control (MPC) strategy for real-time feedback control of a radio-frequency APPJ in argon. To this end, a lumped-parameter, physics-based model is developed for describing the jet dynamics. Cumulative dose metrics are defined for quantifying the thermal and nonthermal energy effects of the plasma on the substrate. The closed-loop performance of the MPC strategy is compared to that of a basic proportional-integral control system. Simulation results indicate that the MPC strategy provides a versatile framework for dose delivery in the presence of disturbances, while the safety and practical constraints of the APPJ operation can be systematically handled. Model-based feedback control strategies can lead to unprecedented opportunities for effective dose delivery in plasma medicine.
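
    A minimal receding-horizon sketch of the idea: a simple first-order thermal model, a nondecreasing cumulative dose metric, and at every step an optimization of the next few power settings subject to a temperature constraint, of which only the first move is applied. The model, constraint values and dose definition are toy assumptions, not the lumped-parameter APPJ model of the paper.

      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical first-order thermal model of the substrate under an APPJ
      A, B, T_AMB = 0.9, 2.0, 25.0        # model coefficients (assumed)
      T_MAX = 42.5                        # safety constraint on substrate temperature, C
      DOSE_TARGET = 10.0                  # desired cumulative thermal dose, arbitrary units
      HORIZON, STEPS, DT = 8, 40, 1.0

      def simulate(T, u_seq):
          """Roll the model forward, returning temperatures and accumulated dose."""
          temps, dose = [], 0.0
          for u in u_seq:
              T = A * T + B * u + (1 - A) * T_AMB
              temps.append(T)
              dose += max(T - T_AMB, 0.0) * DT / 10.0    # toy nondecreasing dose metric
          return np.array(temps), dose

      def mpc_step(T, dose_so_far, steps_left):
          """Receding-horizon move: pick the power sequence minimising the dose
          shortfall at the end of treatment while penalising T > T_MAX."""
          h = min(HORIZON, steps_left)
          def cost(u_seq):
              temps, d = simulate(T, u_seq)
              remaining = max(steps_left - h, 0)
              projected = dose_so_far + d * (1 + remaining / h)   # crude tail estimate
              return ((projected - DOSE_TARGET) ** 2
                      + 50.0 * np.sum(np.maximum(temps - T_MAX, 0.0) ** 2))
          res = minimize(cost, x0=np.full(h, 0.5), bounds=[(0.0, 1.0)] * h)
          return res.x[0]                                         # apply only the first move

      T, dose = 25.0, 0.0
      for k in range(STEPS):
          u = mpc_step(T, dose, STEPS - k)
          (T,), d = simulate(T, [u])
          dose += d
      print(f"final temperature {T:.1f} C, delivered dose {dose:.2f} (target {DOSE_TARGET})")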

  15. Stochastic versus deterministic kernel-based superposition approaches for dose calculation of intensity-modulated arcs

    Science.gov (United States)

    Tang, Grace; Earl, Matthew A.; Luan, Shuang; Wang, Chao; Cao, Daliang; Yu, Cedric X.; Naqvi, Shahid A.

    2008-09-01

    Dose calculations for radiation arc therapy are traditionally performed by approximating continuous delivery arcs with multiple static beams. For 3D conformal arc treatments, the shape and weight variation per degree is usually small enough to allow arcs to be approximated by static beams separated by 5°-10°. But with intensity-modulated arc therapy (IMAT), the variation in shape and dose per degree can be large enough to require a finer angular spacing. With the increase in the number of beams, a deterministic dose calculation method, such as collapsed-cone convolution/superposition, will require proportionally longer computational times, which may not be practical clinically. We propose to use a homegrown Monte Carlo kernel-superposition technique (MCKS) to compute doses for rotational delivery. The IMAT plans were generated with 36 static beams, which were subsequently interpolated into finer angular intervals for dose calculation to mimic the continuous arc delivery. Since MCKS uses random sampling of photons, the dose computation time only increased insignificantly for the interpolated-static-beam plans that may involve up to 720 beams. Ten past IMRT cases were selected for this study. Each case took approximately 15-30 min to compute on a single CPU running Mac OS X using the MCKS method. The need for a finer beam spacing is dictated by how fast the beam weights and aperture shapes change between the adjacent static planning beam angles. MCKS, however, obviates the concern by allowing hundreds of beams to be calculated in practically the same time as for a few beams. For more than 43 beams, MCKS usually takes less CPU time than the collapsed-cone algorithm used by the Pinnacle3 planning system.

  16. Agonist trigger: what is the best approach? Agonist trigger and low dose hCG

    DEFF Research Database (Denmark)

    Humaidan, Peter; Al Humaidan, Peter Samir Heskjær

    2012-01-01

    Low-dose hCG supplementation after GnRH agonist trigger may normalize reproductive outcome while minimizing the occurrence of OHSS in high risk IVF patients. (Fertil Steril (R) 2012;97:529-30. (C) 2012 by American Society for Reproductive Medicine.)

  17. Low energy isomers of (H2O)25 from a hierarchical method based on Monte Carlo temperature basin paving and molecular tailoring approaches benchmarked by MP2 calculations.

    Science.gov (United States)

    Sahu, Nityananda; Gadre, Shridhar R; Rakshit, Avijit; Bandyopadhyay, Pradipta; Miliordos, Evangelos; Xantheas, Sotiris S

    2014-10-28

    We report new global minimum candidate structures for the (H2O)25 cluster that are lower in energy than the ones reported previously and correspond to hydrogen bonded networks with 42 hydrogen bonds and an interior, fully coordinated water molecule. These were obtained as a result of a hierarchical approach based on initial Monte Carlo Temperature Basin Paving sampling of the cluster's Potential Energy Surface with the Effective Fragment Potential, subsequent geometry optimization using the Molecular Tailoring Approach with the fragments treated at the second order Møller-Plesset (MP2) perturbation (MTA-MP2) and final refinement of the entire cluster at the MP2 level of theory. The MTA-MP2 optimized cluster geometries, constructed from the fragments, were found to be within <0.5 kcal/mol from the minimum geometries obtained from the MP2 optimization of the entire (H2O)25 cluster. In addition, the grafting of the MTA-MP2 energies yields electronic energies that are within <0.3 kcal/mol from the MP2 energies of the entire cluster while preserving their energy rank order. Finally, the MTA-MP2 approach was found to reproduce the MP2 harmonic vibrational frequencies, constructed from the fragments, quite accurately when compared to the MP2 ones of the entire cluster in both the HOH bending and the OH stretching regions of the spectra.

  18. Microsatellite allele dose and configuration establishment (MADCE): an integrated approach for genetic studies in allopolyploids

    NARCIS (Netherlands)

    Dijk, van T.; Noordijk, Y.; Dubos, T.; Bink, M.C.A.M.; Visser, R.G.F.; Weg, van de W.E.

    2012-01-01

    BACKGROUND: Genetic studies in allopolyploid plants are challenging because of the presence of similar sub-genomes, which leads to multiple alleles and complex segregation ratios. In this study, we describe a novel method for establishing the exact dose and configuration of microsatellite alleles fo

  19. Empirical Evaluation of Meta-Analytic Approaches for Nutrient and Health Outcome Dose-Response Data

    Science.gov (United States)

    Yu, Winifred W.; Schmid, Christopher H.; Lichtenstein, Alice H.; Lau, Joseph; Trikalinos, Thomas A.

    2013-01-01

    The objective of this study is to empirically compare alternative meta-analytic methods for combining dose-response data from epidemiological studies. We identified meta-analyses of epidemiological studies that analyzed the association between a single nutrient and a dichotomous outcome. For each topic, we performed meta-analyses of odds ratios…

  20. Interspecies extrapolation based on the RepDose database--a probabilistic approach.

    Science.gov (United States)

    Escher, Sylvia E; Batke, Monika; Hoffmann-Doerr, Simone; Messinger, Horst; Mangelsdorf, Inge

    2013-04-12

    Repeated dose toxicity studies from the RepDose database (DB) were used to determine interspecies differences for rats and mice. NOEL (no observed effect level) ratios based on systemic effects were investigated for three different types of exposure: inhalation, oral food/drinking water and oral gavage. Furthermore, NOEL ratios for local effects in inhalation studies were evaluated. On the basis of the NOEL ratio distributions, interspecies assessment factors (AF) are evaluated. All data sets were best described by a lognormal distribution. No difference was seen between inhalation and oral exposure for systemic effects. Rats and mice were on average equally sensitive at equipotent doses with geometric mean (GM) values of 1 and geometric standard deviation (GSD) values ranging from 2.30 to 3.08. The local AF based on inhalation exposure resulted in a similar distribution with GM values of 1 and GSD values between 2.53 and 2.70. Our analysis confirms former analyses on interspecies differences, including also dog and human data. Furthermore it supports the principle of allometric scaling according to caloric demand in the case that body doses are applied. In conclusion, an interspecies distribution animal/human with a GM equal to allometric scaling and a GSD of 2.5 was derived.
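
    Given a lognormal NOEL-ratio distribution summarized by its geometric mean (GM) and geometric standard deviation (GSD), an interspecies assessment factor covering a chosen fraction of that distribution follows a standard form. The coverage levels below are illustrative choices; GM = 1 and GSD = 2.5 are within the range reported in the abstract.

      import numpy as np
      from scipy.stats import norm

      GM, GSD = 1.0, 2.5    # geometric mean and geometric SD of the NOEL-ratio distribution

      def assessment_factor(coverage=0.95):
          """Factor covering a given fraction of a lognormal ratio distribution:
          AF = GM * GSD**z, with z the standard-normal quantile for the coverage."""
          z = norm.ppf(coverage)
          return GM * GSD ** z

      for p in (0.90, 0.95, 0.99):
          print(f"{int(p * 100)}% coverage: AF = {assessment_factor(p):.1f}")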

  1. Is the fixed-dose combination of telmisartan and hydrochlorothiazide a good approach to treat hypertension?

    Directory of Open Access Journals (Sweden)

    Marc P Maillard

    2007-07-01

    Marc P Maillard, Michel Burnier, Service of Nephrology, Department of Internal Medicine, Lausanne University Hospital, Switzerland. Abstract: Blockade of the renin-angiotensin system with selective AT1 receptor antagonists is recognized as an effective means to lower blood pressure in hypertensive patients. Among the class of AT1 receptor antagonists, telmisartan offers the advantage of a very long half-life. This enables blood pressure control over 24 hours using once-daily administration. The combination of telmisartan with hydrochlorothiazide is a logical step because numerous previous studies have demonstrated that sodium depletion enhances the antihypertensive efficacy of drugs interfering with the activity of the renin-angiotensin system (RAS). In accordance with past experience using similar compounds blocking the RAS, several controlled studies have now demonstrated that the fixed-dose combination of telmisartan/hydrochlorothiazide is superior in lowering blood pressure to either telmisartan or hydrochlorothiazide alone. Of clinical interest also is the observation that the excellent clinical tolerance of the angiotensin II receptor antagonist is not affected by the association of the low-dose thiazide. Thus telmisartan/hydrochlorothiazide is an effective and well-tolerated antihypertensive combination. Finally, the development of fixed-dose combinations should improve drug adherence because of the one-pill-a-day regimen. Keywords: telmisartan, hydrochlorothiazide, fixed-dose combinations, antihypertensive agent, safety, compliance

  2. A Probabilistic Approach to Uncertainty Analysis in NTPR Radiation Dose Assessments

    Science.gov (United States)

    2009-11-01

    Headquarters (HQ) Detachment, Enewetak Atoll .............................................. 115 5.1.1 Case Description and Cohort Participation Scenario...Administrative and Operations Detachments, Enewetak Atoll .......................... 122 5.2.1 Case Description and Cohort Participation Scenario...Figure 33. Comparison of the Dose Distribution from Probabilistic Analysis with Unbiased Film Badge Readings for the 7126th AU at Enewetak Atoll

  3. Photoneutron leakage from medical accelerators: a comprehensive approach to patient and personnel dose measurement

    Energy Technology Data Exchange (ETDEWEB)

    D' Errico, F.; Curzio, G. [Universita degli Studi di Pisa, Dipartimento di Costruzioni Meccaniche e Nucleari, Pisa (Italy)

    1992-07-01

    Simple and reliable techniques, based on the use of superheated drop neutron detectors (SDD's*), are presented for both medical accelerator personnel exposure monitoring, and the direct measurement of non therapeutic dose equivalent received by patients undergoing high-energy x-ray and electron treatment. (author)

  4. Fast analytical approach of application specific dose efficient spectrum selection for diagnostic CT imaging and PET attenuation correction

    Science.gov (United States)

    Rui, Xue; Jin, Yannan; FitzGerald, Paul F.; Wu, Mingye; Alessio, Adam M.; Kinahan, Paul E.; De Man, Bruno

    2016-11-01

    Computed tomography (CT) has been used for a variety of applications, two of which include diagnostic imaging and attenuation correction for PET or SPECT imaging. Ideally, the x-ray tube spectrum should be optimized for the specific application to minimize the patient radiation dose while still providing the necessary information. In this study, we proposed a projection-based analytic approach for the analysis of contrast, noise, and bias. Dose normalized contrast to noise ratio (CNRD), inverse noise normalized by dose (IND) and bias are used as evaluation metrics to determine the optimal x-ray spectrum. Our simulation investigated the dose efficiency of the x-ray spectrum ranging from 40 kVp to 200 kVp. Water cylinders with diameters of 15 cm, 24 cm, and 35 cm were used in the simulation to cover a variety of patient sizes. The effects of electronic noise and pre-patient copper filtration were also evaluated. A customized 24 cm CTDI-like phantom with 13 mm diameter inserts filled with iodine (10 mg ml-1), tantalum (10 mg ml-1), water, and PMMA was measured with both standard (1.5 mGy) and ultra-low (0.2 mGy) dose to verify the simulation results at tube voltages of 80, 100, 120, and 140 kVp. For contrast-enhanced diagnostic imaging, the simulation results indicated that for high dose without filtration, the optimal kVp for water contrast is approximately 100 kVp for a 15 cm water cylinder. However, the 60 kVp spectrum produces the highest CNRD for bone and iodine. The optimal kVp for tantalum has two selections: approximately 50 and 100 kVp. The kVp that maximizes CNRD increases when the object size increases. The trend in the CTDI phantom measurements agrees with the simulation results, which also agrees with previous studies. Copper filtration improved the dose efficiency for water and tantalum, but reduced the iodine and bone dose efficiency in a clinically-relevant range (70-140 kVp). Our study also shows that for CT-based attenuation

  5. The Army Pollution Prevention Program: Improving Performance Through Benchmarking.

    Science.gov (United States)

    1995-06-01

    questionnaire, which had with 30 questions addressing 12 key maintenance performance measures. The measures were selected to represent a balanced scorecard of...in practice. While the continuing concern for the hazardous waste stream is genu- ine and well-founded, the Army must seek a more balanced approach...exactly what the benchmarking proc- ess entails. For example, benchmarking is commonly confused with industrial tourism — simply visiting a partner’s site

  6. Benchmarking the next generation of homology inference tools

    OpenAIRE

    Saripella, Ganapathi Varma; Sonnhammer, Erik L.L.; Forslund, Kristoffer

    2016-01-01

    Motivation: Over the last decades, vast numbers of sequences were deposited in public databases. Bioinformatics tools allow homology and consequently functional inference for these sequences. New profile-based homology search tools have been introduced, allowing reliable detection of remote homologs, but have not been systematically benchmarked. To provide such a comparison, which can guide bioinformatics workflows, we extend and apply our previously developed benchmark approach to evaluate t...

  7. Benchmarking in academic pharmacy departments.

    Science.gov (United States)

    Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann

    2010-10-11

    Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.

  8. Benchmarking biofuels; Biobrandstoffen benchmarken

    Energy Technology Data Exchange (ETDEWEB)

    Croezen, H.; Kampman, B.; Bergsma, G.

    2012-03-15

    A sustainability benchmark for transport biofuels has been developed and used to evaluate the various biofuels currently on the market. For comparison, electric vehicles, hydrogen vehicles and petrol/diesel vehicles were also included. A range of studies as well as growing insight are making it ever clearer that biomass-based transport fuels may have just as big a carbon footprint as fossil fuels like petrol or diesel, or even bigger. At the request of Greenpeace Netherlands, CE Delft has brought together current understanding on the sustainability of fossil fuels, biofuels and electric vehicles, with particular focus on the performance of the respective energy carriers on three sustainability criteria, with the first weighing the heaviest: (1) Greenhouse gas emissions; (2) Land use; and (3) Nutrient consumption [Dutch] Greenpeace Nederland heeft CE Delft gevraagd een duurzaamheidsmeetlat voor biobrandstoffen voor transport te ontwerpen en hierop de verschillende biobrandstoffen te scoren. Voor een vergelijk zijn ook elektrisch rijden, rijden op waterstof en rijden op benzine of diesel opgenomen. Door onderzoek en voortschrijdend inzicht blijkt steeds vaker dat transportbrandstoffen op basis van biomassa soms net zoveel of zelfs meer broeikasgassen veroorzaken dan fossiele brandstoffen als benzine en diesel. CE Delft heeft voor Greenpeace Nederland op een rijtje gezet wat de huidige inzichten zijn over de duurzaamheid van fossiele brandstoffen, biobrandstoffen en elektrisch rijden. Daarbij is gekeken naar de effecten van de brandstoffen op drie duurzaamheidscriteria, waarbij broeikasgasemissies het zwaarst wegen: (1) Broeikasgasemissies; (2) Landgebruik; en (3) Nutriëntengebruik.

  9. Correlational effect size benchmarks.

    Science.gov (United States)

    Bosco, Frank A; Aguinis, Herman; Singh, Kulraj; Field, James G; Pierce, Charles A

    2015-03-01

    Effect size information is essential for the scientific enterprise and plays an increasingly central role in the scientific process. We extracted 147,328 correlations and developed a hierarchical taxonomy of variables reported in Journal of Applied Psychology and Personnel Psychology from 1980 to 2010 to produce empirical effect size benchmarks at the omnibus level, for 20 common research domains, and for an even finer grained level of generality. Results indicate that the usual interpretation and classification of effect sizes as small, medium, and large bear almost no resemblance to findings in the field, because distributions of effect sizes exhibit tertile partitions at values approximately one-half to one-third those intuited by Cohen (1988). Our results offer information that can be used for research planning and design purposes, such as producing better informed non-nil hypotheses and estimating statistical power and planning sample size accordingly. We also offer information useful for understanding the relative importance of the effect sizes found in a particular study in relationship to others and which research domains have advanced more or less, given that larger effect sizes indicate a better understanding of a phenomenon. Also, our study offers information about research domains for which the investigation of moderating effects may be more fruitful and provide information that is likely to facilitate the implementation of Bayesian analysis. Finally, our study offers information that practitioners can use to evaluate the relative effectiveness of various types of interventions. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  10. A benchmark of co-flow and cyclic deposition/etch approaches for the selective epitaxial growth of tensile-strained Si:P

    Science.gov (United States)

    Hartmann, J. M.; Veillerot, M.; Prévitali, B.

    2017-10-01

    We have compared co-flow and cyclic deposition/etch processes for the selective epitaxial growth of Si:P layers. High growth rates, relatively low resistivities and significant amounts of tensile strain (up to 10 nm min-1, 0.55 mOhm cm and a strain equivalent to 1.06% of substitutional C in Si:C layers) were obtained at 700 °C, 760 Torr with a co-flow approach and a SiH2Cl2 + PH3 + HCl chemistry. This approach was successfully used to thicken the sources and drains regions of n-type fin-shaped Field Effect Transistors. Meanwhile, the (Si2H6 + PH3/HCl + GeH4) CDE process evaluated yielded at 600 °C, 80 Torr even lower resistivities (0.4 mOhm cm, typically), at the cost however of the tensile strain which was lost due to (i) the incorporation of Ge atoms (1.5%, typically) into the lattice during the selective etch steps and (ii) a reduction by a factor of two of the P atomic concentration in CDE layers compared to that in layers grown in a single step (5 × 1020 cm-3 compared to 1021 cm-3).

  11. Benchmarking in water project analysis

    Science.gov (United States)

    Griffin, Ronald C.

    2008-11-01

    The with/without principle of cost-benefit analysis is examined for the possible bias that it brings to water resource planning. Theory and examples for this question are established. Because benchmarking against the demonstrably low without-project hurdle can detract from economic welfare and can fail to promote efficient policy, improvement opportunities are investigated. In lieu of the traditional, without-project benchmark, a second-best-based "difference-making benchmark" is proposed. The project authorizations and modified review processes instituted by the U.S. Water Resources Development Act of 2007 may provide for renewed interest in these findings.

  12. Experiences in Benchmarking of Autonomic Systems

    Science.gov (United States)

    Etchevers, Xavier; Coupaye, Thierry; Vachet, Guy

    Autonomic computing promises improvements of systems quality of service in terms of availability, reliability, performance, security, etc. However, little research and experimental results have so far demonstrated this assertion, nor provided proof of the return on investment stemming from the efforts that introducing autonomic features requires. Existing works in the area of benchmarking of autonomic systems can be characterized by their qualitative and fragmented approaches. Still a crucial need is to provide generic (i.e. independent from business, technology, architecture and implementation choices) autonomic computing benchmarking tools for evaluating and/or comparing autonomic systems from a technical and, ultimately, an economical point of view. This article introduces a methodology and a process for defining and evaluating factors, criteria and metrics in order to qualitatively and quantitatively assess autonomic features in computing systems. It also discusses associated experimental results on three different autonomic systems.

  13. Benchmarking: a method for continuous quality improvement in health.

    Science.gov (United States)

    Ettorchi-Tardy, Amina; Levif, Marie; Michel, Philippe

    2012-05-01

    Benchmarking, a management approach for implementing best practices at best cost, is a recent concept in the healthcare system. The objectives of this paper are to better understand the concept and its evolution in the healthcare sector, to propose an operational definition, and to describe some French and international experiences of benchmarking in the healthcare sector. To this end, we reviewed the literature on this approach's emergence in the industrial sector, its evolution, its fields of application and examples of how it has been used in the healthcare sector. Benchmarking is often thought to consist simply of comparing indicators and is not perceived in its entirety, that is, as a tool based on voluntary and active collaboration among several organizations to create a spirit of competition and to apply best practices. The key feature of benchmarking is its integration within a comprehensive and participatory policy of continuous quality improvement (CQI). Conditions for successful benchmarking focus essentially on careful preparation of the process, monitoring of the relevant indicators, staff involvement and inter-organizational visits. Compared to methods previously implemented in France (CQI and collaborative projects), benchmarking has specific features that set it apart as a healthcare innovation. This is especially true for healthcare or medical-social organizations, as the principle of inter-organizational visiting is not part of their culture. Thus, this approach will need to be assessed for feasibility and acceptability before it is more widely promoted.

  14. Quantitative approaches for health risk assessment of environmental pollutants : estimation of differences in sensitivity, relative potencies, and margins of exposure

    OpenAIRE

    Kalantari, Fereshteh

    2012-01-01

    Historically, quantitative health risk assessment of chemical substances is based on deterministic approaches. For a more realistic and informative health risk assessment, however, the variability and uncertainty inherent in measurements should be taken into consideration. The variability and uncertainty may be assessed by applying probabilistic methods when performing dose-response assessment, exposure assessment and risk characterization. The benchmark dose (BMD) method has b...

  15. A Benchmark for Banks’ Strategy in Online Presence – An Innovative Approach Based on Elements of Search Engine Optimization (SEO and Machine Learning Techniques

    Directory of Open Access Journals (Sweden)

    Camelia Elena CIOLAC

    2011-06-01

    Full Text Available This paper aims to offer a new decision tool to assist banks in evaluating their efficiency of Internet presence and in planning the IT investments towards gaining better Internet popularity. The methodology used in this paper goes beyond the simple website interface analysis and uses web crawling as a source for collecting website performance data and employed web technologies and servers. The paper complements this technical perspective with a proposed scorecard used to assess the efforts of banks in Internet presence that reflects the banks’ commitment to Internet as a distribution channel. An innovative approach based on Machine Learning Techniques, the K-Nearest Neighbor Algorithm, is proposed by the author to estimate the Internet Popularity that a bank is likely to achieve based on its size and efforts in Internet presence.

  16. Benchmarking result diversification in social image retrieval

    DEFF Research Database (Denmark)

    Ionescu, Bogdan; Popescu, Adrian; Müller, Henning

    2014-01-01

    This article addresses the issue of retrieval result diversification in the context of social image retrieval and discusses the results achieved during the MediaEval 2013 benchmarking. 38 runs and their results are described and analyzed in this text. A comparison of the use of expert vs....... crowdsourcing annotations shows that crowdsourcing results are slightly different and have higher inter observer differences but results are comparable at lower cost. Multimodal approaches have best results in terms of cluster recall. Manual approaches can lead to high precision but often lower diversity....... With this detailed results analysis we give future insights on this matter....

  17. A model-based approach of scatter dose contributions and efficiency of apron shielding for radiation protection in CT.

    Science.gov (United States)

    Weber, N; Monnin, P; Elandoy, C; Ding, S

    2015-12-01

    Given the contribution of scattered radiations to patient dose in CT, apron shielding is often used for radiation protection. In this study the efficiency of apron was assessed with a model-based approach of the contributions of the four scatter sources in CT, i.e. external scattered radiations from the tube and table, internal scatter from the patient and backscatter from the shielding. For this purpose, CTDI phantoms filled with thermoluminescent dosimeters were scanned without apron, and then with an apron at 0, 2.5 and 5 cm from the primary field. Scatter from the tube was measured separately in air. The scatter contributions were separated and mathematically modelled. The protective efficiency of the apron was low, only 1.5% in scatter dose reduction on average. The apron at 0 cm from the beam lowered the dose by 7.5% at the phantom bottom but increased the dose by 2% at the top (backscatter) and did not affect the centre. When the apron was placed at 2.5 or 5 cm, the results were intermediate to the one obtained with the shielding at 0 cm and without shielding. The apron effectiveness is finally limited to the small fraction of external scattered radiation.

  18. Learned Shrinkage Approach for Low-Dose Reconstruction in Computed Tomography

    Directory of Open Access Journals (Sweden)

    Joseph Shtok

    2013-01-01

    Full Text Available We propose a direct nonlinear reconstruction algorithm for Computed Tomography (CT, designed to handle low-dose measurements. It involves the filtered back-projection and adaptive nonlinear filtering in both the projection and the image domains. The filter is an extension of the learned shrinkage method by Hel-Or and Shaked to the case of indirect observations. The shrinkage functions are learned using a training set of reference CT images. The optimization is performed with respect to an error functional in the image domain that combines the mean square error with a gradient-based penalty, promoting image sharpness. Our numerical simulations indicate that the proposed algorithm can manage well with noisy measurements, allowing a dose reduction by a factor of 4, while reducing noise and streak artifacts in the FBP reconstruction, comparable to the performance of a statistically based iterative algorithm.

  19. Oral Postdialysis Cholecalciferol Supplementation in Patients on Maintenance Hemodialysis: A Dose-Response Approach

    Directory of Open Access Journals (Sweden)

    Eric Descombes

    2014-01-01

    Full Text Available The aim of the present study was to evaluate the dose of postdialysis cholecalciferol needed to maintain the 25-hydroxyvitamin D [25(OHD] levels in the optimal range of 75–150 nmol/L. Twenty-six patients who had low baseline 25(OHD levels (mean 27.5±14.9 nmol/L were studied. The 25(OHD levels were measured every 2 months for one year. During the first two months, all the patients received 2000 IU of cholecalciferol after each hemodialysis (=6000 IU/wk. Thereafter, the dose was individualized and adapted every 2 months by administering 1 to 6 cholecalciferol tablets (2000 IU each per week (total weekly dose = 2000–12000 IU/wk. During cholecalciferol supplementation, the 25(OHD concentrations rapidly increased from baseline to 140.1±28.3 nmol/L at month 6 and 95.6±20.9 nmol/L at month 12. At month twelve, 86% of the patients had 25(OHD levels within the target range with a mean dose of 5917±4106 IU/wk of cholecalciferol; however, the amount needed to maintain these levels varied widely from 0 (n=2 to 12000 IU/wk (n=5. In conclusion, postdialysis cholecalciferol prescription is quite effective in correcting vitamin D deficiency/insufficiency, but the amount of cholecalciferol needed to maintain the 25(OHD levels within the optimal range over the long-term varies widely among patients and must be individualized.

  20. Performance properties of the population bioequivalence approach for in vitro delivered dose for orally inhaled respiratory products.

    Science.gov (United States)

    Morgan, Beth; Strickland, Helen

    2014-01-01

    Regulatory agencies, industry, and academia have acknowledged that in vitro assessments serve a role in establishing bioequivalence for second-entry drug product approvals as well as innovator post-approval drug product changes. For orally inhaled respiratory products (OIPs), the issues of correctly analyzing in vitro data and interpreting the results within the broader context of therapeutic equivalence have garnered significant attention. One of the recommended statistical tests for in vitro data is the population bioequivalence method (PBE). The current literature for assessing the PBE statistical approach for in vitro data assumes a log normal distribution. This paper focuses on an assessment of that assumption for in vitro delivered dose. Concepts in development of a statistical model are presented. The PBE criterion and hypotheses are written for the case when data follows a normal distribution, rather than log normal. Results of a simulation study are reported, characterizing the performance of the PBE approach when data are expected to be normally distributed, rather than log normal. In these cases, decisions using the PBE approach are not consistent for the same absolute mean difference that the test product is from the reference product. A conclusion of inequivalency will occur more often if the test product dose is lower than the reference product for the same deviation from target. These features suggest that more research is needed for statistical equivalency approaches for in vitro data.

  1. Systems engineering approach for the reuse of metallic waste from NPP decommissioning and dose evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Hyung Woo; Kim, Chang Lak [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2017-03-15

    The oldest commercial reactor in South Korea, Kori-1 Nuclear Power Plant (NPP), will be shut down in 2017. Proper treatment for decommissioning wastes is one of the key factors to decommission a plant successfully. Particularly important is the recycling of clearance level or very low level radioactively contaminated metallic wastes, which contributes to waste minimization and the reduction of disposal volume. The aim of this study is to introduce a conceptual design of a recycle system and to evaluate the doses incurred through defined work flows. The various architecture diagrams were organized to define operational procedures and tasks. Potential exposure scenarios were selected in accordance with the recycle system, and the doses were evaluated with the RESRAD-RECYCLE computer code. By using this tool, the important scenarios and radionuclides as well as impacts of radionuclide characteristics and partitioning factors are analyzed. Moreover, dose analysis can be used to provide information on the necessary decontamination, radiation protection process, and allowable concentration limits for exposure scenarios.

  2. Water Level Superseded Benchmark Sheets

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Images of National Coast & Geodetic Survey (now NOAA's National Geodetic Survey/NGS) tidal benchmarks which have been superseded by new markers or locations....

  3. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p...... already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity. © IWA Publishing 2013....... and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work...

  4. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

    Order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable...... tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly ‘sustainable transport......’ evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly policies are not directly comparable across space and context. For these reasons attempting to benchmark ‘sustainable transport policies’ against one another would be a highly complex task, which...

  5. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

    Research on relative performance measures, transfer pricing, beyond budgeting initiatives, target costing, piece rates systems and value based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking...... the conditions upon which the market mechanism is performing within organizations. This paper aims to contribute to research by providing more insight to the conditions for the use of external benchmarking as an element in performance management in organizations. Our study explores a particular type of external...... towards the conditions for the use of the external benchmarks we provide more insights to some of the issues and challenges that are related to using this mechanism for performance management and advance competitiveness in organizations....

  6. Benchmarking Post-Hartree–Fock Methods To Describe the Nonlinear Optical Properties of Polymethines: An Investigation of the Accuracy of Algebraic Diagrammatic Construction (ADC) Approaches

    KAUST Repository

    Knippenberg, Stefan

    2016-10-07

    Third-order nonlinear optical (NLO) properties of polymethine dyes have been widely studied for applications such as all-optical switching. However, the limited accuracy of the current computational methodologies has prevented a comprehensive understanding of the nature of the lowest excited states and their influence on the molecular optical and NLO properties. Here, attention is paid to the lowest excited-state energies and their energetic ratio, as these characteristics impact the figure-of-merit for all-optical switching. For a series of model polymethines, we compare several algebraic diagrammatic construction (ADC) schemes for the polarization propagator with approximate second-order coupled cluster (CC2) theory, the widely used INDO/MRDCI approach and the symmetry-adapted cluster configuration interaction (SAC-CI) algorithm incorporating singles and doubles linked excitation operators (SAC-CI SD-R). We focus in particular on the ground-to-excited state transition dipole moments and the corresponding state dipole moments, since these quantities are found to be of utmost importance for an effective description of the third-order polarizability γ and two-photon absorption spectra. A sum-overstates expression has been used, which is found to quickly converge. While ADC(3/2) has been found to be the most appropriate method to calculate these properties, CC2 performs poorly.

  7. Benchmarking Post-Hartree-Fock Methods To Describe the Nonlinear Optical Properties of Polymethines: An Investigation of the Accuracy of Algebraic Diagrammatic Construction (ADC) Approaches.

    Science.gov (United States)

    Knippenberg, Stefan; Gieseking, Rebecca L; Rehn, Dirk R; Mukhopadhyay, Sukrit; Dreuw, Andreas; Brédas, Jean-Luc

    2016-11-08

    Third-order nonlinear optical (NLO) properties of polymethine dyes have been widely studied for applications such as all-optical switching. However, the limited accuracy of the current computational methodologies has prevented a comprehensive understanding of the nature of the lowest excited states and their influence on the molecular optical and NLO properties. Here, attention is paid to the lowest excited-state energies and their energetic ratio, as these characteristics impact the figure-of-merit for all-optical switching. For a series of model polymethines, we compare several algebraic diagrammatic construction (ADC) schemes for the polarization propagator with approximate second-order coupled cluster (CC2) theory, the widely used INDO/MRDCI approach and the symmetry-adapted cluster configuration interaction (SAC-CI) algorithm incorporating singles and doubles linked excitation operators (SAC-CI SD-R). We focus in particular on the ground-to-excited state transition dipole moments and the corresponding state dipole moments, since these quantities are found to be of utmost importance for an effective description of the third-order polarizability γ and two-photon absorption spectra. A sum-overstates expression has been used, which is found to quickly converge. While ADC(3/2) has been found to be the most appropriate method to calculate these properties, CC2 performs poorly.

  8. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

    as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advances. However, whereas extant research primarily has focused on the importance and effects of using external benchmarks, less attention has been directed towards...... towards the conditions for the use of the external benchmarks we provide more insights to some of the issues and challenges that are related to using this mechanism for performance management and advance competitiveness in organizations....

  9. The formulation and implementation of benchmarking management approaches in Xinj ulong Company%新巨龙公司全面对标管理办法的制定与实施

    Institute of Scientific and Technical Information of China (English)

    刘文; 张燕

    2013-01-01

    介绍了山东新巨龙能源有限责任公司在面临矿井开采条件恶劣、宏观经济下滑、市场竞争日益激烈的严峻挑战下,通过分析对标管理的目的及意义,建立对标指标体系和对标管理体系,保证了对标的效果,实现了企业各项工作的稳定健康发展。%Under severe underground mining conditions,macroeconomic decline and fierce market competition,Shandong new dragon energy company established the benchmarking index system and benchmarking management system by analyzing the purpose and meaning of bench-marking management,which ensured the benchmarking results and achieved stable and healthy development in the company.

  10. Developing scheduling benchmark tests for the Space Network

    Science.gov (United States)

    Moe, Karen L.; Happell, Nadine; Brady, Sean

    1993-01-01

    A set of benchmark tests were developed to analyze and measure Space Network scheduling characteristics and to assess the potential benefits of a proposed flexible scheduling concept. This paper discusses the role of the benchmark tests in evaluating alternative flexible scheduling approaches and defines a set of performance measurements. The paper describes the rationale for the benchmark tests as well as the benchmark components, which include models of the Tracking and Data Relay Satellite (TDRS), mission spacecraft, their orbital data, and flexible requests for communication services. Parameters which vary in the tests address the degree of request flexibility, the request resource load, and the number of events to schedule. Test results are evaluated based on time to process and schedule quality. Preliminary results and lessons learned are addressed.

  11. Randomized benchmarking in measurement-based quantum computing

    Science.gov (United States)

    Alexander, Rafael N.; Turner, Peter S.; Bartlett, Stephen D.

    2016-09-01

    Randomized benchmarking is routinely used as an efficient method for characterizing the performance of sets of elementary logic gates in small quantum devices. In the measurement-based model of quantum computation, logic gates are implemented via single-site measurements on a fixed universal resource state. Here we adapt the randomized benchmarking protocol for a single qubit to a linear cluster state computation, which provides partial, yet efficient characterization of the noise associated with the target gate set. Applying randomized benchmarking to measurement-based quantum computation exhibits an interesting interplay between the inherent randomness associated with logic gates in the measurement-based model and the random gate sequences used in benchmarking. We consider two different approaches: the first makes use of the standard single-qubit Clifford group, while the second uses recently introduced (non-Clifford) measurement-based 2-designs, which harness inherent randomness to implement gate sequences.

  12. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  13. Heart region segmentation from low-dose CT scans: an anatomy based approach

    Science.gov (United States)

    Reeves, Anthony P.; Biancardi, Alberto M.; Yankelevitz, David F.; Cham, Matthew D.; Henschke, Claudia I.

    2012-02-01

    Cardiovascular disease is a leading cause of death in developed countries. The concurrent detection of heart diseases during low-dose whole-lung CT scans (LDCT), typically performed as part of a screening protocol, hinges on the accurate quantification of coronary calcification. The creation of fully automated methods is ideal as complete manual evaluation is imprecise, operator dependent, time consuming and thus costly. The technical challenges posed by LDCT scans in this context are mainly twofold. First, there is a high level image noise arising from the low radiation dose technique. Additionally, there is a variable amount of cardiac motion blurring due to the lack of electrocardiographic gating and the fact that heart rates differ between human subjects. As a consequence, the reliable segmentation of the heart, the first stage toward the implementation of morphologic heart abnormality detection, is also quite challenging. An automated computer method based on a sequential labeling of major organs and determination of anatomical landmarks has been evaluated on a public database of LDCT images. The novel algorithm builds from a robust segmentation of the bones and airways and embodies a stepwise refinement starting at the top of the lungs where image noise is at its lowest and where the carina provides a good calibration landmark. The segmentation is completed at the inferior wall of the heart where extensive image noise is accommodated. This method is based on the geometry of human anatomy and does not involve training through manual markings. Using visual inspection by an expert reader as a gold standard, the algorithm achieved successful heart and major vessel segmentation in 42 of 45 low-dose CT images. In the 3 remaining cases, the cardiac base was over segmented due to incorrect hemidiaphragm localization.

  14. How to achieve and prove performance improvement - 15 years of experience in German wastewater benchmarking.

    Science.gov (United States)

    Bertzbach, F; Franz, T; Möller, K

    2012-01-01

    This paper shows the results of performance improvement, which have been achieved in benchmarking projects in the wastewater industry in Germany over the last 15 years. A huge number of changes in operational practice and also in achieved annual savings can be shown, induced in particular by benchmarking at process level. Investigation of this question produces some general findings for the inclusion of performance improvement in a benchmarking project and for the communication of its results. Thus, we elaborate on the concept of benchmarking at both utility and process level, which is still a necessary distinction for the integration of performance improvement into our benchmarking approach. To achieve performance improvement via benchmarking it should be made quite clear that this outcome depends, on one hand, on a well conducted benchmarking programme and, on the other, on the individual situation within each participating utility.

  15. BENCHMARKING ON-LINE SERVICES INDUSTRIES

    Institute of Scientific and Technical Information of China (English)

    John HAMILTON

    2006-01-01

    The Web Quality Analyser (WQA) is a new benchmarking tool for industry. It hasbeen extensively tested across services industries. Forty five critical success features are presented as measures that capture the user's perception of services industry websites. This tool differs to previous tools, in that it captures the information technology (IT) related driver sectors of website performance, along with the marketing-services related driver sectors. These driver sectors capture relevant structure, function and performance components.An 'on-off' switch measurement approach determines each component. Relevant component measures scale into a relative presence of the applicable feature, with a feature block delivering one of the sector drivers. Although it houses both measurable and a few subjective components, the WQA offers a proven and useful means to compare relevant websites.The WQA defines website strengths and weaknesses, thereby allowing for corrections to the website structure of the specific business. WQA benchmarking against services related business competitors delivers a position on the WQA index, facilitates specific website driver rating comparisons, and demonstrates where key competitive advantage may reside. This paper reports on the marketing-services driver sectors of this new benchmarking WQA tool.

  16. On the combination of c- and D-optimal designs: General approaches and applications in dose-response studies.

    Science.gov (United States)

    Holland-Letz, Tim

    2017-03-01

    Dose-response modeling in areas such as toxicology is often conducted using a parametric approach. While estimation of parameters is usually one of the goals, often the main aim of the study is the estimation of quantities derived from the parameters, such as the ED50 dose. From the view of statistical optimal design theory such an objective corresponds to a c-optimal design criterion. Unfortunately, c-optimal designs often create practical problems, and furthermore commonly do not allow actual estimation of the parameters. It is therefore useful to consider alternative designs which show good c-performance, while still being applicable in practice and allowing reasonably good general parameter estimation. In effect, using optimal design terminology this means that a reasonable performance regarding the D-criterion is expected as well. In this article, we propose several approaches to the task of combining c- and D-efficient designs, such as using mixed information functions or setting minimum requirements regarding either c- or D-efficiency, and show how to algorithmically determine optimal designs in each case. We apply all approaches to a standard situation from toxicology, and obtain a much better balance between c- and D-performance. Next, we investigate how to adapt the designs to different parameter values. Finally, we show that the methodology used here is not just limited to the combination of c- and D-designs, but can also be used to handle more general constraint situations such as limits on the cost of an experiment.

  17. Prediction of powdered activated carbon doses for 2-MIB removal in drinking water treatment using a simplified HSDM approach.

    Science.gov (United States)

    Yu, Jianwei; Yang, Fong-Chen; Hung, Wei-Nung; Liu, Chia-Ling; Yang, Min; Lin, Tsair-Fuh

    2016-08-01

    The addition of powdered activated carbon (PAC) is an effective measure to cope with seasonal taste and odor (T&O) problems caused by 2-methylisoborneol (2-MIB) and trans-1, 10-dimethyl-trans-9-decalol (geosmin) in drinking water. Some T&O problems are episodic in nature, and generally require rapid responses. This paper proposed a simplified approach for the application of the homogenous surface diffusion model (HSDM) to predict the appropriate PAC doses for the removal of 2-MIB. Equilibrium and kinetic experiments were performed for 2-MIB adsorption onto five PACs in three source waters. The simplified HSDM approach was compared with the experimental data, by assigning the Freundlich 1/n value in the range of 0.1-1.0 and obtaining the Freundlich equilibrium parameter K value through a 6-hr adsorption kinetic test. The model describes the kinetic adsorption data very well for all of the tested PACs in different source waters. The results were validated using the data obtained from one full scale water treatment plant, and the differences between the predicted and observed results were within 10% range. This simplified HSDM approach may be applied for the rapid determination of PAC doses for water treatment plants when faced with 2-MIB episodes in source waters.

  18. Multitargeted Low-Dose GLAD Combination Chemoprevention: A Novel and Promising Approach to Combat Colon Carcinogenesis

    Directory of Open Access Journals (Sweden)

    Altaf Mohammed

    2013-05-01

    Full Text Available Preclinical studies have shown that gefitinib, licofelone, atorvastatin, and α-difluoromethylornithine (GLAD are promising colon cancer chemopreventive agents. Because low-dose combination regimens can offer potential additive or synergistic effects without toxicity, GLAD combination was tested for toxicity and chemopreventive efficacy for suppression of intestinal tumorigenesis in adenomatous polyposis coli (APCMin/+ mice. Six-week-old wild-type and APCMin/+ mice were fed modified American Institute of Nutrition 76A diets with or without GLAD (25 + 50 + 50 + 500 ppm for 14 weeks. Dietary GLAD caused no signs of toxicity based on organ pathology and liver enzyme profiles. GLAD feeding strongly inhibited (80–83%, P 95% fewer polyps with sizes of >2 mm compared with control mice and showed 75% and 85% inhibition of colonic tumors in males and females, respectively. Molecular analyses of polyps suggested that GLAD exerts efficacy by inhibiting cell proliferation, inducing apoptosis, decreasing β-catenin and caveolin-1 levels, increasing caspase-3 cleavage and p21, and modulating expression profile of inflammatory cytokines. These observations demonstrate that GLAD, a novel cocktail of chemopreventive agents at very low doses, suppresses intestinal tumorigenesis in APCMin/+ mice with no toxicity. This novel strategy to prevent colorectal cancer is an important step in developing agents with high efficacy without unwanted side effects.

  19. New approach to explain results of the low dose radiation on the Raphanus sativus

    Energy Technology Data Exchange (ETDEWEB)

    Kurisu, Y.; Yoshioka, K.; Yoshida, S.; Murata, I.; Takahashi, A. [Osaka University, Graduate School of Engineering, Department of Nuclear Engineering, Suita, Osaka (Japan)

    2002-01-01

    Recently, the researches on radiation hormesis toward the animals and plants are made abundantly. The radiation hormesis effect is that subharmful doses of radiation may evoke a stimulatory response in any organism. We did irradiation experiments of fusion (DD and DT) neutron, thermal and fast neutron, and 60-cobalt gamma-ray to the dry seeds of Raphanus stivus, and examined whether radiation hormesis effects appeared by measuring germination rate, the length of a hypocotyl and a root and the total weight on the 7th day from starring cultivation. The evaluation of radiation hormesis effects was done by using relative effectiveness which is the ratio of the mean of the measurement objects of the irradiation group to that of non-irradiation group. In the Raphanus stivus the radiation hormesis effects of the measured objects were only turned up in seed groups irradiated by the fusion (D-T) neutron. We have confirmed that absorbed dose range where the effects are revealed is from 1 cGy to 10 Gy and there the relative effectiveness is from 1.05 to 1.25. In this research the model about radiation hormesis effect on Raphanus sativus confirmed in irradiation of D-T neutrons is proposed. And it is apparent that radiation from radio activated seeds influences hormesis effect on Raphanus sativus. (author)

  20. High dose misonidazole with dexamethasone rescue: a possible approach to circumvent neurotoxicity

    Energy Technology Data Exchange (ETDEWEB)

    Urtasun, R.C.; Tanasichuk, H.; Fulton, D.; Agboola, O.; Turner, A.R.; Koziol, D.; Raleigh, J.

    1982-03-01

    With a view of modifying misonidazole (MISO) neurotoxicity, we initiated a randomized clinical study to assess a possible drug interaction and toxicity protection when dexamethasone (DXM) is administered concomitantly with MISO. The ongoing study consists of: 1. Pharamacokinetic evaluation; 2. Assessment of toxicity. Fourteen patients undergoing radiation therapy for different types of malignant neoplasia (excluding brain tumors) have been randomized to receive either MISO alone, or DXM one week prior and during treatment with MISO. Five of seven patients receiving MISO alone developed peripheral neuropathies while only one out of 7 patients that received MISO with DXM coverage developed a transient and mild neuropathy. Pharmacokinetic evaluation of MISO in plasma and urine of those patients receiving DXM has shown no evidence of drug interaction. It is postulated that the mechanism of action of DXM is at the nerve cell membrane level, restoring and stabilizing cell surface properties. In future studies we will investigate the use of DXM with increasing doses of MISO above the recommended maximum dose of 12 gm/m/sup 2/, hoping to achieve a higher tumor tissue level of MISO while avoiding unacceptable toxicity. The effect of Allopurinol on the plasma kinetics of MISO was studied in four additional patients, observing also no evidence of drug interaction.

  1. Simulation of dose deposition in stereotactic synchrotron radiation therapy: a fast approach combining Monte Carlo and deterministic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Smekens, F; Freud, N; Letang, J M; Babot, D [CNDRI (Nondestructive Testing using Ionizing Radiations) Laboratory, INSA-Lyon, 69621 Villeurbanne Cedex (France); Adam, J-F; Elleaume, H; Esteve, F [INSERM U-836, Equipe 6 ' Rayonnement Synchrotron et Recherche Medicale' , Institut des Neurosciences de Grenoble (France); Ferrero, C; Bravin, A [European Synchrotron Radiation Facility, Grenoble (France)], E-mail: francois.smekens@insa-lyon.fr

    2009-08-07

    A hybrid approach, combining deterministic and Monte Carlo (MC) calculations, is proposed to compute the distribution of dose deposited during stereotactic synchrotron radiation therapy treatment. The proposed approach divides the computation into two parts: (i) the dose deposited by primary radiation (coming directly from the incident x-ray beam) is calculated in a deterministic way using ray casting techniques and energy-absorption coefficient tables and (ii) the dose deposited by secondary radiation (Rayleigh and Compton scattering, fluorescence) is computed using a hybrid algorithm combining MC and deterministic calculations. In the MC part, a small number of particle histories are simulated. Every time a scattering or fluorescence event takes place, a splitting mechanism is applied, so that multiple secondary photons are generated with a reduced weight. The secondary events are further processed in a deterministic way, using ray casting techniques. The whole simulation, carried out within the framework of the Monte Carlo code Geant4, is shown to converge towards the same results as the full MC simulation. The speed of convergence is found to depend notably on the splitting multiplicity, which can easily be optimized. To assess the performance of the proposed algorithm, we compare it to state-of-the-art MC simulations, accelerated by the track length estimator technique (TLE), considering a clinically realistic test case. It is found that the hybrid approach is significantly faster than the MC/TLE method. The gain in speed in a test case was about 25 for a constant precision. Therefore, this method appears to be suitable for treatment planning applications.

  2. A web application for evaluating Phase I methods using a non-parametric optimal benchmark.

    Science.gov (United States)

    Wages, Nolan A; Varhegyi, Nikole

    2017-06-01

    In evaluating the performance of Phase I dose-finding designs, simulation studies are typically conducted to assess how often a method correctly selects the true maximum tolerated dose under a set of assumed dose-toxicity curves. A necessary component of the evaluation process is to have some concept for how well a design can possibly perform. The notion of an upper bound on the accuracy of maximum tolerated dose selection is often omitted from the simulation study, and the aim of this work is to provide researchers with accessible software to quickly evaluate the operating characteristics of Phase I methods using a benchmark. The non-parametric optimal benchmark is a useful theoretical tool for simulations that can serve as an upper limit for the accuracy of maximum tolerated dose identification based on a binary toxicity endpoint. It offers researchers a sense of the plausibility of a Phase I method's operating characteristics in simulation. We have developed an R shiny web application for simulating the benchmark. The web application has the ability to quickly provide simulation results for the benchmark and requires no programming knowledge. The application is free to access and use on any device with an Internet browser. The application provides the percentage of correct selection of the maximum tolerated dose and an accuracy index, operating characteristics typically used in evaluating the accuracy of dose-finding designs. We hope this software will facilitate the use of the non-parametric optimal benchmark as an evaluation tool in dose-finding simulation.

  3. Sustainable value assessment of farms using frontier efficiency benchmarks.

    Science.gov (United States)

    Van Passel, Steven; Van Huylenbroeck, Guido; Lauwers, Ludwig; Mathijs, Erik

    2009-07-01

    Appropriate assessment of firm sustainability facilitates actor-driven processes towards sustainable development. The methodology in this paper builds further on two proven methodologies for the assessment of sustainability performance: it combines the sustainable value approach with frontier efficiency benchmarks. The sustainable value methodology tries to relate firm performance to the use of different resources. This approach assesses contributions to corporate sustainability by comparing firm resource productivity with the resource productivity of a benchmark, and this for all resources considered. The efficiency is calculated by estimating the production frontier indicating the maximum feasible production possibilities. In this research, the sustainable value approach is combined with efficiency analysis methods to benchmark sustainability assessment. In this way, the production theoretical underpinnings of efficiency analysis enrich the sustainable value approach. The methodology is presented using two different functional forms: the Cobb-Douglas and the translog functional forms. The simplicity of the Cobb-Douglas functional form as benchmark is very attractive but it lacks flexibility. The translog functional form is more flexible but has the disadvantage that it requires a lot of data to avoid estimation problems. Using frontier methods for deriving firm specific benchmarks has the advantage that the particular situation of each company is taken into account when assessing sustainability. Finally, we showed that the methodology can be used as an integrative sustainability assessment tool for policy measures.

  4. Assessing Organ Doses from Paediatric CT Scans—A Novel Approach for an Epidemiology Study (the EPI-CT Study

    Directory of Open Access Journals (Sweden)

    Steven L. Simon

    2013-02-01

    Full Text Available The increasing worldwide use of paediatric computed tomography (CT has led to increasing concerns regarding the subsequent effects of exposure to radiation. In response to this concern, the international EPI-CT project was developed to study the risk of cancer in a large multi-country cohort. In radiation epidemiology, accurate estimates of organ-specific doses are essential. In EPI-CT, data collection is split into two time periods—before and after introduction of the Picture Archiving Communication System (PACS introduced in the 1990s. Prior to PACS, only sparse information about scanner settings is available from radiology departments. Hence, a multi-level approach was developed to retrieve information from a questionnaire, surveys, scientific publications, and expert interviews. For the years after PACS was introduced, scanner settings will be extracted from Digital Imaging and Communications in Medicine (DICOM headers, a protocol for storing medical imaging data. Radiation fields and X-ray interactions within the body will be simulated using phantoms of various ages and Monte-Carlo-based radiation transport calculations. Individual organ doses will be estimated for each child using an accepted calculation strategy, scanner settings, and the radiation transport calculations. Comprehensive analyses of missing and uncertain dosimetry data will be conducted to provide uncertainty distributions of doses.

  5. Two computational approaches for Monte Carlo based shutdown dose rate calculation with applications to the JET fusion machine

    Energy Technology Data Exchange (ETDEWEB)

    Petrizzi, L.; Batistoni, P.; Migliori, S. [Associazione EURATOM ENEA sulla Fusione, Frascati (Roma) (Italy); Chen, Y.; Fischer, U.; Pereslavtsev, P. [Association FZK-EURATOM Forschungszentrum Karlsruhe (Germany); Loughlin, M. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, Oxfordshire, OX (United Kingdom); Secco, A. [Nice Srl Via Serra 33 Camerano Casasco AT (Italy)

    2003-07-01

    In deuterium-deuterium (D-D) and deuterium-tritium (D-T) fusion plasmas, neutrons are produced, causing activation of JET machine components. For safe operation and maintenance it is important to be able to predict the induced activation and the resulting shutdown dose rates. This requires a suitable system of codes which is capable of simulating both the neutron induced material activation during operation and the decay gamma radiation transport after shut-down in the proper 3-D geometry. Two methodologies to calculate the dose rate in fusion devices have been developed recently and applied to fusion machines, both using the MCNP Monte Carlo code. FZK has developed a more classical approach, the rigorous 2-step (R2S) system in which MCNP is coupled to the FISPACT inventory code with an automated routing. ENEA, in collaboration with the ITER Team, has developed an alternative approach, the direct 1-step method (D1S). Neutron and decay gamma transport are handled in one single MCNP run, using an ad hoc cross section library. The intention was to tightly couple the neutron induced production of a radio-isotope and the emission of its decay gammas for an accurate spatial distribution and a reliable calculated statistical error. The two methods have been used by the two Associations to calculate the dose rate at five positions in the JET machine, two inside the vacuum chamber and three outside, at cooling times between 1 second and 1 year after shutdown. The same MCNP model and irradiation conditions have been assumed. The exercise has been proposed and financed in the frame of the Fusion Technological Program of the JET machine. The aim is to supply designers with the most reliable tools and data for calculating dose rates in fusion machines. Results showed good agreement: the differences range between 5 and 35%. The next step to be considered in 2003 will be an exercise in which the comparison will be done with dose-rate data from JET taken during and

  6. Benchmarking of human resources management

    Directory of Open Access Journals (Sweden)

    David M. Akinnusi

    2008-12-01

    Full Text Available This paper reviews the role of human resource management (HRM which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.

  7. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  8. Modeling solar cell degradation in space: a comparison of the NRL displacement damage dose and the JPL equivalent fluence approaches

    Energy Technology Data Exchange (ETDEWEB)

    Messenger, S.R. [SFA Inc., Largo, MD (United States); Summers, G.P. [US Naval Research Laboratory, Washington, DC (United States); University of Maryland, Baltimore, MD (United States); Walters, R.J.; Xapsos, M.A. [US Naval Research Laboratory, Washington, DC (United States); Burke, E.A.

    2001-07-01

    The method for predicting solar cell degradation in space radiation environments developed recently at the US Naval Research Laboratory (NRL) is compared in detail with the earlier method developed at the US Jet Propulsion Laboratory (JPL). Although both methods are similar, the key difference is that in the NRL approach, the energy dependence of the damage coefficients is determined from a calculation of the nonionizing energy loss (NIEL) and requires relatively few experimental measurements, whereas in the JPL method the damage coefficients have to be determined using an extensive set of experimental measurements. The end result of the NRL approach is a determination of a single characteristic degradation curve for a cell technology, which is measured against displacement damage dose rather than fluence. The end-of-life (EOL) cell performance for a particular mission can be read from the characteristic curve once the displacement damage dose for the mission has been determined. In the JPL method, the end result is a determination of the equivalent 1 MeV electron fluence, which would cause the same level of degradation as the actual space environment. The two approaches give similar results for GaAs/Ge solar cells, for which a large database exists. Because the NRL method requires far less experimental data than the JPL method, it is more readily applied to emerging cell technologies for which extensive radiation measurements are not available. The NRL approach is being incorporated into a code named SAVANT by researchers at NASA Glenn Research Center. The predictions of SAVANT are shown to agree closely with actual space data for GaAs/Ge and CuInSe{sub 2} cells flown on the Equator-S mission. (author)
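
    The bookkeeping behind the displacement damage dose method can be sketched in a few lines: the particle spectrum is collapsed to a single dose by weighting fluence with NIEL, and end-of-life performance is read from one characteristic degradation curve. The semi-logarithmic curve shape below is the form commonly fitted in this approach, but the constants and the two-bin spectrum are placeholders, not values from the paper (Python with NumPy assumed).

```python
import numpy as np

def displacement_damage_dose(fluence, niel):
    """Dd = sum over the spectrum of fluence(E) * NIEL(E).

    fluence: particles/cm^2 in each energy bin
    niel:    nonionizing energy loss for the same bins [MeV*cm^2/g]
    Returns the displacement damage dose in MeV/g.
    """
    return float(np.sum(np.asarray(fluence) * np.asarray(niel)))

def remaining_factor(dd, c=0.2, dx=1e9):
    """Characteristic curve P/P0 = 1 - C*log10(1 + Dd/Dx); C and Dx are
    placeholder fit parameters for a given cell technology."""
    return 1.0 - c * np.log10(1.0 + dd / dx)

# Illustrative two-bin proton spectrum (numbers invented for the example)
dd = displacement_damage_dose(fluence=[1e11, 5e10], niel=[4e-3, 6e-4])
print(dd, remaining_factor(dd))
```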

  9. A multiscale Bayesian data integration approach for mapping air dose rates around the Fukushima Daiichi Nuclear Power Plant.

    Science.gov (United States)

    Wainwright, Haruko M; Seki, Akiyuki; Chen, Jinsong; Saito, Kimiaki

    2017-02-01

    This paper presents a multiscale data integration method to estimate the spatial distribution of air dose rates in the regional scale around the Fukushima Daiichi Nuclear Power Plant. We integrate various types of datasets, such as ground-based walk and car surveys, and airborne surveys, all of which have different scales, resolutions, spatial coverage, and accuracy. This method is based on geostatistics to represent spatial heterogeneous structures, and also on Bayesian hierarchical models to integrate multiscale, multi-type datasets in a consistent manner. The Bayesian method allows us to quantify the uncertainty in the estimates, and to provide the confidence intervals that are critical for robust decision-making. Although this approach is primarily data-driven, it has great flexibility to include mechanistic models for representing radiation transport or other complex correlations. We demonstrate our approach using three types of datasets collected at the same time over Fukushima City in Japan: (1) coarse-resolution airborne surveys covering the entire area, (2) car surveys along major roads, and (3) walk surveys in multiple neighborhoods. Results show that the method can successfully integrate three types of datasets and create an integrated map (including the confidence intervals) of air dose rates over the domain in high resolution. Moreover, this study provides us with various insights into the characteristics of each dataset, as well as radiocaesium distribution. In particular, the urban areas show high heterogeneity in the contaminant distribution due to human activities as well as large discrepancy among different surveys due to such heterogeneity.

  10. A Benchmark Construction of Positron Crystal Undulator

    CERN Document Server

    Tikhomirov, Victor V

    2015-01-01

    Optimization of a positron crystal undulator (CU) is addressed. The ways to assure both the maximum intensity and minimum spectral width of positron CU radiation are outlined. We claim that the minimum CU spectrum width of 3-4% is reached at positron energies of a few GeV and that the optimal bending radius of crystal planes in the CU ranges from 3 to 5 critical bending radii for channeled particles. Following the suggested approach, a benchmark positron CU design is devised and its functioning is illustrated using a simulation method widely tested against experimental data.

  11. Benchmarking methods and data sets for ligand enrichment assessment in virtual screening.

    Science.gov (United States)

    Xia, Jie; Tilahun, Ermias Lemma; Reid, Terry-Elinor; Zhang, Liangren; Wang, Xiang Simon

    2015-01-01

    Retrospective small-scale virtual screening (VS) based on benchmarking data sets has been widely used to estimate ligand enrichments of VS approaches in the prospective (i.e. real-world) efforts. However, the intrinsic differences of benchmarking sets to the real screening chemical libraries can cause biased assessment. Herein, we summarize the history of benchmarking methods as well as data sets and highlight three main types of biases found in benchmarking sets, i.e. "analogue bias", "artificial enrichment" and "false negative". In addition, we introduce our recent algorithm to build maximum-unbiased benchmarking sets applicable to both ligand-based and structure-based VS approaches, and its implementations to three important human histone deacetylases (HDACs) isoforms, i.e. HDAC1, HDAC6 and HDAC8. The leave-one-out cross-validation (LOO CV) demonstrates that the benchmarking sets built by our algorithm are maximum-unbiased as measured by property matching, ROC curves and AUCs.
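
    The enrichment metrics mentioned above (ROC curves, AUCs and enrichment factors) are simple to compute once actives and decoys have been scored; the self-contained sketch below uses invented score distributions and is independent of any particular benchmarking set (Python with NumPy assumed).

```python
import numpy as np

def roc_auc(scores_actives, scores_decoys):
    """AUC of the ROC curve: the probability that a randomly chosen active
    outscores a randomly chosen decoy (ties counted as one half)."""
    a = np.asarray(scores_actives)[:, None]
    d = np.asarray(scores_decoys)[None, :]
    return float((a > d).mean() + 0.5 * (a == d).mean())

def enrichment_factor(scores_actives, scores_decoys, top_fraction=0.01):
    """Enrichment factor at a given fraction of the ranked library."""
    scores = np.concatenate([scores_actives, scores_decoys])
    labels = np.concatenate([np.ones(len(scores_actives)),
                             np.zeros(len(scores_decoys))])
    n_top = max(1, int(round(top_fraction * len(scores))))
    top = labels[np.argsort(-scores)][:n_top]
    return float(top.mean() / labels.mean())

# Toy scores: higher means predicted more active (illustrative only)
rng = np.random.default_rng(0)
actives = rng.normal(1.0, 1.0, 50)
decoys = rng.normal(0.0, 1.0, 5000)
print(roc_auc(actives, decoys), enrichment_factor(actives, decoys))
```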

  12. Assessment of Static Delamination Propagation Capabilities in Commercial Finite Element Codes Using Benchmark Analysis

    Science.gov (United States)

    Orifici, Adrian C.; Krueger, Ronald

    2010-01-01

    With capabilities for simulating delamination growth in composite materials becoming available, the need for benchmarking and assessing these capabilities is critical. In this study, benchmark analyses were performed to assess the delamination propagation simulation capabilities of the VCCT implementations in Marc™ and MD Nastran™. Benchmark delamination growth results for Double Cantilever Beam, Single Leg Bending and End Notched Flexure specimens were generated using a numerical approach. This numerical approach was developed previously, and involves comparing results from a series of analyses at different delamination lengths to a single analysis with automatic crack propagation. Specimens were analyzed with three-dimensional and two-dimensional models, and compared with previous analyses using Abaqus. The results demonstrated that the VCCT implementation in Marc™ and MD Nastran™ was capable of accurately replicating the benchmark delamination growth results and that the use of the numerical benchmarks offers advantages over benchmarking using experimental and analytical results.

  13. Imaging MALDI MS of Dosed Brain Tissues Utilizing an Alternative Analyte Pre-extraction Approach

    Science.gov (United States)

    Quiason, Cristine M.; Shahidi-Latham, Sheerin K.

    2015-06-01

    Matrix-assisted laser desorption ionization (MALDI) imaging mass spectrometry has been adopted in the pharmaceutical industry as a useful tool to detect xenobiotic distribution within tissues. A unique sample preparation approach for MALDI imaging has been described here for the extraction and detection of cobimetinib and clozapine, which were previously undetectable in mouse and rat brain using a single matrix application step. Employing a combination of a buffer wash and a cyclohexane pre-extraction step prior to standard matrix application, the xenobiotics were successfully extracted and detected with an 8 to 20-fold gain in sensitivity. This alternative approach for sample preparation could serve as an advantageous option when encountering difficult to detect analytes.

  14. Randomized benchmarking of multiqubit gates.

    Science.gov (United States)

    Gaebler, J P; Meier, A M; Tan, T R; Bowler, R; Lin, Y; Hanneke, D; Jost, J D; Home, J P; Knill, E; Leibfried, D; Wineland, D J

    2012-06-29

    We describe an extension of single-qubit gate randomized benchmarking that measures the error of multiqubit gates in a quantum information processor. This platform-independent protocol evaluates the performance of Clifford unitaries, which form a basis of fault-tolerant quantum computing. We implemented the benchmarking protocol with trapped ions and found an error per random two-qubit Clifford unitary of 0.162±0.008, thus setting the first benchmark for such unitaries. By implementing a second set of sequences with an extra two-qubit phase gate inserted after each step, we extracted an error per phase gate of 0.069±0.017. We conducted these experiments with transported, sympathetically cooled ions in a multizone Paul trap-a system that can in principle be scaled to larger numbers of ions.
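
    For readers unfamiliar with how such an error per gate is extracted, the sketch below fits the standard randomized-benchmarking decay F(m) = A*p^m + B to hypothetical two-qubit survival data and converts the decay parameter to an error per Clifford via r = (d - 1)(1 - p)/d with d = 4; the data points are invented and are not the trapped-ion results quoted above (Python with NumPy and SciPy assumed).

```python
import numpy as np
from scipy.optimize import curve_fit

def rb_decay(m, A, B, p):
    """Average sequence fidelity as a function of the number of random Cliffords m."""
    return A * p ** m + B

# Hypothetical two-qubit RB data: sequence lengths and measured survival probabilities
lengths = np.array([1, 3, 5, 8, 12, 20, 30])
fidelity = np.array([0.93, 0.82, 0.73, 0.62, 0.50, 0.36, 0.27])

(A, B, p), _ = curve_fit(rb_decay, lengths, fidelity,
                         p0=[0.7, 0.25, 0.9], bounds=([0, 0, 0], [1, 1, 1]))

d = 4                                          # Hilbert-space dimension for two qubits
error_per_clifford = (d - 1) / d * (1 - p)
print(error_per_clifford)
```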

  15. Pathogen germs response to low-dose radiation — medical approach

    Directory of Open Access Journals (Sweden)

    Focea R.

    2012-04-01

    Full Text Available The side effects of radiation therapy in the case of microbial loading of irradiated organs were considered as the phenomenological basis of the experiment carried out on Staphylococcus aureus (ATCC germ) exposed to low X-ray doses. The inoculum was prepared in a liquid culture medium with standard composition, the volumes of 3 ml identical samples (in sterile glass tubes) being irradiated in hospital conditions. Five experimental variants were developed corresponding to irradiation time durations between 25 and 100 minutes. The spectro-colorimetric assay was accomplished at 560 nm and 420 nm, the resulting average values (for three repetitions) being analyzed from the viewpoint of cell density in the irradiated variants compared to control ones. The resistance to antibiotics of the irradiated bacteria was tested on agarized cultures against five antibiotic molecules (ampicillin, chloramphenicol, tetracycline, tobramycin and ofloxacin) by assessing the diameter of inhibition growth areas in each case. An increase of the inhibition area diameter by up to 15% (in the case of tetracycline) was noticed for the lowest irradiation time for all five antibiotics, suggesting a weakening of the bacteria's resistance to the pharmaceutical agents following the X-ray treatment. This was concordant with the results of the spectro-colorimetric assay of the cell density within the directly irradiated bacteria cultures. The main focus of this study is the optimization of the radiotherapy protocol in patients with potential microbial loading.

  16. Pathogen germs response to low-dose radiation — medical approach

    Science.gov (United States)

    Poiata, A.; Focea, R.; Creanga, D.

    2012-04-01

    The side effects of radiation therapy in the case of microbial loading of irradiated organs were considered as the phenomenological basis of the experiment carried out on Staphylococcus aureus (ATCC germ) exposed to low X-ray doses. The inoculum was prepared in a liquid culture medium with standard composition, the volumes of 3 ml identical samples (in sterile glass tubes) being irradiated in hospital conditions. Five experimental variants were developed corresponding to irradiation time durations between 25 and 100 minutes. The spectro-colorimetric assay was accomplished at 560 nm and 420 nm, the resulting average values (for three repetitions) being analyzed from the viewpoint of cell density in the irradiated variants compared to control ones. The resistance to antibiotics of the irradiated bacteria was tested on agarized cultures against five antibiotic molecules (ampicillin, chloramphenicol, tetracycline, tobramycin and ofloxacin) by assessing the diameter of inhibition growth areas in each case. An increase of the inhibition area diameter by up to 15% (in the case of tetracycline) was noticed for the lowest irradiation time for all five antibiotics, suggesting a weakening of the bacteria's resistance to the pharmaceutical agents following the X-ray treatment. This was concordant with the results of the spectro-colorimetric assay of the cell density within the directly irradiated bacteria cultures. The main focus of this study is the optimization of the radiotherapy protocol in patients with potential microbial loading.

  17. Performance Benchmarks for Screening Breast MR Imaging in Community Practice.

    Science.gov (United States)

    Lee, Janie M; Ichikawa, Laura; Valencia, Elizabeth; Miglioretti, Diana L; Wernli, Karen; Buist, Diana S M; Kerlikowske, Karla; Henderson, Louise M; Sprague, Brian L; Onega, Tracy; Rauscher, Garth H; Lehman, Constance D

    2017-10-01

    Purpose To compare screening magnetic resonance (MR) imaging performance in the Breast Cancer Surveillance Consortium (BCSC) with Breast Imaging Reporting and Data System (BI-RADS) benchmarks. Materials and Methods This study was approved by the institutional review board and compliant with HIPAA and included BCSC screening MR examinations collected between 2005 and 2013 from 5343 women (8387 MR examinations) linked to regional Surveillance, Epidemiology, and End Results program registries, state tumor registries, and pathologic information databases that identified breast cancer cases and tumor characteristics. Clinical, demographic, and imaging characteristics were assessed. Performance measures were calculated according to BI-RADS fifth edition and included cancer detection rate (CDR), positive predictive value of biopsy recommendation (PPV2), sensitivity, and specificity. Results The median patient age was 52 years; 52% of MR examinations were performed in women with a first-degree family history of breast cancer, 46% in women with a personal history of breast cancer, and 15% in women with both risk factors. Screening MR imaging depicted 146 cancers, and 35 interval cancers were identified (181 total: 54 in situ, 125 invasive, and two of unknown status). The CDR was 17 per 1000 screening examinations (95% confidence interval [CI]: 15, 20 per 1000 screening examinations; BI-RADS benchmark, 20-30 per 1000 screening examinations). PPV2 was 19% (95% CI: 16%, 22%; benchmark, 15%). Sensitivity was 81% (95% CI: 75%, 86%; benchmark, >80%), and specificity was 83% (95% CI: 82%, 84%; benchmark, 85%-90%). The median tumor size of invasive cancers was 10 mm; 88% were node negative. Conclusion The interpretative performance of screening MR imaging in the BCSC meets most BI-RADS benchmarks and approaches benchmark levels for remaining measures. Clinical practice performance data can inform ongoing benchmark development and help identify areas for quality improvement. © RSNA.

  18. Approaches to risk assessment in food allergy

    DEFF Research Database (Denmark)

    Madsen, Charlotte Bernhard; Hattersley, S.; Buck, J.;

    2009-01-01

    the area forward. Three possible approaches to safety assessment and risk assessment for allergenic foods were presented and discussed: safety assessment using NOAEL/LOAEL and uncertainty factors, safety assessment using Benchmark Dose and Margin of Exposure (MoE), and risk assessment using probabilistic...... models. The workshop concluded that all the three approaches to safety and risk assessment of allergenic foods should continue to be considered. A particular strength of the MoE and probabilistic approaches is that they do not rely on low-dose extrapolations with its inherent issues. Probabilistic...

  19. SPICE benchmark for global tomographic methods

    Science.gov (United States)

    Qin, Yilong; Capdeville, Yann; Maupin, Valerie; Montagner, Jean-Paul; Lebedev, Sergei; Beucler, Eric

    2008-11-01

    The existing global tomographic methods result in different models due to different parametrization, scale resolution and theoretical approach. To test how current imaging techniques are limited by approximations in theory and by the inadequacy of data quality and coverage, it is necessary to perform a global-scale benchmark to understand the resolving properties of each specific imaging algorithm. In the framework of the Seismic wave Propagation and Imaging in Complex media: a European network (SPICE) project, it was decided to perform a benchmark experiment of global inversion algorithms. First, a preliminary benchmark with a simple isotropic model is carried out to check the feasibility in terms of acquisition geometry and numerical accuracy. Then, to fully validate tomographic schemes with a challenging synthetic data set, we constructed one complex anisotropic global model, which is characterized by 21 elastic constants and includes 3-D heterogeneities in velocity, anisotropy (radial and azimuthal anisotropy), attenuation, density, as well as surface topography and bathymetry. The intermediate-period (>32 s), high fidelity anisotropic modelling was performed by using state-of-the-art anisotropic anelastic modelling code, that is, coupled spectral element method (CSEM), on modern massively parallel computing resources. The benchmark data set consists of 29 events and three-component seismograms are recorded by 256 stations. Because of the limitation of the available computing power, synthetic seismograms have a minimum period of 32 s and a length of 10 500 s. The inversion of the benchmark data set demonstrates several well-known problems of classical surface wave tomography, such as the importance of crustal correction to recover the shallow structures, the loss of resolution with depth, the smearing effect, both horizontal and vertical, the inaccuracy of amplitude of isotropic S-wave velocity variation, the difficulty of retrieving the magnitude of azimuthal

  20. Perceptual hashing algorithms benchmark suite

    Institute of Scientific and Technical Information of China (English)

    Zhang Hui; Schmucker Martin; Niu Xiamu

    2007-01-01

    Numerous perceptual hashing algorithms have been developed for identification and verification of multimedia objects in recent years. Many application schemes have been adopted for various commercial objects. Developers and users are looking for a benchmark tool to compare and evaluate their current algorithms or technologies. In this paper, a novel benchmark platform is presented. PHABS provides an open framework and lets its users define their own test strategy, perform tests, collect and analyze test data. With PHABS, various performance parameters of algorithms can be tested, and different algorithms or algorithms with different parameters can be evaluated and compared easily.

  1. Closed-loop neuromorphic benchmarks

    CSIR Research Space (South Africa)

    Stewart

    2015-11-01

    Full Text Available Closed-loop Neuromorphic Benchmarks. Terrence C. Stewart, Travis DeWolf, Ashley Kleinhans and Chris Eliasmith; University of Waterloo, Canada, and Council for Scientific and Industrial Research, South Africa. Submitted to Frontiers in Neuroscience (only the manuscript front matter is recoverable from this record; no abstract text is available).

  2. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction

    OpenAIRE

    2013-01-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative perfor...

  3. The contextual benchmark method: benchmarking e-government services

    NARCIS (Netherlands)

    Jansen, Jurjen; Vries, de Sjoerd; Schaik, van Paul

    2010-01-01

    This paper offers a new method for benchmarking e-Government services. Government organizations no longer doubt the need to deliver their services on line. Instead, the question that is more relevant is how well the electronic services offered by a particular organization perform in comparison with

  4. The bottom-up approach to bounding potential low-dose cancer risks from formaldehyde: An update.

    Science.gov (United States)

    Starr, Thomas B; Swenberg, James A

    2016-06-01

    In 2013, we proposed a novel bottom-up approach to bounding low-dose cancer risks that may result from small exogenous exposures to chemicals that are always present in the body as a result of normal biological processes. The approach utilizes the background cancer risk and the background (endogenous) concentration of a cancer-related exposure biomarker in specific target tissues. After allowing for statistical uncertainty in these two parameters, the ratio of the background risk to background exposure provides a conservative slope factor estimate that can be utilized to bound the added risk that may be associated with incremental exogenous exposures. Our original bottom-up estimates were markedly smaller than those obtained previously by the US Environmental Protection Agency (USEPA) with a conventional top-down approach to modeling nasopharyngeal cancer and leukemia mortality data from a US worker cohort. Herein we provide updated bottom-up estimates of risk for these two cancers that are smaller still, and rely upon more robust estimates of endogenous and exogenous formaldehyde-DNA adducts in monkeys and a more robust estimate of the DNA adduct elimination half-life in rats, both obtained very recently. We also re-examine the worker mortality data used by USEPA in developing its estimate of human leukemia incidence from lifetime exposure to 1 ppm airborne formaldehyde. Finally, we compare a new bottom-up slope estimate of the risk of rat nasal cancer with conventional top-down estimates obtained with empirical dose-response modeling of rat nasal cancer bioassay data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
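
    The arithmetic of the bottom-up bound is simple enough to show directly: a conservative slope is the upper bound on the background risk divided by the lower bound on the endogenous biomarker level, and the added risk from an exogenous exposure is bounded by that slope times the biomarker increment the exposure produces. All numbers below are placeholders for illustration, not the formaldehyde values used by the authors (plain Python).

```python
# Illustrative bottom-up bound; every number is a placeholder, not a value from the paper.
background_risk_ucl = 1.5e-3      # upper confidence bound on lifetime background cancer risk
endogenous_adducts_lcl = 2.0      # lower confidence bound on endogenous adducts per 10^7 nt

# Conservative slope: the whole background risk is attributed to the background biomarker level
slope = background_risk_ucl / endogenous_adducts_lcl        # risk per (adduct per 10^7 nt)

# Added-risk bound for an exogenous exposure producing a small biomarker increment
exogenous_increment = 0.03                                  # adducts per 10^7 nt from exposure
added_risk_bound = slope * exogenous_increment
print(added_risk_bound)                                     # 2.25e-05
```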

  5. Shutdown dose rate assessment with the Advanced D1S method: Development, applications and validation

    Energy Technology Data Exchange (ETDEWEB)

    Villari, R., E-mail: rosaria.villari@enea.it [Associazione EURATOM-ENEA sulla Fusione, Via Enrico Fermi 45, 00044 Frascati, Rome (Italy); Fischer, U. [Karlsruhe Institute of Technology KIT, Institute for Neutron Physics and Reactor Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Moro, F. [Associazione EURATOM-ENEA sulla Fusione, Via Enrico Fermi 45, 00044 Frascati, Rome (Italy); Pereslavtsev, P. [Karlsruhe Institute of Technology KIT, Institute for Neutron Physics and Reactor Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Petrizzi, L. [European Commission, DG Research and Innovation K5, CDMA 00/030, B-1049 Brussels (Belgium); Podda, S. [Associazione EURATOM-ENEA sulla Fusione, Via Enrico Fermi 45, 00044 Frascati, Rome (Italy); Serikov, A. [Karlsruhe Institute of Technology KIT, Institute for Neutron Physics and Reactor Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany)

    2014-10-15

    Highlights: Development of Advanced-D1S for shutdown dose rate calculations; Recent applications of the tool to tokamaks; Summary of the results of benchmarking with measurements and R2S calculations; Limitations and further development. Abstract: The present paper addresses the recent developments and applications of Advanced-D1S to the calculations of shutdown dose rate in tokamak devices. Results of benchmarking with measurements and Rigorous 2-Step (R2S) calculations are summarized and discussed as well as limitations and further developments. The outcomes confirm the essential role of the Advanced-D1S methodology and the evidence for its complementary use with the R2Smesh approach for the reliable assessment of shutdown dose rates and related statistical uncertainties in present and future fusion devices.

  6. Using chemical benchmarking to determine the persistence of chemicals in a Swedish lake.

    Science.gov (United States)

    Zou, Hongyan; Radke, Michael; Kierkegaard, Amelie; MacLeod, Matthew; McLachlan, Michael S

    2015-02-03

    It is challenging to measure the persistence of chemicals under field conditions. In this work, two approaches for measuring persistence in the field were compared: the chemical mass balance approach, and a novel chemical benchmarking approach. Ten pharmaceuticals, an X-ray contrast agent, and an artificial sweetener were studied in a Swedish lake. Acesulfame K was selected as a benchmark to quantify persistence using the chemical benchmarking approach. The 95% confidence intervals of the half-life for transformation in the lake system ranged from 780-5700 days for carbamazepine to <1-2 days for ketoprofen. The persistence estimates obtained using the benchmarking approach agreed well with those from the mass balance approach (1-21% difference), indicating that chemical benchmarking can be a valid and useful method to measure the persistence of chemicals under field conditions. Compared to the mass balance approach, the benchmarking approach partially or completely eliminates the need to quantify mass flow of chemicals, so it is particularly advantageous when the quantification of mass flow of chemicals is difficult. Furthermore, the benchmarking approach allows for ready comparison and ranking of the persistence of different chemicals.
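
    A deliberately simplified reading of the benchmarking idea, assuming first-order loss and a benchmark chemical (such as acesulfame) that behaves conservatively, is sketched below: any decline in the test-to-benchmark concentration ratio over the hydraulic residence time is attributed to transformation of the test chemical. The study itself treats the hydrology and uncertainties far more carefully; the function, its inputs and the example numbers are all assumptions for illustration (Python with NumPy assumed).

```python
import numpy as np

def half_life_by_benchmarking(ratio_in, ratio_out, residence_time_days):
    """Half-life of a test chemical inferred from its concentration ratio to a
    persistent benchmark chemical at the lake inflow and outflow, assuming
    first-order transformation over the hydraulic residence time."""
    k = np.log(ratio_in / ratio_out) / residence_time_days   # 1/day
    return np.log(2.0) / k

# Illustrative numbers only: the ratio drops from 1.0 to 0.7 over a 120-day residence time
print(half_life_by_benchmarking(ratio_in=1.0, ratio_out=0.7, residence_time_days=120.0))
```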

  7. Benchmarking Internet of Things devices

    CSIR Research Space (South Africa)

    Kruger, CP

    2014-07-01

    Full Text Available Benchmarking Internet of Things devices. C.P. Kruger and G.P. Hancke, Advanced Sensor Networks Research Group, Council for Scientific and Industrial Research, South Africa. Presented at the International Conference on Industrial Informatics (INDIN), 27-30 July 2014 (only the front matter is recoverable from this record; no abstract text is available).

  8. Benchmarked Library Websites Comparative Study

    KAUST Repository

    Ramli, Rindra M.

    2015-01-01

    This presentation provides an analysis of services provided by the benchmarked library websites. The exploratory study includes comparison of these websites against a list of criterion and presents a list of services that are most commonly deployed by the selected websites. In addition to that, the investigators proposed a list of services that could be provided via the KAUST library website.

  9. Engine Benchmarking - Final CRADA Report

    Energy Technology Data Exchange (ETDEWEB)

    Wallner, Thomas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    Detailed benchmarking of the powertrains of three light-duty vehicles was performed. Results were presented and provided to CRADA partners. The vehicles included a MY2011 Audi A4, a MY2012 Mini Cooper and a MY2014 Nissan Versa.

  10. Benchmarking Universiteitsvastgoed: Managementinformatie bij vastgoedbeslissingen

    NARCIS (Netherlands)

    Den Heijer, A.C.; De Vries, J.C.

    2004-01-01

    This is the final report of the study "Benchmarking universiteitsvastgoed" (benchmarking university real estate). The report combines two sub-products: the theory report (published in December 2003) and the practice report (published in January 2004). Topics in the theory part are the analysis of other

  11. Benchmark Lisp And Ada Programs

    Science.gov (United States)

    Davis, Gloria; Galant, David; Lim, Raymond; Stutz, John; Gibson, J.; Raghavan, B.; Cheesema, P.; Taylor, W.

    1992-01-01

    Suite of nonparallel benchmark programs, ELAPSE, designed for three tests: comparing efficiency of computer processing via Lisp vs. Ada; comparing efficiencies of several computers processing via Lisp; or comparing several computers processing via Ada. Tests efficiency which computer executes routines in each language. Available for computer equipped with validated Ada compiler and/or Common Lisp system.

  12. RESRAD benchmarking against six radiation exposure pathway models

    Energy Technology Data Exchange (ETDEWEB)

    Faillace, E.R.; Cheng, J.J.; Yu, C.

    1994-10-01

    A series of benchmarking runs were conducted so that results obtained with the RESRAD code could be compared against those obtained with six pathway analysis models used to determine the radiation dose to an individual living on a radiologically contaminated site. The RESRAD computer code was benchmarked against five other computer codes - GENII-S, GENII, DECOM, PRESTO-EPA-CPG, and PATHRAE-EPA - and the uncodified methodology presented in the NUREG/CR-5512 report. Estimated doses for the external gamma pathway; the dust inhalation pathway; and the soil, food, and water ingestion pathways were calculated for each methodology by matching, to the extent possible, input parameters such as occupancy, shielding, and consumption factors.

  13. Development of modern approach to absorbed dose assessment in radionuclide therapy, based on Monte Carlo method simulation of patient scintigraphy

    Science.gov (United States)

    Lysak, Y. V.; Klimanov, V. A.; Narkevich, B. Ya

    2017-01-01

    One of the most difficult problems of modern radionuclide therapy (RNT) is control of the absorbed dose in the pathological volume. This research presents a new approach based on estimating the accumulated activity of the radiopharmaceutical (RP) in the tumor volume from planar scintigraphic images of the patient, combined with Monte Carlo calculations of the radiation transport that include absorption and scattering in the biological tissues of the patient and in the elements of the gamma camera itself. To obtain the data, we modeled scintigraphy, in the gamma camera, of a vial containing the same RP activity as administered to the patient, with the vial placed at a fixed distance from the collimator; a similar study was then performed in identical geometry with the same RP activity located in the pathological target inside the patient's body. For correct calculation results, an adapted Fisher-Snyder human phantom was simulated in the MCNP program. Within this technique, calculations were performed for different sizes of pathological targets and various tumor depths inside the patient's body, using radiopharmaceuticals based on mixed β-γ-emitting (131I, 177Lu) and pure β-emitting (89Sr, 90Y) therapeutic radionuclides. The presented method can be used to implement, in clinical practice, estimation of absorbed doses in the regions of interest on the basis of planar scintigraphy of the patient with sufficient accuracy.

  14. A Study on Benchmarking Models and Frameworks in Industrial SMEs: Challenges and Issues

    Directory of Open Access Journals (Sweden)

    Masoomeh Zeinalnezhad

    2011-01-01

    Full Text Available This paper is based on a literature review of recent publications in the field of benchmarking methodology implemented in small and medium enterprises with regard to measuring and benchmarking upstream, leading or developmental aspects of organizations. Benchmarking has been recognized as an essential tool for continuous improvement and competitiveness. It can also help SMEs to improve their operational and financial performance. However, only a few entrepreneurs turn to benchmarking implementation, due to lack of time and resources. In this study, current benchmarking models (2005 onwards), dedicated specifically to SMEs, have been identified and their characteristics and objectives have been discussed. Key findings from this review confirm that this is an under-developed area of research and that most practitioner approaches are focused on benchmarking practices within SMEs. There is a need to extend theoretical and practical aspects of benchmarking in SMEs by studying the process of benchmarking with regard to the novel concept of lead benchmarking as a possible means of achieving increased radical and innovative transformation in organizational change. From the review it emerged that lead, forward-looking and predictive benchmarking have not been considered in SMEs, and future research could include them.

  15. Implementation of the New Approach for the Dose-Response Functions Development for the Case of Athens and Greece

    Science.gov (United States)

    Christodoulakis, J.; Tzanis, C. G.; Varotsos, C. A.; Kouremadas, G.

    2016-08-01

    Dose-response functions (DRFs) are used for estimating corrosion and/or soiling levels of materials used in constructions and cultural monuments. To do so, DRFs rely on ground-based measurements of specific air pollution and climatic parameters such as nitrogen oxides, ozone and temperature. At the DRAGON 3 Symposium in 2015 we presented a new approach that uses satellite-based data for the necessary parameters instead of ground-based measurements, thereby expanding (a) the usage of DRFs in cases/areas where no in situ measurements are available and (b) the applicability of satellite-based data. In this work we present mapped deterioration levels (corrosion and soiling) for the case of Athens, Greece, as well as for the country as a whole.

  16. SU-E-J-141: Activity-Equivalent Path Length Approach for the 3D PET-Based Dose Reconstruction in Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Attili, A; Vignati, A; Giordanengo, S [Istituto Nazionale di Fisica Nucleare, Sez. Torino, Torino (Italy); Kraan, A [Istituto Nazionale di Fisica Nucleare, Sez. Pisa, Pisa (Italy); Universita degli Studi di Pisa, Pisa (Italy); Dalmasso, F [Istituto Nazionale di Fisica Nucleare, Sez. Torino, Torino (Italy); Universita degli Studi di Torino, Torino (Italy); Battistoni, G [Istituto Nazionale di Fisica Nucleare, Sez. Milano, Milano (Italy)

    2015-06-15

    Purpose: Ion beam therapy is sensitive to uncertainties from treatment planning and dose delivery. PET imaging of induced positron emitter distributions is a practical approach for in vivo, in situ verification of ion beam treatments. Treatment verification is usually done by comparing measured activity distributions with reference distributions, evaluated in nominal conditions. Although such comparisons give valuable information on treatment quality, a proper clinical evaluation of the treatment ultimately relies on the knowledge of the actual delivered dose. Analytical deconvolution methods relating activity and dose have been studied in this context, but were not clinically applied. In this work we present a feasibility study of an alternative approach for dose reconstruction from activity data, which is based on relating variations in accumulated activity to tissue density variations. Methods: First, reference distributions of dose and activity were calculated from the treatment plan and CT data. Then, the actual measured activity data were cumulatively matched with the reference activity distributions to obtain a set of activity-equivalent path lengths (AEPLs) along the rays of the pencil beams. Finally, these AEPLs were used to deform the original dose distribution, yielding the actual delivered dose. The method was tested by simulating a proton therapy treatment plan delivering 2 Gy on a homogeneous water phantom (the reference), which was compared with the same plan delivered on a phantom containing inhomogeneities. Activity and dose distributions were calculated by means of the FLUKA Monte Carlo toolkit. Results: The main features of the observed dose distribution in the inhomogeneous situation were reproduced using the AEPL approach. Variations in particle range were reproduced, and the positions where these deviations originated were properly identified. Conclusions: For a simple inhomogeneous phantom the 3D dose reconstruction from PET
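
    The matching step at the heart of the AEPL idea can be illustrated in one dimension along a single ray: the measured cumulative activity is mapped onto the reference cumulative activity to obtain an equivalent depth for every point, and the reference dose is then looked up at those equivalent depths. The sketch below uses toy Gaussian profiles and is only a schematic of the matching idea, not the authors' implementation (Python with NumPy assumed).

```python
import numpy as np

def aepl_dose(z, dose_ref, act_ref, act_meas):
    """Schematic 1-D activity-equivalent path length (AEPL) reconstruction."""
    cum_ref = np.cumsum(act_ref)
    cum_meas = np.cumsum(act_meas)
    aepl = np.interp(cum_meas, cum_ref, z)   # reference depth with equal cumulative activity
    return np.interp(aepl, z, dose_ref)      # reference dose evaluated at the AEPLs

# Toy profiles: a 1 cm range shift in the measured activity shows up as a shifted dose estimate
z = np.linspace(0.0, 20.0, 200)                             # depth [cm]
dose_ref = np.exp(-0.5 * ((z - 15.0) / 1.0) ** 2)           # schematic Bragg-like dose peak
act_ref = np.exp(-0.5 * ((z - 14.0) / 2.0) ** 2)            # schematic reference activity
act_meas = np.exp(-0.5 * ((z - 13.0) / 2.0) ** 2)           # measured activity, shorter range
dose_est = aepl_dose(z, dose_ref, act_ref, act_meas)
print(z[np.argmax(dose_ref)], z[np.argmax(dose_est)])       # estimated peak moves upstream
```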

  17. 42 CFR 440.385 - Delivery of benchmark and benchmark-equivalent coverage through managed care entities.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Delivery of benchmark and benchmark-equivalent...: GENERAL PROVISIONS Benchmark Benefit and Benchmark-Equivalent Coverage § 440.385 Delivery of benchmark and benchmark-equivalent coverage through managed care entities. In implementing benchmark or...

  18. Increasing nursing students' understanding and accuracy with medical dose calculations: A collaborative approach.

    Science.gov (United States)

    Mackie, Jane E; Bruce, Catherine D

    2016-05-01

    Accurate calculation of medication dosages can be challenging for nursing students. Specific interventions related to types of errors made by nursing students may improve the learning of this important skill. The objective of this study was to determine areas of challenge for students in performing medication dosage calculations in order to design interventions to improve this skill. Strengths and weaknesses in the teaching and learning of medication dosage calculations were assessed. These data were used to create online interventions which were then measured for the impact on student ability to perform medication dosage calculations. The setting of the study is one university in Canada. The qualitative research participants were 8 nursing students from years 1-3 and 8 faculty members. Quantitative results are based on test data from the same second year clinical course during the academic years 2012 and 2013. Students and faculty participated in one-to-one interviews; responses were recorded and coded for themes. Tests were implemented and scored, then data were assessed to classify the types and number of errors. Students identified conceptual understanding deficits, anxiety, low self-efficacy, and numeracy skills as primary challenges in medication dosage calculations. Faculty identified long division as a particular content challenge, and a lack of online resources for students to practice calculations. Lessons and online resources designed as an intervention to target mathematical concepts and skills led to improved results and increases in overall pass rates for second year students for medication dosage calculation tests. This study suggests that with concerted effort and a multi-modal approach to supporting nursing students, their abilities to calculate dosages can be improved. The positive results in this study also point to the promise of cross-discipline collaborations between nursing and education. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Implementing the Affordable Care Act: choosing an essential health benefits benchmark plan.

    Science.gov (United States)

    Corlette, Sabrina; Lucia, Kevin W; Levin, Max

    2013-03-01

    To improve the adequacy of private health insurance, the Affordable Care Act requires insurers to cover a minimum set of medical benefits, known as "essential health benefits." In implementing this requirement, states were asked to select a "benchmark plan" to serve as a reference point. This issue brief examines state action to select an essential health benefits benchmark plan and finds that 24 states and the District of Columbia selected a plan. All but five states will have a small-group plan as their benchmark. Each state, whether or not it made a benchmark selection, will have a set of essential health benefits that reflects local, employer-based health insurance coverage currently sold in the state. States adopted a variety of approaches to selecting a benchmark, including intergovernmental collaboration, stakeholder engagement, and research on benchmark options.

  20. Toxicological benchmarks for screening potential contaminants of concern for effects on aquatic biota: 1994 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W. II [Oak Ridge National Lab., TN (United States); Mabrey, J.B. [University of West Florida, Pensacola, FL (United States)

    1994-07-01

    This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. The alternative benchmarks are based on different conceptual approaches to estimating concentrations causing significant effects. For the upper screening benchmark, there are the acute National Ambient Water Quality Criteria (NAWQC) and the Secondary Acute Values (SAV). The SAV concentrations are values estimated with 80% confidence not to exceed the unknown acute NAWQC for those chemicals with no NAWQC. The alternative chronic benchmarks are the chronic NAWQC, the Secondary Chronic Value (SCV), the lowest chronic values for fish and daphnids from chronic toxicity tests, the estimated EC20 for a sensitive species, and the concentration estimated to cause a 20% reduction in the recruit abundance of largemouth bass. It is recommended that ambient chemical concentrations be compared to all of these benchmarks. If NAWQC are exceeded, the chemicals must be contaminants of concern because the NAWQC are applicable or relevant and appropriate requirements (ARARs). If NAWQC are not exceeded, but other benchmarks are, contaminants should be selected on the basis of the number of benchmarks exceeded and the conservatism of the particular benchmark values, as discussed in the text. To the extent that toxicity data are available, this report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility.

  1. A Probabilistic Approach to Assess External Doses to the Public Considering Spatial Variability of Radioactive Contamination and Interpopulation Differences in Behavior Pattern.

    Science.gov (United States)

    Takahara, Shogo; Iijima, Masashi; Yoneda, Minoru; Shimada, Yoko

    2017-09-08

    Dose assessment is an important issue from the viewpoints of protecting people from radiation exposure and managing postaccident situations adequately. However, the radiation doses received by people cannot be determined with complete accuracy because of the uncertainties and the variability associated with any process of defining individual characteristics and in the dose assessment process itself. In this study, a dose assessment model was developed based on measurements and surveys of individual doses and relevant contributors (i.e., ambient dose rates and behavior patterns) in Fukushima City for four population groups: Fukushima City Office staff, Senior Citizens' Club, Contractors' Association, and Agricultural Cooperative. In addition, probabilistic assessments were performed for these population groups by considering the spatial variability of contamination and interpopulation differences resulting from behavior patterns. As a result of comparison with the actual measurements, the assessment results for participants from the Fukushima City Office agreed with the measured values, thereby validating the model and the approach. Although the assessment results obtained for the Senior Citizens' Club and the Agricultural Cooperative differ partly from the measured values, by addressing further considerations in terms of dose reduction effects due to decontamination and the impact of additional exposure sources in agricultural fields, these results can be improved. By contrast, the measurements obtained for the participants from the Contractors' Association were not reproduced well in the present study. To assess the doses to this group, further investigations of association members' work activities and the related dose reduction effects are needed. © 2017 Society for Risk Analysis.
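
    A bare-bones version of such a probabilistic assessment, with invented distributions standing in for the measured ambient dose rates and behavior patterns of one population group, might look like the sketch below; the indoor reduction factor and every other parameter are assumptions for illustration only (Python with NumPy assumed).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                            # simulated individuals in one population group

# Invented inputs: spatially variable outdoor ambient dose rates (uSv/h) and
# behavior patterns expressed as hours per day spent outdoors.
rate_out = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=n)
hours_out = rng.triangular(left=1.0, mode=3.0, right=8.0, size=n)
hours_in = 24.0 - hours_out

reduction = 0.4                       # illustrative indoor dose reduction factor
daily_uSv = rate_out * hours_out + rate_out * reduction * hours_in
annual_mSv = daily_uSv * 365.0 / 1000.0

# Percentiles give the variability/uncertainty band for the group
print(np.percentile(annual_mSv, [5, 50, 95]))
```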

  2. Benchmarking clinical photography services in the NHS.

    Science.gov (United States)

    Arbon, Giles

    2015-01-01

    Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.

  3. Journal Benchmarking for Strategic Publication Management and for Improving Journal Positioning in the World Ranking Systems

    Science.gov (United States)

    Moskovkin, Vladimir M.; Bocharova, Emilia A.; Balashova, Oksana V.

    2014-01-01

    Purpose: The purpose of this paper is to introduce and develop the methodology of journal benchmarking. Design/Methodology/ Approach: The journal benchmarking method is understood to be an analytic procedure of continuous monitoring and comparing of the advance of specific journal(s) against that of competing journals in the same subject area,…

  4. 75 FR 16712 - Waybill Data Released in Three-Benchmark Rail Rate Proceedings

    Science.gov (United States)

    2010-04-02

    ... TRANSPORTATION Surface Transportation Board 49 CFR Part 1244 Waybill Data Released in Three-Benchmark Rail Rate... Board proposes to amend its rules with respect to the Three-Benchmark methodology used to adjudicate... simplified stand-alone cost approach for medium-size rail rate disputes and revising its...

  6. The European Union benchmarking experience. From euphoria to fatigue?

    Directory of Open Access Journals (Sweden)

    Michael Zängle

    2004-06-01

    Full Text Available Even if one may agree with the possible criticism of the Lisbon process as being too vague in commitment or as lacking appropriate statistical techniques and indicators, the benchmarking system provided by EUROSTAT seems to be sufficiently effective in warning against imminent failure. The Lisbon objectives are very demanding. This holds true even if each of the objectives is looked at in isolation. But 'Lisbon' is more demanding than that, requiring a combination of several objectives to be achieved simultaneously (GDP growth, labour productivity, job-content of growth, higher quality of jobs and greater social cohesion). Even to countries like Ireland, showing exceptionally high performance in GDP growth and employment promotion during the period under investigation, achieving potentially conflicting objectives simultaneously seems to be beyond feasibility. The European Union benchmarking exercise is embedded in the context of the Open Method(s) of Co-ordination (OMC). This context makes the benchmarking approach part and parcel of an overarching philosophy, which relates the benchmarking indicators to each other and assigns to them their role in corroborating the increasingly dominating project of the 'embedded neo-liberalism'. Against this background, the present paper is focussed on the following point. With the EU benchmarking system being effective enough to make the imminent under-achievement visible, there is a danger of disillusionment and 'benchmarking fatigue', which may provoke an ideological crisis. The dominant project being so deeply rooted, however, chances are high that this crisis will be solved immanently in terms of embedded neo-liberalism by strengthening the neo-liberal branch of the European project. Confining itself to the Europe of Fifteen, the analysis draws on EUROSTAT's database of Structural Indicators. ...

  7. Academic library benchmarking in The Netherlands: a comparative study

    NARCIS (Netherlands)

    Voorbij, H.

    2009-01-01

    Purpose - This paper aims to describe some of the unique features of the Dutch academic library benchmarking system. Design/methodology/approach - The Dutch system is compared with similar projects in the USA, the UK and Germany. Findings - One of the most distinguishing features of the Dutch system

  8. Development of a Benchmark System for Social Enterprise Software

    OpenAIRE

    Semar, Wolfgang; Mastrandrea, Elena; Odoni, Fabian

    2017-01-01

    Network knowledge management in companies does not work without proactive motivation of their users. This paper describes the development of different benchmarks to assess users’ performance and shows a novel approach to stimulate their willingness to actively share their knowledge in their collaborative work by the use of visualisations.

  9. Benchmarking: Achieving the best in class

    Energy Technology Data Exchange (ETDEWEB)

    Kaemmerer, L

    1996-05-01

    Oftentimes, people find the process of organizational benchmarking an onerous task, or, because they do not fully understand the nature of the process, end up with results that are less than stellar. This paper presents the challenges of benchmarking and reasons why benchmarking can benefit an organization in today's economy.

  10. The LDBC Social Network Benchmark: Interactive Workload

    NARCIS (Netherlands)

    Erling, O.; Averbuch, A.; Larriba-Pey, J.; Chafi, H.; Gubichev, A.; Prat, A.; Pham, M.D.; Boncz, P.A.

    2015-01-01

    The Linked Data Benchmark Council (LDBC) is now two years underway and has gathered strong industrial participation for its mission to establish benchmarks, and benchmarking practices for evaluating graph data management systems. The LDBC introduced a new choke-point driven methodology for developin

  11. From prompt gamma distribution to dose: a novel approach combining an evolutionary algorithm and filtering based on Gaussian-powerlaw convolutions

    Science.gov (United States)

    Schumann, A.; Priegnitz, M.; Schoene, S.; Enghardt, W.; Rohling, H.; Fiedler, F.

    2016-10-01

    Range verification and dose monitoring in proton therapy is considered as highly desirable. Different methods have been developed worldwide, like particle therapy positron emission tomography (PT-PET) and prompt gamma imaging (PGI). In general, these methods allow for a verification of the proton range. However, quantification of the dose from these measurements remains challenging. For the first time, we present an approach for estimating the dose from prompt γ-ray emission profiles. It combines a filtering procedure based on Gaussian-powerlaw convolution with an evolutionary algorithm. By means of convolving depth dose profiles with an appropriate filter kernel, prompt γ-ray depth profiles are obtained. In order to reverse this step, the evolutionary algorithm is applied. The feasibility of this approach is demonstrated for a spread-out Bragg-peak in a water target.
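
    The two ingredients named above can be sketched together: a forward step that filters a depth-dose profile into a prompt-gamma-like profile using a Gaussian-smoothed power-law kernel, and a crude mutate-and-select loop standing in for the evolutionary algorithm that inverts the filtering. The kernel shape, its parameters and the optimization loop are placeholders, not the filter or algorithm derived in the paper (Python with NumPy assumed).

```python
import numpy as np

def gaussian_powerlaw_kernel(z, sigma=2.0, alpha=1.5):
    """Illustrative filter kernel: a power-law tail smoothed by a Gaussian
    (placeholder shape and parameters)."""
    tail = np.zeros_like(z)
    pos = z >= 0.0
    tail[pos] = (1.0 + z[pos]) ** (-alpha)
    gauss = np.exp(-0.5 * (z / sigma) ** 2)
    kernel = np.convolve(tail, gauss, mode="same")
    return kernel / kernel.sum()

def dose_to_prompt_gamma(dose, kernel):
    """Forward step: prompt gamma depth profile as the filtered depth-dose profile."""
    return np.convolve(dose, kernel, mode="same")

def invert(gamma_meas, kernel, n_iter=2000, step=0.02, seed=0):
    """Crude stand-in for the evolutionary inversion: random mutations of a
    candidate dose profile are kept whenever they reduce the misfit."""
    rng = np.random.default_rng(seed)
    dose = np.full_like(gamma_meas, gamma_meas.mean())
    best = np.sum((dose_to_prompt_gamma(dose, kernel) - gamma_meas) ** 2)
    for _ in range(n_iter):
        trial = np.clip(dose + step * rng.normal(size=dose.size), 0.0, None)
        err = np.sum((dose_to_prompt_gamma(trial, kernel) - gamma_meas) ** 2)
        if err < best:
            dose, best = trial, err
    return dose

# Schematic example: filter a Bragg-like curve into a gamma profile, then invert it
z = np.linspace(0.0, 30.0, 300)
dose_true = np.exp(-0.5 * ((z - 20.0) / 1.0) ** 2) + 0.3 * (z < 20.0)
kernel = gaussian_powerlaw_kernel(z - 15.0)
dose_est = invert(dose_to_prompt_gamma(dose_true, kernel), kernel)
```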

  12. Geothermal Heat Pump Benchmarking Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1997-01-17

    A benchmarking study was conducted on behalf of the Department of Energy to determine the critical factors in successful utility geothermal heat pump programs. A successful program is one that has achieved significant market penetration. Successfully marketing geothermal heat pumps has presented some major challenges to the utility industry. However, select utilities have developed programs that generate significant GHP sales. This benchmarking study concludes that there are three factors critical to the success of utility GHP marketing programs: (1) top management marketing commitment; (2) an understanding of the fundamentals of marketing and business development; and (3) an aggressive competitive posture. To generate significant GHP sales, competitive market forces must be used. However, because utilities have functioned only in a regulated arena, these companies and their leaders are unschooled in competitive business practices. Therefore, a lack of experience coupled with an intrinsically non-competitive culture yields an industry environment that impedes the generation of significant GHP sales in many, but not all, utilities.

  13. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection.
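    Forward selection wrapped around a random forest, one of the better-performing multivariate methods named above, can be sketched roughly as follows. The synthetic dataset, the scoring choice (cross-validated R²) and all settings are placeholders for illustration, not the benchmarking setup used in the study.

```python
# Hypothetical sketch of greedy forward selection wrapped around a random forest.
# The synthetic data, CV scheme and settings are placeholders, not the study's setup.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=200, n_features=30, n_informative=8, random_state=0)

selected, remaining = [], list(range(X.shape[1]))
best_score = -np.inf

while remaining:
    # Score every candidate descriptor when added to the current subset.
    scores = []
    for j in remaining:
        rf = RandomForestRegressor(n_estimators=50, random_state=0)
        cv_r2 = cross_val_score(rf, X[:, selected + [j]], y, cv=5, scoring="r2").mean()
        scores.append((cv_r2, j))
    cv_r2, j = max(scores)
    if cv_r2 <= best_score:        # stop when no candidate improves the CV score
        break
    best_score = cv_r2
    selected.append(j)
    remaining.remove(j)

print("selected descriptors:", selected)
print("cross-validated R^2 :", round(best_score, 3))
```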

  14. Towards benchmarking an in-stream water quality model

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available A method of model evaluation is presented which utilises a comparison with a benchmark model. The proposed benchmarking concept is one that can be applied to many hydrological models but, in this instance, is implemented in the context of an in-stream water quality model. The benchmark model is defined in such a way that it is easily implemented within the framework of the test model, i.e. the approach relies on two applications of the same model code rather than the application of two separate model codes. This is illustrated using two case studies from the UK, the Rivers Aire and Ouse, with the objective of simulating a water quality classification, general quality assessment (GQA), which is based on dissolved oxygen, biochemical oxygen demand and ammonium. Comparisons between the benchmark and test models are made based on GQA, as well as a step-wise assessment against the components required in its derivation. The benchmarking process yields a great deal of important information about the performance of the test model and raises issues about a priori definition of the assessment criteria.

  15. Developing a benchmark for emotional analysis of music

    Science.gov (United States)

    Yang, Yi-Hsuan; Soleymani, Mohammad

    2017-01-01

    Music emotion recognition (MER) field rapidly expanded in the last decade. Many new methods and new audio features are developed to improve the performance of MER algorithms. However, it is very difficult to compare the performance of the new methods because of the data representation diversity and scarcity of publicly available data. In this paper, we address these problems by creating a data set and a benchmark for MER. The data set that we release, a MediaEval Database for Emotional Analysis in Music (DEAM), is the largest available data set of dynamic annotations (valence and arousal annotations for 1,802 songs and song excerpts licensed under Creative Commons with 2Hz time resolution). Using DEAM, we organized the ‘Emotion in Music’ task at MediaEval Multimedia Evaluation Campaign from 2013 to 2015. The benchmark attracted, in total, 21 active teams to participate in the challenge. We analyze the results of the benchmark: the winning algorithms and feature-sets. We also describe the design of the benchmark, the evaluation procedures and the data cleaning and transformations that we suggest. The results from the benchmark suggest that the recurrent neural network based approaches combined with large feature-sets work best for dynamic MER. PMID:28282400

  16. Developing a benchmark for emotional analysis of music.

    Science.gov (United States)

    Aljanaki, Anna; Yang, Yi-Hsuan; Soleymani, Mohammad

    2017-01-01

    Music emotion recognition (MER) field rapidly expanded in the last decade. Many new methods and new audio features are developed to improve the performance of MER algorithms. However, it is very difficult to compare the performance of the new methods because of the data representation diversity and scarcity of publicly available data. In this paper, we address these problems by creating a data set and a benchmark for MER. The data set that we release, a MediaEval Database for Emotional Analysis in Music (DEAM), is the largest available data set of dynamic annotations (valence and arousal annotations for 1,802 songs and song excerpts licensed under Creative Commons with 2Hz time resolution). Using DEAM, we organized the 'Emotion in Music' task at MediaEval Multimedia Evaluation Campaign from 2013 to 2015. The benchmark attracted, in total, 21 active teams to participate in the challenge. We analyze the results of the benchmark: the winning algorithms and feature-sets. We also describe the design of the benchmark, the evaluation procedures and the data cleaning and transformations that we suggest. The results from the benchmark suggest that the recurrent neural network based approaches combined with large feature-sets work best for dynamic MER.

  17. A Benchmark for Management Effectiveness

    OpenAIRE

    Zimmermann, Bill; Chanaron, Jean-Jacques; Klieb, Leslie

    2007-01-01

    International audience; This study presents a tool to gauge managerial effectiveness in the form of a questionnaire that is easy to administer and score. The instrument covers eight distinct areas of organisational climate and culture of management inside a company or department. Benchmark scores were determined by administering sample-surveys to a wide cross-section of individuals from numerous firms in Southeast Louisiana, USA. Scores remained relatively constant over a seven-year timeframe...

  18. Restaurant Energy Use Benchmarking Guideline

    Energy Technology Data Exchange (ETDEWEB)

    Hedrick, R.; Smith, V.; Field, K.

    2011-07-01

    A significant operational challenge for food service operators is defining energy use benchmark metrics to compare against the performance of individual stores. Without metrics, multiunit operators and managers have difficulty identifying which stores in their portfolios require extra attention to bring their energy performance in line with expectations. This report presents a method whereby multiunit operators may use their own utility data to create suitable metrics for evaluating their operations.

  19. Building America Research Benchmark Definition: Updated August 15, 2007

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.

    2007-09-01

    To track progress toward aggressive multi-year whole-house energy savings goals of 40-70% and onsite power production of up to 30%, DOE's Residential Buildings Program and NREL developed the Building America Research Benchmark in consultation with the Building America industry teams. The Benchmark is generally consistent with mid-1990s standard practice, as reflected in the Home Energy Rating System (HERS) Technical Guidelines (RESNET 2002), with additional definitions that allow the analyst to evaluate all residential end-uses, an extension of the traditional HERS rating approach that focuses on space conditioning and hot water. Unlike the reference homes used for HERS, EnergyStar, and most energy codes, the Benchmark represents typical construction at a fixed point in time so it can be used as the basis for Building America's multi-year energy savings goals without the complication of chasing a 'moving target'.

  20. Building America Research Benchmark Definition, Updated December 15, 2006

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.

    2007-01-01

    To track progress toward aggressive multi-year whole-house energy savings goals of 40-70% and onsite power production of up to 30%, DOE's Residential Buildings Program and NREL developed the Building America Research Benchmark in consultation with the Building America industry teams. The Benchmark is generally consistent with mid-1990s standard practice, as reflected in the Home Energy Rating System (HERS) Technical Guidelines (RESNET 2002), with additional definitions that allow the analyst to evaluate all residential end-uses, an extension of the traditional HERS rating approach that focuses on space conditioning and hot water. Unlike the reference homes used for HERS, EnergyStar, and most energy codes, the Benchmark represents typical construction at a fixed point in time so it can be used as the basis for Building America's multi-year energy savings goals without the complication of chasing a 'moving target'.

  1. Building America Research Benchmark Definition: Updated December 20, 2007

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.

    2008-01-01

    To track progress toward aggressive multi-year whole-house energy savings goals of 40-70% and onsite power production of up to 30%, DOE's Residential Buildings Program and NREL developed the Building America Research Benchmark in consultation with the Building America industry teams. The Benchmark is generally consistent with mid-1990s standard practice, as reflected in the Home Energy Rating System (HERS) Technical Guidelines (RESNET 2002), with additional definitions that allow the analyst to evaluate all residential end-uses, an extension of the traditional HERS rating approach that focuses on space conditioning and hot water. Unlike the reference homes used for HERS, EnergyStar, and most energy codes, the Benchmark represents typical construction at a fixed point in time so it can be used as the basis for Building America's multi-year energy savings goals without the complication of chasing a 'moving target'.

  2. Refined hazard characterization of 3-MCPD using benchmark dose modeling

    NARCIS (Netherlands)

    Rietjens, I.M.C.M.; Scholz, G.; Berg, van den I.; Schilter, B.; Slob, W.

    2012-01-01

    3-Monochloropropane-1,2-diol (3-MCPD)-esters represent a newly identified class of food-borne process contaminants of possible health concern. Due to hydrolysis 3-MCPD esters constitute a potentially significant source of free 3-MCPD exposure and their preliminary risk assessment was based on toxico

  3. Benchmark Dose Software Development and Maintenance Ten Berge Cxt Models

    Science.gov (United States)

    This report is intended to provide an overview of beta version 1.0 of the implementation of a concentration-time (CxT) model originally programmed and provided by Wil ten Berge (referred to hereafter as the ten Berge model). The recoding and development described here represent ...

  4. Thermal Performance Benchmarking: Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Moreno, Gilbert

    2016-04-08

    The goal for this project is to thoroughly characterize the performance of state-of-the-art (SOA) automotive power electronics and electric motor thermal management systems. Information obtained from these studies will be used to: Evaluate advantages and disadvantages of different thermal management strategies; establish baseline metrics for the thermal management systems; identify methods of improvement to advance the SOA; increase the publicly available information related to automotive traction-drive thermal management systems; help guide future electric drive technologies (EDT) research and development (R&D) efforts. The performance results combined with component efficiency and heat generation information obtained by Oak Ridge National Laboratory (ORNL) may then be used to determine the operating temperatures for the EDT components under drive-cycle conditions. In FY15, the 2012 Nissan LEAF power electronics and electric motor thermal management systems were benchmarked. Testing of the 2014 Honda Accord Hybrid power electronics thermal management system started in FY15; however, due to time constraints it was not possible to include results for this system in this report. The focus of this project is to benchmark the thermal aspects of the systems. ORNL's benchmarking of electric and hybrid electric vehicle technology reports provide detailed descriptions of the electrical and packaging aspects of these automotive systems.

  5. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  6. HS06 Benchmark for an ARM Server

    CERN Document Server

    Kluth, Stefan

    2013-01-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  7. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Nowell, Lisa H., E-mail: lhnowell@usgs.gov [U.S. Geological Survey, California Water Science Center, Placer Hall, 6000 J Street, Sacramento, CA 95819 (United States); Norman, Julia E., E-mail: jnorman@usgs.gov [U.S. Geological Survey, Oregon Water Science Center, 2130 SW 5" t" h Avenue, Portland, OR 97201 (United States); Ingersoll, Christopher G., E-mail: cingersoll@usgs.gov [U.S. Geological Survey, Columbia Environmental Research Center, 4200 New Haven Road, Columbia, MO 65021 (United States); Moran, Patrick W., E-mail: pwmoran@usgs.gov [U.S. Geological Survey, Washington Water Science Center, 934 Broadway, Suite 300, Tacoma, WA 98402 (United States)

    2016-04-15

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical/chemical characteristics
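    The benchmark-quotient arithmetic described above (dividing a measured concentration by its benchmark and summing the quotients across a mixture) is simple enough to show directly. In the sketch below the pesticide names, concentrations and threshold values are invented placeholders, not the TEBs or LEBs derived in the study.

```python
# Illustrative only: summing benchmark quotients for a pesticide mixture.
# Concentrations and benchmark values below are invented for the example;
# they are not the TEB/LEB values derived in the study.
measured = {"bifenthrin": 4.0, "chlorpyrifos": 6.0, "fipronil": 1.0}   # ug/kg
teb      = {"bifenthrin": 2.0, "chlorpyrifos": 10.0, "fipronil": 5.0}  # ug/kg

quotients = {p: measured[p] / teb[p] for p in measured}
sum_quotient = sum(quotients.values())

print(quotients)          # per-pesticide exceedance of the threshold benchmark
print(sum_quotient)       # a sum > 1 suggests potential toxicity of the mixture
```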

  8. Argonne Code Center: Benchmark problem book.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1977-06-01

    This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. This is the second supplement of the original benchmark book which was first published in February, 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, which was published in December, 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement, Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February, 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below followed by the contributors to the earlier editions of the benchmark book.

  9. Benchmarks

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map (DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  10. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides.

    Science.gov (United States)

    Nowell, Lisa H; Norman, Julia E; Ingersoll, Christopher G; Moran, Patrick W

    2016-04-15

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n=3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical/chemical characteristics

  11. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides

    Science.gov (United States)

    Nowell, Lisa H.; Norman, Julia E.; Ingersoll, Christopher G.; Moran, Patrick W.

    2016-01-01

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical

  12. PageRank Pipeline Benchmark: Proposal for a Holistic System Benchmark for Big-Data Platforms

    CERN Document Server

    Dreher, Patrick; Hill, Chris; Gadepally, Vijay; Kuszmaul, Bradley; Kepner, Jeremy

    2016-01-01

    The rise of big data systems has created a need for benchmarks to measure and compare the capabilities of these systems. Big data benchmarks present unique scalability challenges. The supercomputing community has wrestled with these challenges for decades and developed methodologies for creating rigorous scalable benchmarks (e.g., HPC Challenge). The proposed PageRank pipeline benchmark employs supercomputing benchmarking methodologies to create a scalable benchmark that is reflective of many real-world big data processing systems. The PageRank pipeline benchmark builds on existing prior scalable benchmarks (Graph500, Sort, and PageRank) to create a holistic benchmark with multiple integrated kernels that can be run together or independently. Each kernel is well defined mathematically and can be implemented in any programming environment. The linear algebraic nature of PageRank makes it well suited to being implemented using the GraphBLAS standard. The computations are simple enough that performance predictio...

  13. Model Averaging Software for Dichotomous Dose Response Risk Estimation

    Directory of Open Access Journals (Sweden)

    Matthew W. Wheeler

    2008-02-01

    Full Text Available Model averaging has been shown to be a useful method for incorporating model uncertainty in quantitative risk estimation. In certain circumstances this technique is computationally complex, requiring sophisticated software to carry out the computation. We introduce software that implements model averaging for risk assessment based upon dichotomous dose-response data. This software, which we call Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD), fits the quantal response models, which are also used in the US Environmental Protection Agency benchmark dose software suite, and generates a model-averaged dose response model to generate benchmark dose and benchmark dose lower bound estimates. The software fulfills a need for risk assessors, allowing them to go beyond one single model in their risk assessments based on quantal data by focusing on a set of models that describes the experimental data.
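    The core idea (fit several quantal models, weight them, and average the resulting benchmark dose) can be sketched in a few lines. The toy version below fits only two models (a log-logistic and a probit) by binomial maximum likelihood, weights them by AIC, and averages the BMDs at 10% extra risk; the data, starting values and grid-search BMD solver are assumptions for illustration, and this is not the MADr-BMD code.

```python
# Toy sketch of model averaging for dichotomous dose-response data: two quantal
# models are fitted by binomial maximum likelihood, weighted by AIC, and the
# BMDs at 10% extra risk are averaged. The data, starting values and grid-search
# BMD solver are illustrative assumptions; this is not the MADr-BMD code.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

dose = np.array([0.0, 1.0, 3.0, 10.0, 30.0])    # made-up dose groups
n    = np.array([50, 50, 50, 50, 50])           # animals per group
resp = np.array([2, 3, 6, 14, 30])              # responders per group

def loglogistic(d, g, a, b):
    d_safe = np.where(d > 0, d, 1.0)            # avoid log(0); value unused at d = 0
    p = g + (1 - g) / (1 + np.exp(-a - b * np.log(d_safe)))
    return np.where(d > 0, p, g)

def probit(d, g, a, b):
    return g + (1 - g) * norm.cdf(a + b * d)

def nll(theta, model):
    p = np.clip(model(dose, *theta), 1e-9, 1 - 1e-9)
    return -np.sum(resp * np.log(p) + (n - resp) * np.log(1 - p))

def bmd(model, theta, bmr=0.10):
    g = theta[0]
    target = g + (1 - g) * bmr                  # extra risk of 10% over background
    grid = np.linspace(1e-3, dose.max(), 20000)
    return grid[np.argmin(np.abs(model(grid, *theta) - target))]

aics, bmds = [], []
for model, x0 in [(loglogistic, [0.05, -3.0, 1.0]), (probit, [0.05, -2.0, 0.1])]:
    fit = minimize(nll, x0, args=(model,), method="Nelder-Mead")
    aics.append(2 * len(x0) + 2 * fit.fun)
    bmds.append(bmd(model, fit.x))

weights = np.exp(-0.5 * (np.array(aics) - min(aics)))   # Akaike weights
weights /= weights.sum()
print("per-model BMDs:", [round(float(b), 2) for b in bmds])
print("model-averaged BMD:", round(float(np.sum(weights * np.array(bmds))), 2))
```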

  14. Reconstructing Organophosphorus Pesticide Doses Using the Reversed Dosimetry Approach in a Simple Physiologically-Based Pharmacokinetic Model

    Directory of Open Access Journals (Sweden)

    Chensheng Lu

    2012-01-01

    Full Text Available We illustrated the development of a simple pharmacokinetic (SPK) model aiming to estimate the absorbed chlorpyrifos doses using urinary biomarker data (3,5,6-trichloropyridinol) as the model input. The effectiveness of the SPK model in the pesticide risk assessment was evaluated by comparing dose estimates using different urinary composite data. The dose estimates resulting from the first morning voids appeared to be lower than, but not significantly different from, those using before-bedtime, lunch or dinner voids. We found a similar trend for dose estimates using three different urinary composite data. However, the dose estimates using the SPK model for individual children were significantly higher than those from the conventional physiologically based pharmacokinetic (PBPK) modeling using aggregate environmental measurements of chlorpyrifos as the model inputs. The use of urinary data in the SPK model intuitively provided a plausible alternative to the conventional PBPK model in reconstructing the absorbed chlorpyrifos dose.

  15. NASA Software Engineering Benchmarking Effort

    Science.gov (United States)

    Godfrey, Sally; Rarick, Heather

    2012-01-01

    Benchmarking was very interesting and provided a wealth of information: (1) we did see potential solutions to some of our "top 10" issues; (2) we have an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes; (2) many of the organizations were interested in future collaboration, such as sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/partners: (1) desires to participate in our training and provide feedback on procedures; (2) they welcomed the opportunity to provide feedback on working with NASA.

  16. Benchmarking of human resources management

    OpenAIRE

    David M. Akinnusi

    2008-01-01

    This paper reviews the role of human resource management (HRM) which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HR...

  17. Comparison of different approaches of estimating effective dose from reported exposure data in 3D imaging with interventional fluoroscopy systems

    Science.gov (United States)

    Svalkvist, Angelica; Hansson, Jonny; Bâth, Magnus

    2014-03-01

    Three-dimensional (3D) imaging with interventional fluoroscopy systems is today a common examination. The examination includes acquisition of two-dimensional projection images, used to reconstruct section images of the patient. The aim of the present study was to investigate the difference in resulting effective dose obtained using different levels of complexity in calculations of effective doses from these examinations. In the study the Siemens Artis Zeego interventional fluoroscopy system (Siemens Medical Solutions, Erlangen, Germany) was used. Images of anthropomorphic chest and pelvis phantoms were acquired. The exposure values obtained were used to calculate the resulting effective doses from the examinations, using the computer software PCXMC (STUK, Helsinki, Finland). The dose calculations were performed using three different methods: 1. using individual exposure values for each projection image, 2. using the mean tube voltage and the total DAP value, evenly distributed over the projection images, and 3. using the mean tube voltage and the total DAP value, evenly distributed over a smaller selection of projection images. The results revealed that the difference in resulting effective dose between the first two methods was smaller than 5%. When only a selection of projection images was used in the dose calculations, the difference increased to over 10%. Given the uncertainties associated with the effective dose concept, the results indicate that dose calculations based on average exposure values distributed over a smaller selection of projection angles can provide reasonably accurate estimations of the radiation doses from 3D imaging using interventional fluoroscopy systems.
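    The second calculation strategy (mean tube voltage with the total DAP spread evenly over the projections) reduces to a simple sum once per-projection conversion coefficients are available. The sketch below illustrates only that arithmetic; the number of projections, the total DAP and the angular dependence of the conversion coefficient are invented placeholders, whereas in the study such coefficients would come from Monte Carlo software such as PCXMC.

```python
# Rough sketch of the second calculation strategy described above: spread the
# total DAP evenly over all projection angles and sum per-projection effective
# doses. The conversion coefficients (mSv per Gy*cm^2) are invented placeholders.
import numpy as np

n_projections = 200
total_dap = 12.0                                   # Gy*cm^2 for the whole 3D run
dap_per_projection = total_dap / n_projections

angles = np.linspace(0.0, 2 * np.pi, n_projections, endpoint=False)
# Placeholder angular dependence of the effective-dose conversion coefficient.
conversion = 0.15 + 0.05 * np.cos(angles)          # mSv per Gy*cm^2

effective_dose = np.sum(dap_per_projection * conversion)
print(round(float(effective_dose), 2), "mSv (illustrative value only)")
```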

  18. A comprehensive benchmarking system for evaluating global vegetation models

    Directory of Open Access Journals (Sweden)

    D. I. Kelley

    2012-11-01

    Full Text Available We present a benchmark system for global vegetation models. This system provides a quantitative evaluation of multiple simulated vegetation properties, including primary production; seasonal net ecosystem production; vegetation cover, composition and height; fire regime; and runoff. The benchmarks are derived from remotely sensed gridded datasets and site-based observations. The datasets allow comparisons of annual average conditions and seasonal and inter-annual variability, and they allow the impact of spatial and temporal biases in means and variability to be assessed separately. Specifically designed metrics quantify model performance for each process, and are compared to scores based on the temporal or spatial mean value of the observations and a "random" model produced by bootstrap resampling of the observations. The benchmark system is applied to three models: a simple light-use efficiency and water-balance model (the Simple Diagnostic Biosphere Model: SDBM), and the Lund-Potsdam-Jena (LPJ) and Land Processes and eXchanges (LPX) dynamic global vegetation models (DGVMs). SDBM reproduces observed CO2 seasonal cycles, but its simulation of independent measurements of net primary production (NPP) is too high. The two DGVMs show little difference for most benchmarks (including the inter-annual variability in the growth rate and seasonal cycle of atmospheric CO2), but LPX represents burnt fraction demonstrably more accurately. Benchmarking also identified several weaknesses common to both DGVMs. The benchmarking system provides a quantitative approach for evaluating how adequately processes are represented in a model, identifying errors and biases, tracking improvements in performance through model development, and discriminating among models. Adoption of such a system would do much to improve confidence in terrestrial model predictions of climate change impacts and feedbacks.

  19. PROCEDURES FOR DERIVING EQUILIBRIUM PARTITIONING SEDIMENT BENCHMARKS (ESBS) FOR THE PROTECTION OF BENTHIC ORGANISMS: DIELDRIN

    Science.gov (United States)

    This equilibrium partitioning sediment benchmark (ESB) document describes procedures to derive concentrations of the insecticide dieldrin in sediment which are protective of the presence of benthic organisms. The equilibrium partitioning (EqP) approach was chosen because it acco...

  20. A tiered approach for integrating exposure and dosimetry with in vitro dose-response data in the modern risk assessment paradigm

    Science.gov (United States)

    High-throughput (HT) risk screening approaches apply in vitro dose-response data to estimate potential health risks that arise from exposure to chemicals. However, much uncertainty is inherent in relating bioactivities observed in an in vitro system to the perturbations of biolog...

  1. SU-E-T-56: A Novel Approach to Computing Expected Value and Variance of Point Dose From Non-Gated Radiotherapy Delivery

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, S; Zhu, X; Zhang, M; Zheng, D; Zhang, Q; Lei, Y; Li, S; Driewer, J; Wang, S; Enke, C [University of Nebraska Medical Center, Omaha, NE (United States)

    2015-06-15

    Purpose: Randomness in patient internal organ motion phase at the beginning of non-gated radiotherapy delivery may introduce uncertainty to the dose received by the patient. Concerns about this dose deviation from the planned one have motivated many researchers to study this phenomenon, although a unified theoretical framework for computing it is still missing. This study was conducted to develop such a framework for analyzing the effect. Methods: Two reasonable assumptions were made: a) patient internal organ motion is stationary and periodic; b) no special arrangement is made to start a non-gated radiotherapy delivery at any specific phase of patient internal organ motion. A statistical ensemble was formed consisting of the patient's non-gated radiotherapy deliveries at all equally possible initial organ motion phases. To characterize the patient received dose, the statistical ensemble average method is employed to derive formulae for two variables: the expected value and variance of dose received by a patient internal point from a non-gated radiotherapy delivery. Fourier series were utilized to facilitate the analysis. Results: According to our formulae, the two variables can be computed from non-gated radiotherapy generated dose rate time sequences at the point's corresponding locations on fixed phase 3D CT images sampled evenly in time over one patient internal organ motion period. The expected value of point dose is simply the average of the doses to the point's corresponding locations on the fixed phase CT images. The variance can be determined by time integration in terms of Fourier series coefficients of the dose rate time sequences on the same fixed phase 3D CT images. Conclusion: Given a non-gated radiotherapy delivery plan and the patient's 4D CT study, our novel approach can predict the expected value and variance of patient radiation dose. We expect it to play a significant role in determining both quality and robustness of a patient's non-gated radiotherapy plan.
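    Written out, the ensemble average over a uniformly distributed starting phase takes a familiar form. The expressions below are a hedged reconstruction from the abstract, with D-dot(t) denoting the dose rate delivered to the moving point, T the organ-motion period and phi the random starting phase; they are not quoted from the paper.

```latex
% Hedged sketch of the ensemble-average quantities described in the abstract.
% \dot{D}(t) is the dose rate delivered to the moving point at time t, T is the
% organ-motion period, T_{del} the delivery time, and \phi the uniformly
% distributed starting phase. This is a reconstruction, not the published formulae.
\begin{align}
  D(\phi) &= \int_{0}^{T_{\mathrm{del}}} \dot{D}\!\left(t+\phi\right)\,\mathrm{d}t, \\
  \mathbb{E}[D] &= \frac{1}{T}\int_{0}^{T} D(\phi)\,\mathrm{d}\phi, \qquad
  \operatorname{Var}[D] = \frac{1}{T}\int_{0}^{T} \bigl(D(\phi)-\mathbb{E}[D]\bigr)^{2}\,\mathrm{d}\phi .
\end{align}
```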

  2. Design and prospective validation of a dosing instrument for continuous infusion of vancomycin : a within-population approach

    NARCIS (Netherlands)

    van Maarseveen, Erik M.; Bouma, Annemien; Touw, Daniel J.; Neef, Cees; van Zanten, Arthur R.H.

    2014-01-01

    INTRODUCTION: The clinical application of continuous infusion (CoI) of vancomycin has gained interest in recent years. Since no international guidelines on initial dosing of vancomycin CoI exist, there is a need for methods to facilitate the switch from intermittent to continuous vancomycin dosing a

  3. A practical approach for a patient-tailored dose protocol in coronary CT angiography using prospective ECG triggering

    NARCIS (Netherlands)

    Dijk, van J.D.; Huizing, E.D.; Jager, P.L.; Ottervanger, J.P.; Knollema, S.; Slump, C.H.; Dalen, van J.A.

    2015-01-01

    To derive and validate a practical patient-specific dose protocol to obtain an image quality, expressed by the image noise, independent of patients’ size and a better radiation dose justification in coronary CT angiography (CCTA) using prospective ECG triggering. 43 patients underwent clinically ind

  4. Observer-based FDI for Gain Fault Detection in Ship Propulsion Benchmark

    DEFF Research Database (Denmark)

    Lootsma, T.F.; Izadi-Zamanabadi, Roozbeh; Nijmeijer, H.

    2001-01-01

    A geometric approach for input-affine nonlinear systems is briefly described and then applied to a ship propulsion benchmark. The obtained results are used to design a diagnostic nonlinear observer for successful FDI of the diesel engine gain fault.

  5. Observer-based FDI for Gain Fault Detection in Ship Propulsion Benchmark

    DEFF Research Database (Denmark)

    Lootsma, T.F.; Izadi-Zamanabadi, Roozbeh; Nijmeijer, H.

    2001-01-01

    A geometric approach for input-affine nonlinear systems is briefly described and then applied to a ship propulsion benchmark. The obtained results are used to design a diagnostic nonlinear observer for successful FDI of the diesel engine gain fault.

  6. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  7. Benchmarking i eksternt regnskab og revision

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Kiertzner, Lars

    2001-01-01

    ... continuously in a benchmarking process. This chapter will broadly examine where, with some justification, the benchmarking concept can be linked to external financial reporting and auditing. Section 7.1 deals with the external annual accounts, while Section 7.2 addresses the area of auditing. The final section of the chapter summarises ... the considerations on benchmarking in connection with both areas. ...

  8. Benchmarking Analysis of Institutional University Autonomy in Denmark, Lithuania, Romania, Scotland, and Sweden

    DEFF Research Database (Denmark)

    This book presents a benchmark, comparative analysis of institutional university autonomy in Denmark, Lithuania, Romania, Scotland and Sweden. These countries are partners in an EU TEMPUS funded project 'Enhancing University Autonomy in Moldova' (EUniAM). This benchmark analysis was conducted...... by the EUniAM Lead Task Force team that collected and analysed secondary and primary data in each of these countries and produced four benchmark reports that are part of this book. For each dimension and interface of institutional university autonomy, the members of the Lead Task Force team identified...... respective evaluation criteria and searched for similarities and differences in approaches to higher education sectors and respective autonomy regimes in these countries. The consolidated report that precedes the benchmark reports summarises the process and key findings from the four benchmark reports...

  9. Competitive bidding in Medicare Advantage: effect of benchmark changes on plan bids.

    Science.gov (United States)

    Song, Zirui; Landrum, Mary Beth; Chernew, Michael E

    2013-12-01

    Bidding has been proposed to replace or complement the administered prices that Medicare pays to hospitals and health plans. In 2006, the Medicare Advantage program implemented a competitive bidding system to determine plan payments. In perfectly competitive models, plans bid their costs and thus bids are insensitive to the benchmark. Under many other models of competition, bids respond to changes in the benchmark. We conceptualize the bidding system and use an instrumental variable approach to study the effect of benchmark changes on bids. We use 2006-2010 plan payment data from the Centers for Medicare and Medicaid Services, published county benchmarks, actual realized fee-for-service costs, and Medicare Advantage enrollment. We find that a $1 increase in the benchmark leads to about a $0.53 increase in bids, suggesting that plans in the Medicare Advantage market have meaningful market power. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. NRC-BNL Benchmark Program on Evaluation of Methods for Seismic Analysis of Coupled Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chokshi, N.; DeGrassi, G.; Xu, J.

    1999-03-24

    A NRC-BNL benchmark program for evaluation of state-of-the-art analysis methods and computer programs for seismic analysis of coupled structures with non-classical damping is described. The program includes a series of benchmarking problems designed to investigate various aspects of complexities, applications and limitations associated with methods for analysis of non-classically damped structures. Discussions are provided on the benchmarking process, benchmark structural models, and the evaluation approach, as well as benchmarking ground rules. It is expected that the findings and insights, as well as recommendations from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving licensing applications of these alternate methods to coupled systems.

  11. NRC-BNL BENCHMARK PROGRAM ON EVALUATION OF METHODS FOR SEISMIC ANALYSIS OF COUPLED SYSTEMS.

    Energy Technology Data Exchange (ETDEWEB)

    XU,J.

    1999-08-15

    A NRC-BNL benchmark program for evaluation of state-of-the-art analysis methods and computer programs for seismic analysis of coupled structures with non-classical damping is described. The program includes a series of benchmarking problems designed to investigate various aspects of complexities, applications and limitations associated with methods for analysis of non-classically damped structures. Discussions are provided on the benchmarking process, benchmark structural models, and the evaluation approach, as well as benchmarking ground rules. It is expected that the findings and insights, as well as recommendations from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving licensing applications of these alternate methods to coupled systems.

  12. Developing Benchmarks for Solar Radio Bursts

    Science.gov (United States)

    Biesecker, D. A.; White, S. M.; Gopalswamy, N.; Black, C.; Domm, P.; Love, J. J.; Pierson, J.

    2016-12-01

    Solar radio bursts can interfere with radar, communication, and tracking signals. In severe cases, radio bursts can inhibit the successful use of radio communications and disrupt a wide range of systems that are reliant on Position, Navigation, and Timing services on timescales ranging from minutes to hours across wide areas on the dayside of Earth. The White House's Space Weather Action Plan has asked for solar radio burst intensity benchmarks for an event occurrence frequency of 1 in 100 years and also a theoretical maximum intensity benchmark. The solar radio benchmark team was also asked to define the wavelength/frequency bands of interest. The benchmark team developed preliminary (phase 1) benchmarks for the VHF (30-300 MHz), UHF (300-3000 MHz), GPS (1176-1602 MHz), F10.7 (2800 MHz), and microwave (4000-20000 MHz) bands. The preliminary benchmarks were derived based on previously published work. Limitations in the published work will be addressed in phase 2 of the benchmark process. In addition, deriving theoretical maxima requires additional work, where it is even possible to do so, in order to meet the Action Plan objectives. In this presentation, we will present the phase 1 benchmarks and the basis used to derive them. We will also present the work that needs to be done in order to complete the final, or phase 2, benchmarks.

  13. Benchmarking for controllere: Metoder, teknikker og muligheder

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Sandalgaard, Niels; Dietrichson, Lars

    2008-01-01

    The article takes a close look at the concept of benchmarking by presenting and discussing its different facets. Four different applications of benchmarking are described in order to show the breadth of the concept and the importance of clarifying the purpose of a benchmarking project ... before getting started. The difference between results benchmarking and process benchmarking is addressed, after which the use of internal versus external benchmarking is discussed. Finally, the use of benchmarking in budgeting and budget follow-up is introduced.

  14. Establishing benchmarks and metrics for utilization management.

    Science.gov (United States)

    Melanson, Stacy E F

    2014-01-01

    The changing environment of healthcare reimbursement is rapidly leading to a renewed appreciation of the importance of utilization management in the clinical laboratory. The process of benchmarking of laboratory operations is well established for comparing organizational performance to other hospitals (peers) and for trending data over time through internal benchmarks. However, there are relatively few resources available to assist organizations in benchmarking for laboratory utilization management. This article will review the topic of laboratory benchmarking with a focus on the available literature and services to assist in managing physician requests for laboratory testing. © 2013.

  15. Issues to consider in the derivation of water quality benchmarks for the protection of aquatic life.

    Science.gov (United States)

    Schneider, Uwe

    2014-01-01

    While water quality benchmarks for the protection of aquatic life have been in use in some jurisdictions for several decades (USA, Canada, several European countries), more and more countries are now setting up their own national water quality benchmark development programs. In doing so, they either adopt an existing method from another jurisdiction, update an existing approach, or develop their own new derivation method. Each approach has its own advantages and disadvantages, and many issues have to be addressed when setting up a water quality benchmark development program or when deriving a water quality benchmark. Each of these tasks requires special expertise. They may seem simple, but are complex in their details. The intention of this paper was to provide some guidance for this process of water quality benchmark development on the program level, for the derivation methodology development, and in the actual benchmark derivation step, as well as to point out some issues (notably the inclusion of adapted populations and cryptic species and points to consider in the use of the species sensitivity distribution approach) and future opportunities (an international data repository and international collaboration in water quality benchmark development).
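    One of the issues flagged above concerns the species sensitivity distribution (SSD) approach. As a minimal, hedged sketch of that approach (not of any jurisdiction's actual method), the snippet below fits a log-normal SSD to invented species toxicity values and reads off the hazardous concentration for 5% of species (HC5).

```python
# Minimal sketch of the species sensitivity distribution (SSD) approach: fit a
# log-normal distribution to species toxicity values and read off the hazardous
# concentration for 5% of species (HC5). The toxicity values are invented for
# illustration and carry no regulatory meaning.
import numpy as np
from scipy import stats

# Chronic toxicity endpoints (e.g. NOEC/EC10) for different species, in ug/L.
toxicity = np.array([3.2, 5.1, 8.7, 12.0, 15.5, 22.0, 40.0, 65.0, 110.0, 250.0])

log_values = np.log10(toxicity)
mu, sigma = log_values.mean(), log_values.std(ddof=1)

hc5 = 10 ** (mu + sigma * stats.norm.ppf(0.05))   # 5th percentile of the fitted SSD
print(round(float(hc5), 2), "ug/L (illustrative HC5)")
```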

  16. Benchmarking Implementations of Functional Languages with ``Pseudoknot'', a Float-Intensive Benchmark

    NARCIS (Netherlands)

    Hartel, P.H.; Feeley, M.; Alt, M.; Augustsson, L.

    1996-01-01

    Over 25 implementations of different functional languages are benchmarked using the same program, a floating-point intensive application taken from molecular biology. The principal aspects studied are compile time and execution time for the various implementations that were benchmarked. An important

  17. The Zoo, Benchmarks & You: How To Reach the Oregon State Benchmarks with Zoo Resources.

    Science.gov (United States)

    2002

    This document aligns Oregon state educational benchmarks and standards with Oregon Zoo resources. Benchmark areas examined include English, mathematics, science, social studies, and career and life roles. Brief descriptions of the programs offered by the zoo are presented. (SOE)

  18. The Zoo, Benchmarks & You: How To Reach the Oregon State Benchmarks with Zoo Resources.

    Science.gov (United States)

    2002

    This document aligns Oregon state educational benchmarks and standards with Oregon Zoo resources. Benchmark areas examined include English, mathematics, science, social studies, and career and life roles. Brief descriptions of the programs offered by the zoo are presented. (SOE)

  19. Benchmarking Implementations of Functional Languages with "Pseudoknot", a float-intensive benchmark

    NARCIS (Netherlands)

    Hartel, Pieter H.; Feeley, M.; Alt, M.; Augustsson, L.

    Over 25 implementations of different functional languages are benchmarked using the same program, a floating-point intensive application taken from molecular biology. The principal aspects studied are compile time and execution time for the various implementations that were benchmarked. An important

  20. An approach for online evaluations of dose consequences caused by small rotational setup errors in intracranial stereotactic radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Lu Bo; Li, Jonathan; Kahler, Darren; Yan Guanghua; Mittauer, Kathryn; Shi Wenyin; Okunieff, Paul; Liu, Chihray [Department of Radiation Oncology, University of Florida College of Medicine, Gainesville, Florida 32607 (United States)

    2011-11-15

    Purpose: The purpose of this work is to investigate the impact of small rotational errors on the magnitudes and distributions of spatial dose variations for intracranial stereotactic radiotherapy (SRT) treatment setups, and to assess the feasibility of using the original dose map overlaid with rotated contours (ODMORC) method as a fast, online evaluation tool to estimate dose changes (using DVHs) to clinical target volumes (CTVs) and organs-at-risk (OARs) caused by small rotational setup errors. Methods: Fifteen intracranial SRT cases treated with either three-dimensional conformal radiation therapy (3DCRT) or intensity-modulated radiation therapy (IMRT) techniques were chosen for the study. Selected cases have a variety of anatomical dimensions and pathologies. Angles of ±3° and ±5° in all directions were selected to simulate the rotational errors. Dose variations in different regions of the brain, CTVs, and OARs were evaluated to illustrate the various spatial effects of dose differences before and after rotations. DVHs accounting for rotations that were recomputed by the treatment planning system (TPS) and those generated by the ODMORC method were compared. A framework of a fast algorithm for multicontour rotation implemented by ODMORC is introduced as well. Results: The average values of relative dose variations between original dose and recomputed dose accounting for rotations were greater than 4.0% and 10.0% in absolute mean and in standard deviation, respectively, at the skull and adjacent regions for all cases. They were less than 1.0% and 2.5% in absolute mean and in standard deviation, respectively, for dose points 3 mm away from the skull. The results indicated that spatial dose to any part of the brain organs or tumors separated from the skull or head surface would be relatively stable before and after rotations. Statistical data of CTVs and OARs indicate that the lenses and cochleae have large dose variations before and after rotations

  1. Benchmarking the next generation of homology inference tools.

    Science.gov (United States)

    Saripella, Ganapathi Varma; Sonnhammer, Erik L L; Forslund, Kristoffer

    2016-09-01

    Over the last decades, vast numbers of sequences were deposited in public databases. Bioinformatics tools allow homology and consequently functional inference for these sequences. New profile-based homology search tools have been introduced, allowing reliable detection of remote homologs, but have not been systematically benchmarked. To provide such a comparison, which can guide bioinformatics workflows, we extend and apply our previously developed benchmark approach to evaluate the 'next generation' of profile-based approaches, including CS-BLAST, HHSEARCH and PHMMER, in comparison with the non-profile based search tools NCBI-BLAST, USEARCH, UBLAST and FASTA. We generated challenging benchmark datasets based on protein domain architectures within either the PFAM + Clan, SCOP/Superfamily or CATH/Gene3D domain definition schemes. From each dataset, homologous and non-homologous protein pairs were aligned using each tool, and standard performance metrics calculated. We further measured congruence of domain architecture assignments in the three domain databases. CS-BLAST and PHMMER had the overall highest accuracy. FASTA, UBLAST and USEARCH showed large trade-offs of accuracy for speed optimization. Profile methods are superior at inferring remote homologs but the difference in accuracy between methods is relatively small. PHMMER and CS-BLAST stand out with the highest accuracy, yet still at a reasonable computational cost. Additionally, we show that less than 0.1% of Swiss-Prot protein pairs considered homologous by one database are considered non-homologous by another, implying that these classifications represent equivalent underlying biological phenomena, differing mostly in coverage and granularity. Benchmark datasets and all scripts are placed at (http://sonnhammer.org/download/Homology_benchmark). forslund@embl.de Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  2. Benchmarking homogenization algorithms for monthly data

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2012-01-01

    Full Text Available The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve

  3. 2D EPID dose calibration for pretreatment quality control of conformal and IMRT fields: A simple and fast convolution approach.

    Science.gov (United States)

    Camilleri, Jérémy; Mazurier, Jocelyne; Franck, Denis; Dudouet, Philippe; Latorzeff, Igor; Franceries, Xavier

    2016-01-01

    This work presents an original algorithm that converts the signal of an electronic portal imaging device (EPID) into absorbed dose in water at the depth of maximum. The model includes a first image pre-processing step that accounts for the non-uniformity of the detector response but also for the perturbation of the signal due to backscatter radiation. Secondly, the image is converted into absorbed dose to water through a linear conversion function associated with a dose redistribution kernel. These two computation parameters were modelled by correlating the on-axis EPID signal with absorbed dose measurements obtained on square fields by using an ionization chamber placed in water at the depth of maximum dose. The accuracy of the algorithm was assessed by comparing the dose determined from the EPID signal with the dose derived by the treatment planning system (TPS) using the γ-index. These comparisons were performed on 8 conformal radiotherapy treatment fields (3DCRT) and 18 modulated fields (IMRT). For a dose difference and a distance-to-agreement set to 3% of the maximum dose and 2 mm respectively, the mean percentage of points with a γ-value less than or equal to 1 was 99.8% ± 0.1% for 3DCRT fields and 96.8% ± 2.7% for IMRT fields. Moreover, the mean gamma values were always less than 0.5 whatever the treatment technique. These results confirm that our algorithm is an accurate and suitable tool for clinical use in a context of IMRT quality assurance programmes. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
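
    The plan-versus-measurement comparison above relies on the γ-index with a 3% / 2 mm criterion. As a hedged illustration (not the authors' implementation, and far too slow for clinical image sizes), a brute-force global γ computation on a small synthetic dose plane looks like this:

```python
import numpy as np

def gamma_index(reference, evaluated, spacing_mm, dose_frac=0.03, dta_mm=2.0):
    """Brute-force global gamma index: dose criterion as a fraction of the
    reference maximum, distance-to-agreement in mm. Only for small grids."""
    dose_tol = dose_frac * reference.max()
    ny, nx = reference.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    gamma = np.empty((ny, nx))
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((ys - iy) ** 2 + (xs - ix) ** 2) * spacing_mm ** 2
            dose2 = (evaluated - reference[iy, ix]) ** 2
            gamma[iy, ix] = np.sqrt((dist2 / dta_mm ** 2 + dose2 / dose_tol ** 2).min())
    return gamma

if __name__ == "__main__":
    y, x = np.mgrid[-20:21, -20:21]                              # 1 mm grid, toy field
    reference = np.exp(-(x ** 2 + y ** 2) / 300.0)               # toy "TPS" dose plane
    evaluated = 1.02 * np.exp(-((x - 1) ** 2 + y ** 2) / 300.0)  # toy "EPID" dose plane
    g = gamma_index(reference, evaluated, spacing_mm=1.0)
    print("points with gamma <= 1:", round(100.0 * np.mean(g <= 1.0), 1), "%")
```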

  4. A rational quantitative approach to determine the best dosing regimen for a target therapeutic effect: a unified formalism for antibiotic evaluation.

    Science.gov (United States)

    Li, Jun; Nekka, Fahima

    2013-02-21

    The determination of an optimal dosing regimen is a critical step to enhance drug efficacy and avoid toxicity. Rational dosing recommendations based on mathematical considerations are increasingly being adopted in the process of drug development and use. In this paper, we propose a quantitative approach to evaluate the efficacy of antibiotic agents. By integrating both pharmacokinetic (PK) and pharmacodynamic (PD) information, this approach gives rise to a unified formalism able to measure the cause-effect of dosing regimens. This new pharmaco-metric covers a whole range of antibiotics, including the two well-known concentration-dependent and time-dependent classes, through the introduction of the Hill-dependency concept. As a direct consequence, our formalism opens a new path toward bioequivalence evaluation in terms of PK and PD, associating the in vivo drug concentration with the in vitro drug effect. Using this new approach, we revealed unexpected but relevant behaviors of drug performance when different drug regimens and drug classes are considered. Of particular note, we found that the doses required to reach the same therapeutic effect, when scheduled differently, exhibit completely different tendencies for concentration-dependent and time-dependent drugs. Moreover, we theoretically confirmed previous experimental results on the superiority of the once-daily regimen of aminoglycosides. The proposed methodology is appealing for its computational features and can easily be applied to design fair clinical protocols or to rationalize prescription decisions.
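
    The unified formalism itself is not reproduced here, but the underlying ingredients, a concentration-time profile from a PK model and a Hill-type effect integrated over time, can be sketched. All parameters below are hypothetical, and the mapping of low and high Hill coefficients onto concentration- and time-dependent behaviour is only an illustration of the Hill-dependency idea:

```python
import numpy as np

def concentration(t, dose_mg, tau_h, ke_per_h, vd_l):
    """Concentration (mg/L) after repeated IV bolus doses, one-compartment model."""
    c = np.zeros_like(t, dtype=float)
    for t_dose in np.arange(0.0, t[-1], tau_h):
        mask = t >= t_dose
        c[mask] += (dose_mg / vd_l) * np.exp(-ke_per_h * (t[mask] - t_dose))
    return c

def cumulative_effect(t, c, ec50, hill):
    """Time-integrated Hill-type effect E = C^h / (EC50^h + C^h) over the interval."""
    e = c ** hill / (ec50 ** hill + c ** hill)
    return float(np.trapz(e, t))

if __name__ == "__main__":
    t = np.linspace(0.0, 24.0, 2401)          # one day, 0.01 h resolution
    # Hypothetical drug: ke = 0.3 /h, Vd = 20 L, EC50 = 2 mg/L; same total daily dose.
    for dose, tau in [(240.0, 24.0), (120.0, 12.0), (80.0, 8.0)]:
        c = concentration(t, dose, tau, ke_per_h=0.3, vd_l=20.0)
        for hill in (0.5, 4.0):   # shallow vs steep effect-concentration relationship
            effect = cumulative_effect(t, c, ec50=2.0, hill=hill)
            print(f"{dose:5.0f} mg q{tau:.0f}h, Hill = {hill}: integrated effect = {effect:5.2f} h")
```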

  5. SeSBench - An initiative to benchmark reactive transport models for environmental subsurface processes

    Science.gov (United States)

    Jacques, Diederik

    2017-04-01

    As soil functions are governed by a multitude of interacting hydrological, geochemical and biological processes, simulation tools coupling mathematical models for interacting processes are needed. Coupled reactive transport models are a typical example of such coupled tools, mainly focusing on hydrological and geochemical coupling (see e.g. Steefel et al., 2015). The mathematical and numerical complexity of both the tool itself and the specific conceptual model can increase rapidly. Therefore, numerical verification of such models is a prerequisite for guaranteeing reliability and confidence and for qualifying simulation tools and approaches for any further model application. In 2011, a first SeSBench -Subsurface Environmental Simulation Benchmarking- workshop was held in Berkeley (USA), followed by four others. The objective is to benchmark subsurface environmental simulation models and methods with a current focus on reactive transport processes. The final outcome was a special issue in Computational Geosciences (2015, issue 3 - Reactive transport benchmarks for subsurface environmental simulation) with a collection of 11 benchmarks. Benchmarks, proposed by the participants of the workshops, should be relevant for environmental or geo-engineering applications; the latter were mostly related to radioactive waste disposal issues - excluding benchmarks defined for purely mathematical reasons. Another important feature is the tiered approach within a benchmark, with the definition of a single principal problem and different sub-problems. The latter typically benchmarked individual or simplified processes (e.g. inert solute transport, simplified geochemical conceptual model) or geometries (e.g. batch or one-dimensional, homogeneous). Finally, three codes should be involved in a benchmark. The SeSBench initiative contributes to confidence building for applying reactive transport codes. Furthermore, it illustrates the use of these types of models for different

  6. Inverse treatment planning for spinal robotic radiosurgery: an international multi-institutional benchmark trial.

    Science.gov (United States)

    Blanck, Oliver; Wang, Lei; Baus, Wolfgang; Grimm, Jimm; Lacornerie, Thomas; Nilsson, Joakim; Luchkovskyi, Sergii; Palazon Cano, Isabel; Shou, Zhenyu; Ayadi, Myriam; Treuer, Harald; Viard, Romain; Siebert, Frank-Andre; Chan, Mark K H; Hildebrandt, Guido; Dunst, Jürgen; Imhoff, Detlef; Wurster, Stefan; Wolff, Robert; Romanelli, Pantaleo; Lartigau, Eric; Semrau, Robert; Soltys, Scott G; Schweikard, Achim

    2016-01-01

    Stereotactic radiosurgery (SRS) is the accurate, conformal delivery of high-dose radiation to well-defined targets while minimizing normal structure doses via steep dose gradients. While inverse treatment planning (ITP) with computerized optimization algorithms are routine, many aspects of the planning process remain user-dependent. We performed an international, multi-institutional benchmark trial to study planning variability and to analyze preferable ITP practice for spinal robotic radiosurgery. 10 SRS treatment plans were generated for a complex-shaped spinal metastasis with 21 Gy in 3 fractions and tight constraints for spinal cord (V14Gy 95%). The resulting plans were rated on a scale from 1 to 4 (excellent-poor) in five categories (constraint compliance, optimization goals, low-dose regions, ITP complexity, and clinical acceptability) by a blinded review panel. Additionally, the plans were mathematically rated based on plan indices (critical structure and target doses, conformity, monitor units, normal tissue complication probability, and treatment time) and compared to the human rankings. The treatment plans and the reviewers' rankings varied substantially among the participating centers. The average mean overall rank was 2.4 (1.2-4.0) and 8/10 plans were rated excellent in at least one category by at least one reviewer. The mathematical rankings agreed with the mean overall human rankings in 9/10 cases pointing toward the possibility for sole mathematical plan quality comparison. The final rankings revealed that a plan with a well-balanced trade-off among all planning objectives was preferred for treatment by most participants, reviewers, and the mathematical ranking system. Furthermore, this plan was generated with simple planning techniques. Our multi-institutional planning study found wide variability in ITP approaches for spinal robotic radiosurgery. The participants', reviewers', and mathematical match on preferable treatment plans and ITP

  7. Inverse treatment planning for spinal robotic radiosurgery: an international multi-institutional benchmark trial.

    Science.gov (United States)

    Blanck, Oliver; Wang, Lei; Baus, Wolfgang; Grimm, Jimm; Lacornerie, Thomas; Nilsson, Joakim; Luchkovskyi, Sergii; Cano, Isabel Palazon; Shou, Zhenyu; Ayadi, Myriam; Treuer, Harald; Viard, Romain; Siebert, Frank-Andre; Chan, Mark K H; Hildebrandt, Guido; Dunst, Jürgen; Imhoff, Detlef; Wurster, Stefan; Wolff, Robert; Romanelli, Pantaleo; Lartigau, Eric; Semrau, Robert; Soltys, Scott G; Schweikard, Achim

    2016-05-01

    Stereotactic radiosurgery (SRS) is the accurate, conformal delivery of high-dose radiation to well-defined targets while minimizing normal structure doses via steep dose gradients. While inverse treatment planning (ITP) with computerized optimization algorithms are routine, many aspects of the planning process remain user-dependent. We performed an international, multi-institutional benchmark trial to study planning variability and to analyze preferable ITP practice for spinal robotic radiosurgery. 10 SRS treatment plans were generated for a complex-shaped spinal metastasis with 21 Gy in 3 fractions and tight constraints for spinal cord (V14Gy95%). The resulting plans were rated on a scale from 1 to 4 (excellent-poor) in five categories (constraint compliance, optimization goals, low-dose regions, ITP complexity, and clinical acceptability) by a blinded review panel. Additionally, the plans were mathematically rated based on plan indices (critical structure and target doses, conformity, monitor units, normal tissue complication probability, and treatment time) and compared to the human rankings. The treatment plans and the reviewers' rankings varied substantially among the participating centers. The average mean overall rank was 2.4 (1.2-4.0) and 8/10 plans were rated excellent in at least one category by at least one reviewer. The mathematical rankings agreed with the mean overall human rankings in 9/10 cases pointing toward the possibility for sole mathematical plan quality comparison. The final rankings revealed that a plan with a well-balanced trade-off among all planning objectives was preferred for treatment by most participants, reviewers, and the mathematical ranking system. Furthermore, this plan was generated with simple planning techniques. Our multi-institutional planning study found wide variability in ITP approaches for spinal robotic radiosurgery. The participants', reviewers', and mathematical match on preferable treatment plans and ITP techniques

  8. Generating Shifting Workloads to Benchmark Adaptability in Relational Database Systems

    Science.gov (United States)

    Rabl, Tilmann; Lang, Andreas; Hackl, Thomas; Sick, Bernhard; Kosch, Harald

    A large body of research concerns the adaptability of database systems. Many commercial systems already contain autonomic processes that adapt configurations as well as data structures and data organization. Yet there is virtually no way to measure the quality of such optimizations fairly. While standard benchmarks have been developed that simulate real-world database applications very precisely, none of them considers variations in workloads produced by human factors. Today’s benchmarks test the performance of database systems by measuring peak performance on homogeneous request streams. Nevertheless, in systems with user interaction, access patterns are constantly shifting. We present a benchmark that simulates a web information system with interaction of large user groups. It is based on the analysis of a real online eLearning management system with 15,000 users. The benchmark considers the temporal dependency of user interaction. The main focus is to measure the adaptability of a database management system according to shifting workloads. We will give details on our design approach, which uses sophisticated pattern analysis and data mining techniques.

  9. Benchmarking in the Academic Departments using Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad M. Rayeni

    2010-01-01

    Full Text Available Problem statement: The purpose of this study is to analyze efficiency and benchmarking using Data Envelopment Analysis (DEA) in university departments. Benchmarking is a process of defining valid measures of performance comparison among peer decision making units (DMUs), using them to determine the relative positions of the peer DMUs and, ultimately, establishing a standard of excellence. Approach: DEA can be regarded as a benchmarking tool, because the frontier identified can be regarded as an empirical standard of excellence. Once the frontier is established, one may compare a set of DMUs to the frontier. Results: We apply benchmarking to detect the shortcomings of inefficient departments so that they can become efficient and learn better managerial practice. Conclusion: The results indicated that 9 of the 21 departments are inefficient. The average inefficiency is 0.8516. Inefficient departments do not have an excess in the number of teaching staff, but all of them have an excess in the number of registered students. The shortage of research output is the most important output indicator in inefficient departments, and it must be corrected.
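
    The DEA frontier described above is usually obtained by solving one small linear program per decision making unit (DMU). The sketch below implements a standard input-oriented CCR envelopment model with scipy; the departments, inputs and outputs are invented, and the study's actual model specification may differ:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of DMU `o`.
    inputs: (n_dmu, n_in) array; outputs: (n_dmu, n_out) array."""
    n_dmu = inputs.shape[0]
    # Decision variables: theta, lambda_1 ... lambda_n
    c = np.r_[1.0, np.zeros(n_dmu)]
    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    a_in = np.c_[-inputs[o], inputs.T]
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    a_out = np.c_[np.zeros(outputs.shape[1]), -outputs.T]
    res = linprog(c,
                  A_ub=np.vstack([a_in, a_out]),
                  b_ub=np.r_[np.zeros(inputs.shape[1]), -outputs[o]],
                  bounds=[(0, None)] * (n_dmu + 1),
                  method="highs")
    return res.fun

if __name__ == "__main__":
    # Hypothetical departments: inputs = (teaching staff, budget), outputs = (graduates, papers)
    x = np.array([[20, 1.0], [25, 1.2], [15, 0.8], [30, 2.0]])
    y = np.array([[120, 30], [140, 25], [110, 35], [150, 20]])
    for d in range(len(x)):
        print(f"department {d}: efficiency = {ccr_efficiency(x, y, d):.3f}")
```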

  10. Benchmarking: A tool to enhance performance

    Energy Technology Data Exchange (ETDEWEB)

    Munro, J.F. [Oak Ridge National Lab., TN (United States); Kristal, J. [USDOE Assistant Secretary for Environmental Management, Washington, DC (United States); Thompson, G.; Johnson, T. [Los Alamos National Lab., NM (United States)

    1996-12-31

    The Office of Environmental Management is bringing Headquarters and the Field together to implement process improvements throughout the Complex through a systematic process of organizational learning called benchmarking. Simply stated, benchmarking is a process of continuously comparing and measuring practices, processes, or methodologies with those of other private and public organizations. The EM benchmarking program, which began as the result of a recommendation from Xerox Corporation, is building trust and removing barriers to performance enhancement across the DOE organization. The EM benchmarking program is designed to be field-centered with Headquarters providing facilitatory and integrative functions on an "as needed" basis. One of the main goals of the program is to assist Field Offices and their associated M&O/M&I contractors develop the capabilities to do benchmarking for themselves. In this regard, a central precept is that in order to realize tangible performance benefits, program managers and staff -- the ones closest to the work - must take ownership of the studies. This avoids the "check the box" mentality associated with some third party studies. This workshop will provide participants with a basic level of understanding why the EM benchmarking team was developed and the nature and scope of its mission. Participants will also begin to understand the types of study levels and the particular methodology the EM benchmarking team is using to conduct studies. The EM benchmarking team will also encourage discussion on ways that DOE (both Headquarters and the Field) can team with its M&O/M&I contractors to conduct additional benchmarking studies. This "introduction to benchmarking" is intended to create a desire to know more and a greater appreciation of how benchmarking processes could be creatively employed to enhance performance.

  11. Benchmarking ICRF simulations for ITER

    Energy Technology Data Exchange (ETDEWEB)

    R. V. Budny, L. Berry, R. Bilato, P. Bonoli, M. Brambilla, R.J. Dumont, A. Fukuyama, R. Harvey, E.F. Jaeger, E. Lerche, C.K. Phillips, V. Vdovin, J. Wright, and members of the ITPA-IOS

    2010-09-28

    Benchmarking of full-wave solvers for ICRF simulations is performed using plasma profiles and equilibria obtained from integrated self-consistent modeling predictions of four ITER plasmas. One is for a high performance baseline (5.3 T, 15 MA) DT H-mode plasma. The others are for half-field, half-current plasmas of interest for the pre-activation phase with bulk plasma ion species being either hydrogen or He4. The predicted profiles are used by seven groups to predict the ICRF electromagnetic fields and heating profiles. Approximate agreement is achieved for the predicted heating power partitions for the DT and He4 cases. Profiles of the heating powers and electromagnetic fields are compared.

  12. Benchmarking Asteroid-Deflection Experiment

    Science.gov (United States)

    Remington, Tane; Bruck Syal, Megan; Owen, John Michael; Miller, Paul L.

    2016-10-01

    An asteroid impacting Earth could have devastating consequences. In preparation to deflect or disrupt one before it reaches Earth, it is imperative to have modeling capabilities that adequately simulate the deflection actions. Code validation is key to ensuring full confidence in simulation results used in an asteroid-mitigation plan. We are benchmarking well-known impact experiments using Spheral, an adaptive smoothed-particle hydrodynamics code, to validate our modeling of asteroid deflection. We describe our simulation results, compare them with experimental data, and discuss what we have learned from our work. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-695540

  13. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R&D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  14. COG validation: SINBAD Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Lent, E M; Sale, K E; Buck, R M; Descalle, M

    2004-02-23

    We validated COG, a 3D Monte Carlo radiation transport code, against experimental data and MCNP4C simulations from the Shielding Integral Benchmark Archive Database (SINBAD) compiled by RSICC. We modeled three experiments: the Osaka Nickel and Aluminum sphere experiments conducted at the OKTAVIAN facility, and the liquid oxygen experiment conducted at the FNS facility. COG results are in good agreement with experimental data and generally within a few % of MCNP results. There are several possible sources of discrepancy between MCNP and COG results: (1) the cross-section database versions are different, MCNP uses ENDFB VI 1.1 while COG uses ENDFB VIR7, (2) the code implementations are different, and (3) the models may differ slightly. We also limited the use of variance reduction methods when running the COG version of the problems.

  15. General benchmarks for quantum repeaters

    CERN Document Server

    Pirandola, Stefano

    2015-01-01

    Using a technique based on quantum teleportation, we simplify the most general adaptive protocols for key distribution, entanglement distillation and quantum communication over a wide class of quantum channels in arbitrary dimension. Thanks to this method, we bound the ultimate rates for secret key generation and quantum communication through single-mode Gaussian channels and several discrete-variable channels. In particular, we derive exact formulas for the two-way assisted capacities of the bosonic quantum-limited amplifier and the dephasing channel in arbitrary dimension, as well as the secret key capacity of the qubit erasure channel. Our results establish the limits of quantum communication with arbitrary systems and set the most general and precise benchmarks for testing quantum repeaters in both discrete- and continuous-variable settings.

  16. Realistic approach to estimate lens doses and cataract radiation risk in cardiology when personal dosimeters have not been regularly used.

    Science.gov (United States)

    Vañó, Eliseo; Fernández, José M; Sánchez, Roberto M; Dauer, Lawrence T

    2013-10-01

    Interventional fluoroscopically guided cardiac procedures lead to radiation exposure to the lenses of the eyes of cardiologists, which over time may be associated with an increased risk of cataracts. This study derives radiation doses to the lens of the eye in cardiac catheterization laboratories from measurements of individual procedures, to allow such doses to be estimated for cases when personal dosimeters have not been used regularly. Using active electronic dosimeters at the C-arm (at 95 cm from the isocenter), scatter radiation doses were measured for cardiac procedures, and radiation doses to the lenses of the cardiologists were estimated for different groups of procedures (diagnostic, PTCAs, and valvular). Correlation factors with the kerma-area product included in the patient dose reports were derived. The mean, median, and third quartile scatter dose values per procedure at the C-arm for 1,969 procedures were 0.99, 0.78 and 1.25 mSv, respectively; for coronary angiography, 0.51, 0.45, and 0.61 mSv, respectively; for PTCAs, 1.29, 1.07, and 1.56 mSv; and for valvular procedures, 1.64, 1.45, and 2.66 mSv, respectively. For all the procedures, the ratio between the scatter dose at the C-arm and the kerma-area product was between 10.3 and 11.3 μSv per Gy·cm². The experimental results of this study allow for realistic estimations of the dose to the lenses of the eyes from the workload of the cardiologists and from the level of use of radiation protection tools when personal dosimeters have not been regularly used.
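
    The conversion factor above (roughly 10-11 μSv of scatter dose at the C-arm per Gy·cm² of kerma-area product) lends itself to a simple retrospective estimate. The following is a hedged back-of-the-envelope sketch; the workload, protection factor, and the use of the C-arm scatter dose as a surrogate for the lens dose are assumptions for illustration, not the paper's exact procedure:

```python
# Hedged back-of-the-envelope sketch, not the paper's exact methodology: the
# scatter dose at the C-arm is approximated from the kerma-area product (KAP)
# recorded in patient dose reports; a correction for eye protection is then
# applied. The conversion factor, workload and protection factor are illustrative.

SCATTER_PER_KAP_uSv = 10.8   # scatter dose at the C-arm per unit KAP (uSv per Gy*cm^2)
PROTECTION_FACTOR = 0.3      # hypothetical reduction from leaded glasses / ceiling screen

def estimated_annual_lens_dose_mSv(kap_values_gycm2, protection=PROTECTION_FACTOR):
    """Rough annual lens-dose estimate (mSv) from per-procedure KAP values (Gy*cm^2)."""
    scatter_uSv = sum(kap * SCATTER_PER_KAP_uSv for kap in kap_values_gycm2)
    return scatter_uSv * protection / 1000.0

if __name__ == "__main__":
    workload = [50.0] * 250   # hypothetical year: 250 procedures at ~50 Gy*cm^2 each
    print(f"estimated annual lens dose: {estimated_annual_lens_dose_mSv(workload):.1f} mSv")
```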

  17. Integrating the Nqueens Algorithm into a Parameterized Benchmark Suite

    Science.gov (United States)

    2016-02-01

    … claim that autotuning is needed. However, they concentrate on a Message Passing Interface (MPI)/OpenCL approach, whereas we are benchmarking using only OpenCL. … 4. Backtrack Branch and Bound: The BBB algorithm is a way to search for a solution to a problem among a variety of potential solutions … heterogeneous computers. This is especially true when using a portable application program interface (API) such as OpenCL, which was used for this work. There
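
    The excerpt names a backtrack branch and bound (BBB) search for the N-queens problem. The report's OpenCL implementation is not reproduced here; the following is a plain Python sketch of the same search idea (place one queen per row, prune columns and diagonals that are already attacked):

```python
def count_nqueens_solutions(n):
    """Count N-queens solutions by backtracking: place one queen per row and
    prune (the 'bound' step) any column or diagonal that is already attacked."""
    cols, diag_sum, diag_diff = set(), set(), set()

    def place(row):
        if row == n:
            return 1
        total = 0
        for col in range(n):
            if col in cols or (row + col) in diag_sum or (row - col) in diag_diff:
                continue                                  # prune this branch
            cols.add(col); diag_sum.add(row + col); diag_diff.add(row - col)
            total += place(row + 1)                       # branch into the next row
            cols.remove(col); diag_sum.remove(row + col); diag_diff.remove(row - col)
        return total

    return place(0)

if __name__ == "__main__":
    for n in range(4, 11):
        print(n, count_nqueens_solutions(n))              # n = 8 gives 92 solutions
```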

  18. Benchmarking holiday experience: the case of senior tourists

    OpenAIRE

    Johann, M; Panchapakesan, P.

    2016-01-01

    WOS:000386788500013 (Web of Science accession number) Purpose: The purpose of this paper is to determine and benchmark senior tourists’ preferences by considering the importance they attach to, and their perception of, internal tourism attributes (i.e. package tour characteristics) and external tourism attributes (i.e. destination features). Design/methodology/approach: The present study makes use of importance-performance analysis and employs a paired-sample t-test for this pur...

  19. 42 CFR 440.330 - Benchmark health benefits coverage.

    Science.gov (United States)

    2010-10-01

    … 42 Public Health, 2010-10-01: Benchmark health benefits coverage, Section 440.330. … SERVICES (CONTINUED), MEDICAL ASSISTANCE PROGRAMS, SERVICES: GENERAL PROVISIONS, Benchmark Benefit and Benchmark-Equivalent Coverage. § 440.330 Benchmark health benefits coverage. Benchmark coverage is...

  20. Synergetic effect of benchmarking competitive advantages

    Directory of Open Access Journals (Sweden)

    N.P. Tkachova

    2011-12-01

    Full Text Available The essence of synergistic competitive benchmarking is analyzed. A classification of types of synergies is developed. The sources of synergy in benchmarking of competitive advantages are determined. A methodological framework for defining synergy in the formation of competitive advantage is proposed.

  1. Synergetic effect of benchmarking competitive advantages

    OpenAIRE

    N.P. Tkachova; P.G. Pererva

    2011-01-01

    The essence of synergistic competitive benchmarking is analyzed. A classification of types of synergies is developed. The sources of synergy in benchmarking of competitive advantages are determined. A methodological framework for defining synergy in the formation of competitive advantage is proposed.

  2. Benchmarking set for domestic smart grid management

    NARCIS (Netherlands)

    Bosman, M.G.C.; Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria

    2010-01-01

    In this paper we propose a benchmark for domestic smart grid management. It consists of an in-depth description of a domestic smart grid, in which local energy consumers, producers and buffers can be controlled. First, from this description a general benchmark framework is derived, which can be used

  3. Machines are benchmarked by code, not algorithms

    NARCIS (Netherlands)

    Poss, R.

    2013-01-01

    This article highlights how small modifications to either the source code of a benchmark program or the compilation options may impact its behavior on a specific machine. It argues that, when evaluating machines, benchmark providers and users should be careful to ensure reproducibility of results based on th

  4. Benchmark analysis of railway networks and undertakings

    NARCIS (Netherlands)

    Hansen, I.A.; Wiggenraad, P.B.L.; Wolff, J.W.

    2013-01-01

    Benchmark analysis of railway networks and companies has been stimulated by the European policy of deregulation of transport markets, the opening of national railway networks and markets to new entrants and separation of infrastructure and train operation. Recent international railway benchmarking s

  5. Benchmark Assessment for Improved Learning. AACC Report

    Science.gov (United States)

    Herman, Joan L.; Osmundson, Ellen; Dietel, Ronald

    2010-01-01

    This report describes the purposes of benchmark assessments and provides recommendations for selecting and using benchmark assessments--addressing validity, alignment, reliability, fairness and bias and accessibility, instructional sensitivity, utility, and reporting issues. We also present recommendations on building capacity to support schools'…

  6. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price

  7. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    2007-01-01

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price elasticit

  8. A Seafloor Benchmark for 3-dimensional Geodesy

    Science.gov (United States)

    Chadwell, C. D.; Webb, S. C.; Nooner, S. L.

    2014-12-01

    We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone

  9. Benchmark for license plate character segmentation

    Science.gov (United States)

    Gonçalves, Gabriel Resende; da Silva, Sirlene Pio Gomes; Menotti, David; Shwartz, William Robson

    2016-09-01

    Automatic license plate recognition (ALPR) has been the focus of much research in recent years. In general, ALPR is divided into the following problems: detection of on-track vehicles, license plate detection, segmentation of license plate characters, and optical character recognition (OCR). Even though commercial solutions are available for controlled acquisition conditions, e.g., the entrance of a parking lot, ALPR is still an open problem when dealing with data acquired from uncontrolled environments, such as roads and highways, when relying only on imaging sensors. Due to the multiple orientations and scales of the license plates captured by the camera, a very challenging task of ALPR is the license plate character segmentation (LPCS) step, because its effectiveness is required to be (near) optimal to achieve a high recognition rate by the OCR. To tackle the LPCS problem, this work proposes a benchmark composed of a dataset designed to focus specifically on the character segmentation step of the ALPR within an evaluation protocol. Furthermore, we propose the Jaccard-centroid coefficient, an evaluation measure more suitable than the Jaccard coefficient regarding the location of the bounding box within the ground-truth annotation. The dataset is composed of 2,000 Brazilian license plates consisting of 14,000 alphanumeric symbols and their corresponding bounding box annotations. We also present a straightforward approach to perform LPCS efficiently. Finally, we provide an experimental evaluation for the dataset based on five LPCS approaches and demonstrate the importance of character segmentation for achieving an accurate OCR.
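
    The Jaccard-centroid coefficient is the authors' own measure and its exact definition is not reproduced here, but the plain Jaccard (intersection-over-union) coefficient that it refines can be sketched for axis-aligned character bounding boxes; the boxes below are hypothetical:

```python
def jaccard(box_a, box_b):
    """Plain Jaccard (intersection-over-union) coefficient between two
    axis-aligned boxes given as (x_min, y_min, x_max, y_max)."""
    ax0, ay0, ax1, ay1 = box_a
    bx0, by0, bx1, by1 = box_b
    iw = max(0.0, min(ax1, bx1) - max(ax0, bx0))
    ih = max(0.0, min(ay1, by1) - max(ay0, by0))
    inter = iw * ih
    union = (ax1 - ax0) * (ay1 - ay0) + (bx1 - bx0) * (by1 - by0) - inter
    return inter / union if union > 0 else 0.0

if __name__ == "__main__":
    ground_truth = (10, 20, 40, 60)      # hypothetical character bounding box
    prediction = (12, 22, 42, 58)
    print(f"Jaccard coefficient: {jaccard(ground_truth, prediction):.3f}")
```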

  10. Fluconazole dosing predictions in critically-ill patients receiving prolonged intermittent renal replacement therapy: a Monte Carlo simulation approach.

    Science.gov (United States)

    Gharibian, Katherine N; Mueller, Bruce A

    2016-07-01

    Fluconazole is a renally-eliminated antifungal commonly used to treat Candida species infections. In critically-ill patients receiving prolonged intermittent renal replacement therapy (PIRRT), limited pharmacokinetic (PK) data are available to guide fluconazole dosing. We used previously-published fluconazole clearance data and PK data of critically-ill patients with acute kidney injury to develop a PK model with the goal of determining a therapeutic dosing regimen for critically-ill patients receiving PIRRT. Monte Carlo simulations were performed to create a virtual cohort of patients receiving different fluconazole dosing regimens. Plasma drug concentration-time profiles were evaluated for the probability of attaining a mean 24-hour area under the drug concentration-time curve to minimum inhibitory concentration ratio (AUC24h : MIC) of 100 during the initial 48 hours of antifungal therapy. At the susceptibility breakpoint of Candida albicans (2 mg/L), 93 - 96% of simulated subjects receiving PIRRT attained the pharmacodynamic target with a fluconazole 800-mg loading dose plus 400 mg twice daily (q12h or pre and post PIRRT) regimen. Monte Carlo simulations of a PK model of PIRRT provided a basis for the development of an informed fluconazole dosing recommendation when PK data were limited. This finding should be validated in the clinical setting.
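
    The probability-of-target-attainment logic described above can be sketched with a deliberately simplified model: sample a total clearance for each virtual patient, approximate AUC24 as the daily dose divided by clearance, and count how many subjects reach an AUC24:MIC of at least 100. The distribution parameters below are invented and are not the study's population PK model:

```python
import numpy as np

rng = np.random.default_rng(42)

def probability_of_target_attainment(daily_dose_mg, mic_mg_l,
                                     target_auc_mic=100.0, n_patients=10_000):
    """Crude steady-state approximation: AUC over 24 h = daily dose / clearance.
    Clearance lumps residual renal function and PIRRT removal and is sampled
    from a hypothetical log-normal distribution (median 1.0 L/h)."""
    cl_l_per_h = rng.lognormal(mean=np.log(1.0), sigma=0.35, size=n_patients)
    auc24_mg_h_per_l = daily_dose_mg / cl_l_per_h
    return float(np.mean(auc24_mg_h_per_l / mic_mg_l >= target_auc_mic))

if __name__ == "__main__":
    # 400 mg twice daily (800 mg/day) evaluated against a range of MIC values.
    for mic in (2.0, 4.0, 8.0):
        pta = probability_of_target_attainment(800.0, mic)
        print(f"MIC {mic} mg/L: PTA = {100 * pta:.1f} %")
```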

  11. Combined magnetic resonance urography and targeted helical CT in patients with renal colic: a new approach to reduce delivered dose.

    Science.gov (United States)

    Blandino, Alfredo; Minutoli, Fabio; Scribano, Emanuele; Vinci, Sergio; Magno, Carlo; Pergolizzi, Stefano; Settineri, Nicola; Pandolfo, Ignazio; Gaeta, Michele

    2004-08-01

    To determine whether magnetic resonance urography (MRU), obtained before helical computed tomography (CT) in patients with acute renal colic, can help delimit the obstructed area to be subsequently examined by a targeted CT scan, thus reducing the dose of radiation. Fifty-one patients with symptoms of acute renal colic underwent MRU and total urinary tract helical CT. CT images covering the 5 cm below the level of ureteral obstruction, as demonstrated by MRU, were then selected. Combined interpretation of MRU and the selected CT images constituted protocol A. Protocol B consisted of the entire unenhanced helical CT of the urinary tract. The two protocols were compared regarding the following points: 1) sensitivity in diagnosing the presence of obstructing urinary stones, and 2) the delivered radiation dose. Protocol A and protocol B had, respectively, 98% and 100% sensitivity in demonstrating a ureteral stone as the cause of renal colic. The estimated average dose calculated from a phantom study was 0.52 mSv for protocol A and 2.83 mSv for protocol B. Therefore, the effective radiation dose was 5.4 times lower in protocol A compared to protocol B. Combined MRU and short helical CT has a high sensitivity in detecting ureteral calculi with a reduced radiation dose. Copyright 2004 Wiley-Liss, Inc.

  12. Analysis of ovarian dose of women employed in the radium watch dial industry: A macrodosimetric and microdosimetric approach

    Energy Technology Data Exchange (ETDEWEB)

    Roeske, J.C. [Univ. of Chicago, IL (United States). Dept. of Radiation and Cellular Oncology; Stinchcomb, T.G. [DePaul Univ., Chicago, IL (United States). Dept. of Physics; Schieve, L. [Univ. of Illinois, Chicago, IL (United States). Dept. of Epidemiology and Biostatistics; Keane, A. [Argonne National Lab., IL (United States). Environmental, Safety and Health Div.

    1999-01-01

    In the 1920s, painters in the radium watch dial industry frequently tipped their brushes with their tongues, resulting in the ingestion of radium-226 and/or radium-228. Earlier dosimetric studies (1950–1990) attempted to correlate the magnitude of biological effects (e.g., increased cancer incidence) with variations in radium uptake. Recently, there has been a renewed interest on the part of epidemiologists studying additional possible effects (e.g., low birthrate and sex ratio). The goal of this work is to review and update the determination of dose to the ovaries from both external and internal radiation hazards in an attempt to correlate ovarian dose with these additional possible effects. The dose to the ovaries can be attributed to four major sources: (1) external gamma irradiation from the containers of radium paint; (2) alpha and (3) beta particle emissions due to sources which decay within the ovaries; and (4) internal gamma irradiation released throughout the body. Data obtained in earlier dosimetric studies on the quantity of Ra-226 and/or Ra-228 ingested were used in this study. Dose is estimated on a macroscopic scale by calculating the average dose deposited within the entire ovary. In addition, a microdosimetric analysis is performed which considers the statistical variation of energy deposited within individual oocyte nuclei. Sources of uncertainty, and the use of these data in new epidemiological studies are discussed.

  13. Radiation dose and intra-articular access: comparison of the lateral mortise and anterior midline approaches to fluoroscopically guided tibiotalar joint injections

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Ambrose J.; Torriani, Martin; Bredella, Miriam A.; Chang, Connie Y.; Simeone, Frank J.; Palmer, William E. [Massachusetts General Hospital, Department of Radiology, Division of Musculoskeletal Imaging and Intervention, Boston, MA (United States); Balza, Rene [Centro Medico de Occidente, Department of Radiology, Maracaibo (Venezuela, Bolivarian Republic of)

    2016-03-15

    To compare the lateral mortise and anterior midline approaches to fluoroscopically guided tibiotalar joint injections with respect to successful intra-articular needle placement, fluoroscopy time, radiation dose, and dose area product (DAP). This retrospective study was IRB-approved and HIPAA-compliant. 498 fluoroscopically guided tibiotalar joint injections were performed or supervised by one of nine staff radiologists from 11/1/2010-12/31/2013. The injection approach was determined by operator preference. Images were reviewed on a PACS workstation to determine the injection approach (lateral mortise versus anterior midline) and to confirm intra-articular needle placement. Fluoroscopy time (minutes), radiation dose (mGy), and DAP (μGy·m²) were recorded and compared using the Student's t-test (fluoroscopy time) or the Wilcoxon rank sum test (radiation dose and DAP). There were 246 lateral mortise injections and 252 anterior midline injections. Two lateral mortise injections were excluded from further analysis because no contrast was administered. Intra-articular location of the needle tip was documented in 242/244 lateral mortise injections and 252/252 anterior midline injections. Mean fluoroscopy time was shorter for the lateral mortise group than the anterior midline group (0.7 ± 0.5 min versus 1.2 ± 0.8 min, P < 0.0001). Mean radiation dose and DAP were less for the lateral mortise group than the anterior midline group (2.1 ± 3.7 mGy versus 2.5 ± 3.5 mGy, P = 0.04; 11.5 ± 15.3 μGy·m² versus 13.5 ± 17.3 μGy·m², P = 0.006). Both injection approaches resulted in nearly 100 % rates of intra-articular needle placement, but the lateral mortise approach used approximately 40 % less fluoroscopy time and delivered 15 % lower radiation dose and DAP to the patient. (orig.)

  14. A proposed benchmark problem for cargo nuclear threat monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, Thomas Wesley, E-mail: twholmes@ncsu.edu [Center for Engineering Applications of Radioisotopes, Nuclear Engineering Department, North Carolina State University, Raleigh, NC 27695-7909 (United States); Calderon, Adan; Peeples, Cody R.; Gardner, Robin P. [Center for Engineering Applications of Radioisotopes, Nuclear Engineering Department, North Carolina State University, Raleigh, NC 27695-7909 (United States)

    2011-10-01

    There is currently a great deal of technical and political effort focused on reducing the risk of potential attacks on the United States involving radiological dispersal devices or nuclear weapons. This paper proposes a benchmark problem for gamma-ray and X-ray cargo monitoring with results calculated using MCNP5, v1.51. The primary goal is to provide a benchmark problem that will allow researchers in this area to evaluate Monte Carlo models for both speed and accuracy in both forward and inverse calculational codes and approaches for nuclear security applications. A previous benchmark problem was developed by one of the authors (RPG) for two similar oil well logging problems (Gardner and Verghese, 1991). One of those benchmarks has recently been used by at least two researchers in the nuclear threat area to evaluate the speed and accuracy of Monte Carlo codes combined with variance reduction techniques. This apparent need has prompted us to design this benchmark problem specifically for the nuclear threat researcher. This benchmark consists of a conceptual design and preliminary calculational results using gamma-ray interactions on a system containing three thicknesses of three different shielding materials. A point source is placed inside three materials: lead, aluminum, and plywood. The first two materials are in right circular cylindrical form while the third is a cube. The entire system rests on a sufficiently thick lead base so as to reduce undesired scattering events. The configuration was arranged in such a manner that as a gamma ray moves outward from the source it first passes through the lead circular cylinder, then the aluminum circular cylinder, and finally the wooden cube before reaching the detector. A 2 in. × 4 in. × 16 in. box-style NaI(Tl) detector was placed 1 m from the point source located at the center, with the 4 in. × 16 in. side facing the system. The two sources used in the benchmark are ¹³⁷Cs and ²³⁵U.

  15. GPUs benchmarking in subpixel image registration algorithm

    Science.gov (United States)

    Sanz-Sabater, Martin; Picazo-Bueno, Jose Angel; Micó, Vicente; Ferrerira, Carlos; Granero, Luis; Garcia, Javier

    2015-05-01

    Image registration techniques are used across different scientific fields, such as medical imaging and optical metrology. The most straightforward way to calculate the shift between two images is cross correlation, taking the position of the highest value in the correlation image. The shift resolution is then given in whole pixels, which may not be enough for certain applications. Better results can be achieved by interpolating both images up to the desired resolution and applying the same technique, but the memory needed by the system is significantly higher. To avoid this memory consumption, we implement a subpixel shifting method based on the FFT. Working with the original images, subpixel shifting can be achieved by multiplying their discrete Fourier transforms by linear phases with different slopes. This method is very time consuming because checking each candidate shift requires new calculations. The algorithm, being highly parallelizable, is well suited to high-performance computing systems. GPU (Graphics Processing Unit) accelerated computing became very popular more than ten years ago because GPUs provide hundreds of computational cores on a reasonably cheap card. In our case, we register the shift between two images by doing a first approach with FFT-based correlation and then refining with the subpixel technique described above; we consider this a 'brute force' method. We present a benchmark of the algorithm consisting of a first approach (pixel resolution) followed by subpixel refinement, decreasing the shift step in every loop to achieve high resolution in a few steps. The program was executed on three different computers. Finally, we present the results of the computation with different kinds of CPUs and GPUs, checking the accuracy of the method and the time consumed on each computer, and discussing the advantages and disadvantages of using GPUs.
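
    The linear-phase trick described above (shifting an image by a fraction of a pixel by multiplying its DFT by a phase ramp) and the brute-force search over candidate shifts can be sketched as follows; the fixed search grid replaces the progressively decreasing step described in the abstract, and the test images are synthetic:

```python
import numpy as np

def phase_shift(image, dy, dx):
    """Shift an image by (dy, dx) pixels (possibly fractional) by multiplying
    its DFT by a linear phase ramp."""
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    ramp = np.exp(-2j * np.pi * (fy * dy + fx * dx))
    return np.fft.ifft2(np.fft.fft2(image) * ramp).real

def brute_force_subpixel_shift(ref, mov, search=1.0, step=0.1):
    """'Brute force' search: try candidate subpixel shifts of `mov` and keep the
    one that best matches `ref` (minimum sum of squared differences)."""
    best, best_err = (0.0, 0.0), np.inf
    for dy in np.arange(-search, search + step / 2, step):
        for dx in np.arange(-search, search + step / 2, step):
            err = np.sum((phase_shift(mov, dy, dx) - ref) ** 2)
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best

if __name__ == "__main__":
    y, x = np.mgrid[0:64, 0:64]
    ref = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / 50.0)   # toy image
    mov = np.exp(-((x - 32.4) ** 2 + (y - 31.7) ** 2) / 50.0)   # same blob moved by (dy, dx) = (-0.3, +0.4)
    dy, dx = brute_force_subpixel_shift(ref, mov)
    print(f"shift that realigns the moving image: dy = {dy:+.1f}, dx = {dx:+.1f}")   # expect about +0.3, -0.4
```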

  16. A molecular and phenotypic integrative approach to identify a no-effect dose level for antiandrogen-induced testicular toxicity.

    Science.gov (United States)

    Ludwig, Sophie; Tinwell, Helen; Schorsch, Frédéric; Cavaillé, Christel; Pallardy, Marc; Rouquié, David; Bars, Rémi

    2011-07-01

    The safety assessment of chemicals for humans relies on identifying no-observed adverse effect levels (NOAELs) in animal toxicity studies using standard methods. With the advent of high information content technologies, especially microarrays, it is pertinent to determine the impact of molecular data on the NOAELs. Consequently, we conducted an integrative study to identify a no-transcriptomic-effect dose using microarray analyses coupled with quantitative reverse transcriptase PCR (RT-qPCR) and determined how this correlated with the NOAEL. We assessed the testicular effects of the antiandrogen, flutamide (FM), in a rat 28-day toxicity study using doses of 0.2-30 mg/kg/day. Plasma testosterone levels and testicular histopathology indicated a NOAEL of 1 mg/kg/day. A no-effect dose of 0.2 mg/kg/day was established based on molecular data relevant to the phenotypic changes. We observed differential gene expression starting from 1 mg/kg/day and a deregulation of more than 1500 genes at 30 mg/kg/day. Dose-related changes were identified for the major pathways (e.g., fatty acid metabolism) associated with the testicular lesion (Leydig cell hyperplasia) that were confirmed by RT-qPCR. These data, along with protein accumulation profiles and FM metabolite concentrations in the testis, supported the no-effect dose of 0.2 mg/kg/day. Furthermore, the microarray data indicated a dose-dependent change in the fatty acid catabolism pathway, a biological process described for the first time to be affected by FM in testicular tissue. In conclusion, the present data indicate the existence of a transcriptomic threshold, which must be exceeded to progress from a normal state to an adaptive state and subsequently to adverse toxicity.

  17. SU-E-T-117: Dose to Organs Outside of CT Scan Range- Monte Carlo and Hybrid Phantom Approach

    Energy Technology Data Exchange (ETDEWEB)

    Pelletier, C; Jung, J [East Carolina University, Greenville, NC (United States); Lee, C [University of Michigan, Ann Arbor, MI (United States); Kim, J [University of Pittsburgh Medical Center, Pittsburgh, PA (United States); Lee, C [National Cancer Institute, Rockville, MD (United States)

    2014-06-01

    Purpose: Epidemiological study of second cancer risk for cancer survivors often requires the dose to normal tissues located outside the anatomy covered by radiological imaging, which is usually limited to tumor and organs at risk. We have investigated the feasibility of using whole body computational human phantoms for estimating out-of-field organ doses for patients treated by Intensity Modulated Radiation Therapy (IMRT). Methods: Identical 7-field IMRT prostate plans were performed using X-ray Voxel Monte Carlo (XVMC), a radiotherapy-specific Monte Carlo transport code, on the computed tomography (CT) images of the torso of an adult male patient (175 cm height, 66 kg weight) and an adult male hybrid computational phantom with the equivalent body size. Doses to the liver, right lung, and left lung were calculated and compared. Results: Considerable differences are seen between the doses calculated by XVMC for the patient CT and the hybrid phantom. One major contributing factor is the treatment method, deep inspiration breath hold (DIBH), used for this patient. This leads to significant differences in the organ position relative to the treatment isocenter. The transverse distances from the treatment isocenter to the inferior border of the liver, left lung, and right lung are 19.5cm, 29.5cm, and 30.0cm, respectively for the patient CT, compared with 24.3cm, 36.6cm, and 39.1cm, respectively, for the hybrid phantom. When corrected for the distance, the mean doses calculated using the hybrid phantom are within 28% of those calculated using the patient CT. Conclusion: This study showed that mean dose to the organs located in the missing CT coverage can be reconstructed by using whole body computational human phantoms within reasonable dosimetric uncertainty; however, appropriate corrections may be necessary if the patient is treated with a technique that will significantly deform the size or location of the organs relative to the hybrid phantom.

  18. Fundamental approach to the design of a dose-rate calculation program for use in brachytherapy planning

    Energy Technology Data Exchange (ETDEWEB)

    Cassell, K.J. (Saint Luke' s Hospital, Guildford (UK))

    1983-02-01

    A method, developed from the Quantisation Method, of calculating dose-rate distributions around uniformly and non-uniformly loaded brachytherapy sources is described. It allows accurate and straightforward corrections for oblique filtration and self-absorption to be made. Using this method, dose-rate distributions have been calculated for sources of radium 226, gold 198, iridium 192, caesium 137 and cobalt 60, all of which show very good agreement with existing measured and calculated data. This method is now the basis of the Interstitial and Intracavitary Dosimetry (IID) program on the General Electric RT/PLAN computerised treatment planning system.
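
    The record does not give the algorithm in detail, but the general idea it describes, building the dose-rate distribution of a linear source from many point-source contributions with a correction for oblique filtration, can be sketched numerically. The geometry, attenuation coefficient, and wall thickness below are invented, and the actual IID program differs:

```python
import numpy as np

def dose_rate_linear_source(h_cm, z_cm, length_cm=3.0, total_activity=1.0,
                            mu_per_cm=1.8, wall_cm=0.05, n_segments=200):
    """Relative dose rate at a point (perpendicular distance h_cm from the source
    axis, axial position z_cm) around a uniformly loaded linear source, computed
    by splitting the source into point sources (a numerical Sievert-type sum).
    Each segment contributes by the inverse square law, attenuated along the
    oblique path wall_cm / cos(theta) through the capsule wall. All constants
    here are illustrative, not measured source data."""
    z_seg = np.linspace(-length_cm / 2, length_cm / 2, n_segments)
    seg_activity = total_activity / n_segments
    r = np.sqrt(h_cm ** 2 + (z_cm - z_seg) ** 2)   # point-to-segment distances
    cos_theta = h_cm / r                           # angle relative to the transverse plane
    oblique_path = wall_cm / cos_theta             # filter thickness actually traversed
    return float(np.sum(seg_activity * np.exp(-mu_per_cm * oblique_path) / r ** 2))

if __name__ == "__main__":
    # Relative dose rates on the transverse bisector and towards one end of the source.
    for h, z in [(1.0, 0.0), (2.0, 0.0), (1.0, 1.5)]:
        print(f"point (h = {h} cm, z = {z} cm): {dose_rate_linear_source(h, z):.4f}")
```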

  19. Utilizing Benchmarking to Study the Effectiveness of Parent-Child Interaction Therapy Implemented in a Community Setting

    Science.gov (United States)

    Self-Brown, Shannon; Valente, Jessica R.; Wild, Robert C.; Whitaker, Daniel J.; Galanter, Rachel; Dorsey, Shannon; Stanley, Jenelle

    2012-01-01

    Benchmarking is a program evaluation approach that can be used to study whether the outcomes of parents/children who participate in an evidence-based program in the community approximate the outcomes found in randomized trials. This paper presents a case illustration using benchmarking methodology to examine a community implementation of…

  20. Cyclosporine Regimens in Plaque Psoriasis: An Overview with Special Emphasis on Dose, Duration, and Old and New Treatment Approaches

    Directory of Open Access Journals (Sweden)

    M. D. Colombo

    2013-01-01

    Full Text Available Cyclosporine A (CsA) is one of the most effective systemic drugs available for the treatment of psoriasis, as evidenced by the results of several randomized studies and by prolonged experience in the dermatological setting. In clinical practice, CsA is usually used for the induction of psoriasis remission at a daily dose in the range of 2.5–5 mg/kg and with intermittent short-term regimens, lasting on average 3–6 months. The magnitude and rapidity of response are dose dependent, as is the risk of development of adverse events. Therefore, the dose should be tailored to the patient’s needs and general characteristics and adjusted during the treatment course according to both efficacy and tolerability. Some studies support the feasibility of pulse administration of CsA for a few days per week for both the induction and the maintenance of response in psoriasis patients. This paper will review the data on CsA regimens for plaque-type psoriasis and will focus on dose, treatment duration, novel schedules, and the role of CsA in combination therapies, including the association with biologicals.

  1. Use of a protocolized approach to the management of sepsis can improve time to first dose of antibiotics.

    Science.gov (United States)

    Tipler, Pamela S; Pamplin, Jeremy; Mysliwiec, Vincent; Anderson, David; Mount, Cristin A

    2013-04-01

    The Surviving Sepsis Guidelines established recommendations for early recognition and rapid treatment of patients with sepsis. Recognizing systemic difficulties that delayed the application of early goal-directed therapy, the Emergency Department and Critical Care leadership instituted a sepsis protocol to identify patients with sepsis and expedite antibiotic delivery. We aimed to determine if the sepsis protocol improved the time to first dose of antibiotics in patients diagnosed with sepsis. We performed a retrospective chart review of patients with sepsis comparing the time from antibiotic order placement to the first dose of antibiotic therapy over a 3-year period. Patients who received vancomycin and ciprofloxacin underwent additional subgroup analysis, as these antibiotics were made available by protocol for use without infectious disease consultation. The average time to first dose of antibiotics for the presepsis protocol group was 160 minutes, and the average time for the sepsis protocol group was 99 minutes. Fifty-eight patients received vancomycin, and 30 received ciprofloxacin, with a decrease in time of 65 minutes and 41 minutes, respectively. Initiation of a sepsis protocol, which emphasizes early goal-directed therapy, can improve time to administration of first dose of antibiotics. Published by Elsevier Inc.

  2. A KPI framework for process-based benchmarking of hospital information systems.

    Science.gov (United States)

    Jahn, Franziska; Winter, Alfred

    2011-01-01

    Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest to benchmark HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.

  3. ICSBEP Benchmarks For Nuclear Data Applications

    Science.gov (United States)

    Briggs, J. Blair

    2005-05-01

    The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was initiated in 1992 by the United States Department of Energy. The ICSBEP became an official activity of the Organization for Economic Cooperation and Development (OECD) — Nuclear Energy Agency (NEA) in 1995. Representatives from the United States, United Kingdom, France, Japan, the Russian Federation, Hungary, Republic of Korea, Slovenia, Serbia and Montenegro (formerly Yugoslavia), Kazakhstan, Spain, Israel, Brazil, Poland, and the Czech Republic are now participating. South Africa, India, China, and Germany are considering participation. The purpose of the ICSBEP is to identify, evaluate, verify, and formally document a comprehensive and internationally peer-reviewed set of criticality safety benchmark data. The work of the ICSBEP is published as an OECD handbook entitled "International Handbook of Evaluated Criticality Safety Benchmark Experiments." The 2004 Edition of the Handbook contains benchmark specifications for 3331 critical or subcritical configurations that are intended for use in validation efforts and for testing basic nuclear data. New to the 2004 Edition of the Handbook is a draft criticality alarm / shielding type benchmark that should be finalized in 2005 along with two other similar benchmarks. The Handbook is being used extensively for nuclear data testing and is expected to be a valuable resource for code and data validation and improvement efforts for decades to come. Specific benchmarks that are useful for testing structural materials such as iron, chromium, nickel, and manganese; beryllium; lead; thorium; and 238U are highlighted.

  4. The Isprs Benchmark on Indoor Modelling

    Science.gov (United States)

    Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.

    2017-09-01

    Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.

  5. Plans to update benchmarking tool.

    Science.gov (United States)

    Stokoe, Mark

    2013-02-01

    The use of the current AssetMark system by hospital health facilities managers and engineers (in Australia) has decreased to a point of no activity occurring. A number of reasons have been cited, including cost, time required, slow process, and level of information required. Based on current levels of activity, it would not be of any value to IHEA, or to its members, to continue with this form of AssetMark. For AssetMark to remain viable, it needs to be developed as a tool seen to be of value to healthcare facilities managers, and not just healthcare facility engineers. Benchmarking is still a very important requirement in the industry, and AssetMark can fulfil this need provided that it remains abreast of customer needs. The proposed future direction is to develop an online version of AssetMark with its current capabilities regarding capturing of data (12 Key Performance Indicators), reporting, and user interaction. The system would also provide end-users with access to live reporting features via a user-friendly web interface linked through the IHEA web page.

  6. Academic Benchmarks for Otolaryngology Leaders.

    Science.gov (United States)

    Eloy, Jean Anderson; Blake, Danielle M; D'Aguillo, Christine; Svider, Peter F; Folbe, Adam J; Baredes, Soly

    2015-08-01

    This study aimed to characterize current benchmarks for academic otolaryngologists serving in positions of leadership and identify factors potentially associated with promotion to these positions. Information regarding chairs (or division chiefs), vice chairs, and residency program directors was obtained from faculty listings and organized by degree(s) obtained, academic rank, fellowship training status, sex, and experience. Research productivity was characterized by (a) successful procurement of active grants from the National Institutes of Health and prior grants from the American Academy of Otolaryngology-Head and Neck Surgery Foundation Centralized Otolaryngology Research Efforts program and (b) scholarly impact, as measured by the h-index. Chairs had the greatest amount of experience (32.4 years) and were the least likely to have multiple degrees, with 75.8% having an MD degree only. Program directors were the most likely to be fellowship trained (84.8%). Women represented 16% of program directors, 3% of chairs, and no vice chairs. Chairs had the highest scholarly impact (as measured by the h-index) and the greatest external grant funding. This analysis characterizes the current picture of leadership in academic otolaryngology. Chairs, when compared to their vice chair and program director counterparts, had more experience and greater research impact. Women were poorly represented among all academic leadership positions. © The Author(s) 2015.

  7. Benchmarking Measures of Network Influence

    Science.gov (United States)

    Bramson, Aaron; Vandermarliere, Benjamin

    2016-01-01

    Identifying key agents for the transmission of diseases (ideas, technology, etc.) across social networks has predominantly relied on measures of centrality on a static base network or a temporally flattened graph of agent interactions. Various measures have been proposed as the best trackers of influence, such as degree centrality, betweenness, and k-shell, depending on the structure of the connectivity. We consider SIR and SIS propagation dynamics on a temporally-extruded network of observed interactions and measure the conditional marginal spread as the change in the magnitude of the infection given the removal of each agent at each time: its temporal knockout (TKO) score. We argue that this TKO score is an effective benchmark measure for evaluating the accuracy of other, often more practical, measures of influence. We find that none of the network measures applied to the induced flat graphs are accurate predictors of network propagation influence on the systems studied; however, temporal networks and the TKO measure provide the requisite targets for the search for effective predictive measures. PMID:27670635
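
    As a rough sketch of the temporal knockout idea described above (not the authors' implementation), the toy below measures an agent's influence as the drop in expected spread size when that agent is removed from a time-ordered contact list. For brevity the propagation is a simple susceptible-infected process rather than full SIR/SIS; all function names, parameters, and the example contacts are hypothetical.

```python
# Illustrative sketch only: a temporal-knockout-style score computed as the drop
# in expected spread size when one agent is removed. Propagation is a simplified
# susceptible-infected (SI) process over a time-ordered contact list.
import random

def spread_size(contacts, seed, beta=0.5, removed=frozenset(), trials=500):
    """Average number of nodes eventually infected by an SI process that follows
    the contacts (t, u, v) in time order, transmitting with probability beta."""
    total = 0
    for _ in range(trials):
        infected = {seed} - set(removed)
        for _, u, v in sorted(contacts):            # process contacts in time order
            for src, dst in ((u, v), (v, u)):
                if src in infected and dst not in infected and dst not in removed:
                    if random.random() < beta:
                        infected.add(dst)
        total += len(infected)
    return total / trials

def knockout_score(contacts, seed, agent, **kw):
    """Temporal-knockout-style influence: loss in expected spread when `agent` is removed."""
    return spread_size(contacts, seed, **kw) - spread_size(contacts, seed, removed={agent}, **kw)

contacts = [(0, "a", "b"), (1, "b", "c"), (2, "c", "d"), (3, "b", "d")]
print(knockout_score(contacts, seed="a", agent="b"))
```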

  8. Acoustic dose and acoustic dose-rate.

    Science.gov (United States)

    Duck, Francis

    2009-10-01

    Acoustic dose is defined as the energy deposited by absorption of an acoustic wave per unit mass of the medium supporting the wave. Expressions for acoustic dose and acoustic dose-rate are given for plane-wave conditions, including temporal and frequency dependencies of energy deposition. The relationship between the acoustic dose-rate and the resulting temperature increase is explored, as is the relationship between acoustic dose-rate and radiation force. Energy transfer from the wave to the medium by means of acoustic cavitation is considered, and an approach is proposed in principle that could allow cavitation to be included within the proposed definitions of acoustic dose and acoustic dose-rate.
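
    A minimal numerical sketch of these definitions, assuming the commonly quoted plane-wave relations (heating rate per unit volume q = 2·α·I, with α the amplitude absorption coefficient and I the time-averaged intensity); the tissue-like numbers in the example are illustrative and are not taken from the paper.

```python
# Minimal sketch, assuming the commonly used plane-wave relations: the heating
# rate per unit volume is q = 2*alpha*I, so the acoustic dose-rate per unit mass
# is 2*alpha*I/rho and the acoustic dose for a constant exposure is dose_rate * t.
# The numerical values below are illustrative placeholders.

def acoustic_dose_rate(intensity_w_m2, alpha_np_per_m, density_kg_m3):
    """Acoustic dose-rate in Gy/s (J/kg/s) for plane-wave absorption."""
    return 2.0 * alpha_np_per_m * intensity_w_m2 / density_kg_m3

def acoustic_dose(intensity_w_m2, alpha_np_per_m, density_kg_m3, exposure_s):
    """Acoustic dose in Gy for a constant-intensity exposure."""
    return acoustic_dose_rate(intensity_w_m2, alpha_np_per_m, density_kg_m3) * exposure_s

# Example: soft-tissue-like values at roughly 1 MHz (illustrative only)
print(acoustic_dose(intensity_w_m2=1.0e4,      # 1 W/cm^2
                    alpha_np_per_m=5.0,        # ~0.05 Np/cm
                    density_kg_m3=1050.0,
                    exposure_s=60.0))
```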

  9. Benchmark analysis for quantifying urban vulnerability to terrorist incidents.

    Science.gov (United States)

    Piegorsch, Walter W; Cutter, Susan L; Hardisty, Frank

    2007-12-01

    We describe a quantitative methodology to characterize the vulnerability of U.S. urban centers to terrorist attack, using a place-based vulnerability index and a database of terrorist incidents and related human casualties. Via generalized linear statistical models, we study the relationships between vulnerability and terrorist events, and find that our place-based vulnerability metric significantly describes both terrorist incidence and occurrence of human casualties from terrorist events in these urban centers. We also introduce benchmark analytic technologies from applications in toxicological risk assessment to this social risk/vulnerability paradigm, and use these to distinguish levels of high and low urban vulnerability to terrorism. It is seen that the benchmark approach translates quite flexibly from its biological roots to this social scientific archetype.
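
    To make the borrowed benchmark idea concrete, the sketch below shows a benchmark-dose-style calculation on a hypothetical fitted logistic model: it solves for the vulnerability-index value at which the extra risk over background reaches a chosen benchmark response (BMR). The coefficients are placeholders, not the paper's estimates.

```python
# Hedged sketch of the benchmark-analysis idea carried over from toxicological
# risk assessment: given a fitted logistic dose-response model linking a
# place-based vulnerability index to the probability of an adverse outcome,
# find the index value at which the extra risk over background equals the BMR.
import math

def prob(d, b0, b1):
    """Logistic model for the probability of an adverse outcome at index value d."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * d)))

def extra_risk(d, b0, b1):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p0 = prob(0.0, b0, b1)
    return (prob(d, b0, b1) - p0) / (1.0 - p0)

def benchmark_value(bmr, b0, b1, lo=0.0, hi=100.0, tol=1e-8):
    """Smallest index value whose extra risk equals the BMR (bisection solve)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if extra_risk(mid, b0, b1) < bmr:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example with hypothetical fitted coefficients and a 10% benchmark response
print(benchmark_value(bmr=0.10, b0=-3.0, b1=0.05))
```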

  10. Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System

    Science.gov (United States)

    Kurtoglu, Tolga; Jensen, David; Poll, Scott

    2009-01-01

    Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support to perform large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS), developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis to the testing of diagnostic software and is used to provide performance assessment for different diagnostic algorithms.

  11. Benchmarking strategies for measuring the quality of healthcare: problems and prospects.

    Science.gov (United States)

    Lovaglio, Pietro Giorgio

    2012-01-01

    Over the last few years, increasing attention has been directed toward the problems inherent to measuring the quality of healthcare and implementing benchmarking strategies. Besides offering accreditation and certification processes, recent approaches measure the performance of healthcare institutions in order to evaluate their effectiveness, defined as the capacity to provide treatment that modifies and improves the patient's state of health. This paper, dealing with hospital effectiveness, focuses on research methods for effectiveness analyses within a strategy comparing different healthcare institutions. The paper, after having introduced readers to the principal debates on benchmarking strategies, which depend on the perspective and type of indicators used, focuses on the methodological problems related to performing consistent benchmarking analyses. In particular, statistical methods suitable for controlling case-mix, analyzing aggregate data, rare events, and continuous outcomes measured with error are examined. Specific challenges of benchmarking strategies, such as the risk of risk adjustment (case-mix fallacy, underreporting, risk of comparing noncomparable hospitals), selection bias, and possible strategies for the development of consistent benchmarking analyses, are discussed. Finally, to demonstrate the feasibility of the illustrated benchmarking strategies, an application focused on determining regional benchmarks for patient satisfaction (using the 2009 Lombardy Region Patient Satisfaction Questionnaire) is proposed.

  12. Effect of noise correlations on randomized benchmarking

    Science.gov (United States)

    Ball, Harrison; Stace, Thomas M.; Flammia, Steven T.; Biercuk, Michael J.

    2016-02-01

    Among the most popular and well-studied quantum characterization, verification, and validation techniques is randomized benchmarking (RB), an important statistical tool used to characterize the performance of physical logic operations useful in quantum information processing. In this work we provide a detailed mathematical treatment of the effect of temporal noise correlations on the outcomes of RB protocols. We provide a fully analytic framework capturing the accumulation of error in RB expressed in terms of a three-dimensional random walk in "Pauli space." Using this framework we derive the probability density function describing RB outcomes (averaged over noise) for both Markovian and correlated errors, which we show is generally described by a Γ distribution with shape and scale parameters depending on the correlation structure. Long temporal correlations impart large nonvanishing variance and skew in the distribution towards high-fidelity outcomes—consistent with existing experimental data—highlighting potential finite-sampling pitfalls and the divergence of the mean RB outcome from worst-case errors in the presence of noise correlations. We use the filter-transfer function formalism to reveal the underlying reason for these differences in terms of effective coherent averaging of correlated errors in certain random sequences. We conclude by commenting on the impact of these calculations on the utility of single-metric approaches to quantum characterization, verification, and validation.
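
    A toy numerical sketch of the paper's central point, under strong simplifications: error accumulation over a random sequence is modelled as a one-dimensional random walk whose step directions are scrambled by the gates, with step magnitudes drawn independently per gate (Markovian) or once per sequence (strongly correlated). The mean sequence infidelity is the same in both cases, but the correlated case shows a much larger variance across sequences; all parameters are illustrative and this is not the paper's three-dimensional Pauli-space derivation.

```python
# Toy sketch (not the paper's framework) of why noise correlations reshape the
# distribution of randomized-benchmarking outcomes: sequence infidelity is taken
# proportional to the squared end-to-end displacement of a random walk whose step
# magnitudes are either independent per gate or identical across the sequence.
import random, statistics

def sequence_infidelities(length, sigma, correlated, n_sequences=2000):
    out = []
    for _ in range(n_sequences):
        common = random.gauss(0.0, sigma)               # one error magnitude per sequence
        walk = 0.0
        for _ in range(length):
            step = common if correlated else random.gauss(0.0, sigma)
            walk += random.choice((-1.0, 1.0)) * step   # gate randomization of direction
        out.append(walk ** 2)
    return out

markov = sequence_infidelities(length=100, sigma=1e-3, correlated=False)
corr = sequence_infidelities(length=100, sigma=1e-3, correlated=True)
for name, data in (("Markovian", markov), ("correlated", corr)):
    print(name, "mean=%.2e  variance=%.2e" % (statistics.mean(data), statistics.variance(data)))
```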

  13. The implementation of benchmarking process in marketing education services by Ukrainian universities

    Directory of Open Access Journals (Sweden)

    G.V. Okhrimenko

    2016-03-01

    Full Text Available The aim of the article. The main task of this research is to consider the theoretical and practical aspects of benchmarking at universities. The researcher first identified the essence of benchmarking: it involves comparing the characteristics of a college or university with those of the leading competitors in the industry and copying proven designs. Benchmarking tries to eliminate the fundamental problem of comparison – the impossibility of being better than the one from whom the solution is borrowed. Benchmarking therefore involves self-evaluation, including the systematic collection of data and information, with the view to making relevant comparisons of the strengths and weaknesses of performance aspects. Benchmarking identifies gaps in performance, seeks new approaches for improvement, monitors progress, reviews benefits and assures the adoption of good practices. The results of the analysis. There are five types of benchmarking: internal, competitive, functional, procedural and general. Benchmarking is treated as a systematically applied process with specific stages: (1) identification of the study object; (2) identification of businesses for comparison; (3) selection of data collection methods; (4) determination of variations in terms of efficiency and of the levels of future results; (5) communication of the benchmarking results; (6) development of an implementation plan, initiation of the implementation, and monitoring of the implementation; (7) definition of new benchmarks. The researcher gave the results of the practical use of the benchmarking algorithm at universities. In particular, monitoring and SWOT analysis identified the competitive practices used at Ukrainian universities. The main criteria for determining the benchmarking potential of universities were: (1) the presence of new teaching methods at universities; (2) the involvement of foreign lecturers and partners from other universities for cooperation; (3) promotion of education services to target groups; (4) violation of

  14. Benchmarking – A tool for judgment or improvement?

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2010-01-01

    these issues, and describes how effects are closely connected to the perception of benchmarking, the intended users of the system and the application of the benchmarking results. The fundamental basis of this paper is taken from the development of benchmarking in the Danish construction sector. Two distinct...... perceptions of benchmarking will be presented; public benchmarking and best practice benchmarking. These two types of benchmarking are used to characterize and discuss the Danish benchmarking system and to enhance which effects, possibilities and challenges that follow in the wake of using this kind...... of benchmarking. In conclusion it is argued that clients and the Danish government are the intended users of the benchmarking system. The benchmarking results are primarily used by the government for monitoring and regulation of the construction sector and by clients for contractor selection. The dominating use...

  15. Benchmarking Ionizing Space Environment Models

    Science.gov (United States)

    Bourdarie, S.; Inguimbert, C.; Standarovski, D.; Vaillé, J.-R.; Sicard-Piet, A.; Falguere, D.; Ecoffet, R.; Poivey, C.; Lorfèvre, E.

    2017-08-01

    In-flight feedback data, such as displacement damage doses, ionizing doses, and cumulated single event upsets (SEU), are collected on board various space vehicles and compared to predictions performed with: 1) proton measurements from spectrometers on board the same spacecraft, where available, and 2) proton spectra predicted by the legacy AP8min model and by the AP9 and Onera Proton Altitude Low models. When an accurate representation of the 3-D spacecraft shielding as well as appropriate ground calibrations are considered in the calculations, such comparisons provide powerful metrics to investigate engineering model accuracy. To describe >30 MeV trapped proton fluxes, the AP8min model is found to provide closer predictions to observations than AP9 V1.30.001 (mean and perturbed mean).

  16. Benchmarking ENDF/B-VII.0

    Science.gov (United States)

    van der Marck, Steven C.

    2006-12-01

    The new major release VII.0 of the ENDF/B nuclear data library has been tested extensively using benchmark calculations. These were based upon MCNP-4C3 continuous-energy Monte Carlo neutronics simulations, together with nuclear data processed using the code NJOY. Three types of benchmarks were used, viz., criticality safety benchmarks, (fusion) shielding benchmarks, and reference systems for which the effective delayed neutron fraction is reported. For criticality safety, more than 700 benchmarks from the International Handbook of Criticality Safety Benchmark Experiments were used. Benchmarks from all categories were used, ranging from low-enriched uranium, compound fuel, thermal spectrum ones (LEU-COMP-THERM), to mixed uranium-plutonium, metallic fuel, fast spectrum ones (MIX-MET-FAST). For fusion shielding many benchmarks were based on IAEA specifications for the Oktavian experiments (for Al, Co, Cr, Cu, LiF, Mn, Mo, Si, Ti, W, Zr), Fusion Neutronics Source in Japan (for Be, C, N, O, Fe, Pb), and Pulsed Sphere experiments at Lawrence Livermore National Laboratory (for 6Li, 7Li, Be, C, N, O, Mg, Al, Ti, Fe, Pb, D2O, H2O, concrete, polyethylene and teflon). For testing delayed neutron data more than thirty measurements in widely varying systems were used. Among these were measurements in the Tank Critical Assembly (TCA in Japan) and IPEN/MB-01 (Brazil), both with a thermal spectrum, and two cores in Masurca (France) and three cores in the Fast Critical Assembly (FCA, Japan), all with fast spectra. In criticality safety, many benchmarks were chosen from the category with a thermal spectrum, low-enriched uranium, compound fuel (LEU-COMP-THERM), because this is typical of most current-day reactors, and because these benchmarks were previously underpredicted by as much as 0.5% by most nuclear data libraries (such as ENDF/B-VI.8, JEFF-3.0). The calculated results presented here show that this underprediction is no longer there for ENDF/B-VII.0. The average over 257

  17. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  18. Medicare Contracting - Redacted Benchmark Metric Reports

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services has compiled aggregate national benchmark cost and workload metrics using data submitted to CMS by the AB MACs and the...

  19. XWeB: The XML Warehouse Benchmark

    Science.gov (United States)

    Mahboubi, Hadj; Darmont, Jérôme

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associate XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.

  20. XWeB: the XML Warehouse Benchmark

    CERN Document Server

    Mahboubi, Hadj

    2011-01-01

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associate XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.

  1. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    provision to the chief physician of the respective department. Professional performance is publicly disclosed due to regulatory requirements. At the same time, chief physicians typically receive bureaucratic benchmarking information from the administration. We find that more frequent bureaucratic...

  2. Benchmarking of PR Function in Serbian Companies

    National Research Council Canada - National Science Library

    Nikolić, Milan; Sajfert, Zvonko; Vukonjanski, Jelena

    2009-01-01

    The purpose of this paper is to present methodologies for carrying out benchmarking of the PR function in Serbian companies and to test the practical application of the research results and proposed...

  3. A framework of benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-02-01

    Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.
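
    As an illustration of component (3) of such a framework (and not code from the paper), the sketch below combines data-model mismatches into a single score by normalising each variable's RMSE by the observed variability, converting it to a 0-1 skill value, and taking a weighted mean; the transform, weights, and toy series are hypothetical choices.

```python
# Illustrative scoring sketch: normalise each variable's RMSE by the variability
# of the observations, convert to a 0-1 skill value, and take a weighted mean
# across variables. Weights and the scoring transform are hypothetical.
import math

def rmse(model, obs):
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def stddev(xs):
    mean = sum(xs) / len(xs)
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))

def skill(model, obs):
    """1 for a perfect match, approaching 0 as the error grows beyond the
    observed variability (exp(-normalised RMSE) as a simple transform)."""
    return math.exp(-rmse(model, obs) / stddev(obs))

def combined_score(results, weights):
    """results: {variable: (model_series, obs_series)}; weights sum to 1."""
    return sum(weights[v] * skill(m, o) for v, (m, o) in results.items())

results = {
    "gpp": ([2.1, 2.4, 2.0, 1.8], [2.0, 2.6, 1.9, 1.7]),   # toy series
    "lai": ([3.0, 3.5, 4.1, 3.9], [2.8, 3.6, 4.4, 4.0]),
}
print(combined_score(results, weights={"gpp": 0.5, "lai": 0.5}))
```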

  4. A framework of benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-02-01

    Full Text Available Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.

  5. A framework for benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J. T.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J. B.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-10-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models

  6. Benchmarking Attosecond Physics with Atomic Hydrogen

    Science.gov (United States)

    2015-05-25

    Final report, dates covered 12 Mar 12 – 11 Mar 15. Title: Benchmarking attosecond physics with atomic hydrogen. Contract number: FA2386-12-1-4025. Dated May 25, 2015. PI information: David Kielpinski, dave.kielpinski@gmail.com, Griffith University Centre

  7. Aerodynamic Benchmarking of the Deepwind Design

    DEFF Research Database (Denmark)

    Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge;

    2015-01-01

    The aerodynamic benchmarking for the DeepWind rotor is conducted comparing different rotor geometries and solutions and keeping the comparison as fair as possible. The objective for the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize...... NACA airfoil family. (C) 2015 Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license...

  8. Benchmarking Danish Vocational Education and Training Programmes

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Wittrup, Jesper

    This study paper discusses methods whereby Danish vocational education and training colleges can be benchmarked, and presents results from a number of models. It is conceptually complicated to benchmark vocational colleges, as the various colleges in Denmark offer a wide range of course programmes...... attempt to summarise the various effects that the colleges have in two relevant figures, namely retention rates of students and employment rates among students who have completed training programmes....

  9. Implementation of NAS Parallel Benchmarks in Java

    Science.gov (United States)

    Frumkin, Michael; Schultz, Matthew; Jin, Hao-Qiang; Yan, Jerry

    2000-01-01

    A number of features make Java an attractive but a debatable choice for High Performance Computing (HPC). In order to gauge the applicability of Java to the Computational Fluid Dynamics (CFD) we have implemented NAS Parallel Benchmarks in Java. The performance and scalability of the benchmarks point out the areas where improvement in Java compiler technology and in Java thread implementation would move Java closer to Fortran in the competition for CFD applications.

  10. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Full Text Available Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties

  11. The MCNP6 Analytic Criticality Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Codes Group

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.

  12. Simple Benchmark Specifications for Space Radiation Protection

    Science.gov (United States)

    Singleterry, Robert C. Jr.; Aghara, Sukesh K.

    2013-01-01

    This report defines space radiation benchmark specifications. This specification starts with simple, monoenergetic, mono-directional particles on slabs and progresses to human models in spacecraft. This report specifies the models and sources, and what the team performing the benchmark needs to produce in a report. Also included are brief descriptions of how OLTARIS, the NASA Langley website for space radiation analysis, performs its analysis.

  13. Benchmarking for Cost Improvement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The US Department of Energy's (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: Pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.

  14. Benchmarking infrastructure for mutation text mining

    Science.gov (United States)

    2014-01-01

    Background Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. Results We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. Conclusion We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption. PMID:24568600
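
    The stand-alone sketch below illustrates the kind of performance metric such an infrastructure computes (in the actual system this is done with SPARQL over RDF annotations, which is not reproduced here): precision, recall, and F1 of predicted mutation mentions against a gold standard. The document and mutation identifiers are made-up examples.

```python
# Simple metric sketch for benchmarking mutation text mining output against a
# manually curated gold standard; annotations are modelled here as plain tuples.
def precision_recall_f1(predicted, gold):
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical mutation mentions grounded to (document, protein, substitution)
gold = {("doc1", "P38398", "C61G"), ("doc1", "P38398", "M1775R"), ("doc2", "P04637", "R175H")}
pred = {("doc1", "P38398", "C61G"), ("doc2", "P04637", "R175H"), ("doc2", "P04637", "R248Q")}
print("P=%.2f  R=%.2f  F1=%.2f" % precision_recall_f1(pred, gold))
```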

  15. SUSTAINABLE SUCCESS IN HIGHER EDUCATION BY SHARING THE BEST PRACTICES AS A RESULT OF BENCHMARKING PROCESS

    Directory of Open Access Journals (Sweden)

    Anca Gabriela Ilie

    2011-11-01

    Full Text Available The paper proposes to review the main benchmarking criteria, based on the quality indicators used by higher education institutions, and to present new reference indicators resulting from inter-university cooperation. Once these indicators are defined, a national database could be created and, through benchmarking methods, the level of national performance of the educational system could be established. Going further and generalizing the process, the national educational system can be compared with the European one using the benchmarking approach. The final purpose is to establish a group of universities that come together to explore opportunities for sharing benchmarks and best practices in areas of common interest, in order to create a "quality culture" for the Romanian higher education system.

  16. Influence of photon beam energy on the dose enhancement factor caused by gold and silver nanoparticles: An experimental approach

    Energy Technology Data Exchange (ETDEWEB)

    Guidelli, Eder José, E-mail: ederguidelli@pg.ffclrp.usp.br; Baffa, Oswaldo [Departamento de Física, Faculdade de Filosofia, Ciências e Letras de Ribeirão Preto, Universidade de São Paulo, Av. Bandeirantes, 3900, 14040-901 Ribeirão Preto, SP (Brazil)

    2014-03-15

    Purpose: Noble metal nanoparticles have found several medical applications in the areas of radiation detection, x-ray contrast agents, and cancer radiation therapy. Based on computational methods, many papers have reported the nanoparticle effect on the dose deposition in the surrounding medium. Here the authors report experimental results on how silver and gold nanoparticles affect the dose deposition in alanine dosimeters containing several concentrations of silver and gold nanoparticles, for five different beam energies, using electron spin resonance spectroscopy (ESR). Methods: The authors produced alanine dosimeters containing several mass percentages of silver and gold nanoparticles. Nanoparticle sizes were measured by dynamic light scattering and by transmission electron microscopy. The authors determined the dose enhancement factor (DEF) theoretically, using a widely accepted method, and experimentally, using ESR spectroscopy. Results: The DEF is governed by nanoparticle concentration, size, and position in the alanine matrix. Samples containing gold nanoparticles afford a DEF higher than 1.0, because gold nanoparticle size is homogeneous for all gold concentrations utilized. For samples containing silver particles, the silver mass percentage governs the nanoparticle size, which, in turn, modifies nanoparticle position in the alanine dosimeters. In this sense, DEF decreases for dosimeters containing large and segregated particles. The influence of nanoparticle size-position is more noticeable for dosimeters irradiated with higher beam energies, and dosimeters containing large and segregated particles become less sensitive than pure alanine (DEF < 1). Conclusions: ESR dosimetry gives the DEF in a medium containing metal nanoparticles, although particle concentration, size, and position are closely related in the system. Because this is also the case in many real systems of materials containing inorganic nanoparticles, ESR is a valuable tool for
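
    As a hedged illustration of one common first-order way to estimate a dose enhancement factor from mass-energy absorption coefficients (not necessarily the "widely accepted method" used in the paper), the sketch below weights the coefficients by mass fraction and takes the ratio to the unloaded matrix; the coefficient values are placeholders, not tabulated data.

```python
# First-order DEF estimate from mass-energy absorption coefficients: the mixture's
# coefficient is the mass-fraction-weighted sum, and the DEF is its ratio to that
# of the unloaded matrix. All numerical values below are placeholders.
def dose_enhancement_factor(mu_en_rho_matrix, mu_en_rho_metal, metal_mass_fraction):
    """DEF = (w*(mu_en/rho)_metal + (1-w)*(mu_en/rho)_matrix) / (mu_en/rho)_matrix."""
    w = metal_mass_fraction
    mixture = w * mu_en_rho_metal + (1.0 - w) * mu_en_rho_matrix
    return mixture / mu_en_rho_matrix

# Example: 0.3% gold by mass with illustrative coefficients at a low photon energy,
# where the gold coefficient greatly exceeds that of the organic matrix.
print(dose_enhancement_factor(mu_en_rho_matrix=0.03,   # cm^2/g, placeholder
                              mu_en_rho_metal=5.0,     # cm^2/g, placeholder
                              metal_mass_fraction=0.003))
```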

  17. [A new approach to shielding function calculation: radiation dose estimation for a phantom inside a space station compartment].

    Science.gov (United States)

    Kartashov, D A; Shurshakov, V A

    2012-01-01

    The article presents a new procedure for calculating the shielding functions of irregular objects formed from a set of nonintersecting (adjacent) triangles completely covering the surface of each object. Calculated and experimentally derived distributions of space ionizing radiation doses in the spherical tissue-equivalent phantom (experiment MATRYOSHKA-R) inside the International Space Station were in good agreement at depths within the bulk of the phantom, with allowance for measurement error (-10%). The procedure can be applied in modeling radiation loads on cosmonauts, calculating the effectiveness of secondary protection in spacecraft, and design review of radiation protection for future space exploration missions.
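
    A toy sketch of how a shielding distribution is typically turned into a dose estimate once the shielding functions are known (this is not the article's procedure): average a depth-dose relation over the shield thickness seen in each sampled direction. Both the exponential depth-dose curve and the thickness values are placeholders.

```python
# Toy dose estimate from a shielding distribution: given the shield thickness seen
# from a dose point in many directions (g/cm^2), average a depth-dose curve D(t)
# over those directions. The depth-dose relation below is a placeholder, not a
# real space-radiation dose model.
import math

def dose_behind_shield(thickness_g_cm2, d0=1.0, attenuation_length=20.0):
    """Placeholder depth-dose relation: dose falls off exponentially with areal density."""
    return d0 * math.exp(-thickness_g_cm2 / attenuation_length)

def dose_at_point(shield_thicknesses):
    """Isotropic-field estimate: average the dose over equally weighted directions."""
    return sum(dose_behind_shield(t) for t in shield_thicknesses) / len(shield_thicknesses)

# Hypothetical shielding distribution for one dose point inside a compartment
thicknesses = [5.0, 8.0, 12.0, 20.0, 35.0, 50.0]   # g/cm^2 along sampled directions
print(dose_at_point(thicknesses))
```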

  18. Treatment of advanced head and neck cancer: multiple daily dose fractionated radiation therapy and sequential multimodal treatment approach.

    Science.gov (United States)

    Nissenbaum, M; Browde, S; Bezwoda, W R; de Moor, N G; Derman, D P

    1984-01-01

    Fifty-eight patients with advanced head and neck cancer were entered into a randomised trial comparing chemotherapy (DDP + bleomycin) alone, multiple daily fractionated radiation therapy, and multimodality therapy consisting of chemotherapy plus multiple fractionated radiation therapy. Multimodal therapy gave a significantly higher response rate (69%) than either single-treatment modality. The use of a multiple daily dose fractionation allowed radiation therapy to be completed over 10 treatment days, and the addition of chemotherapy to the radiation treatment did not significantly increase toxicity. Patients receiving multimodal therapy also survived significantly longer (median 50 weeks) than those receiving single-modality therapy (median 24 weeks).

  19. From active shape model to active optical flow model: a shape-based approach to predicting voxel-level dose distributions in spine SBRT.

    Science.gov (United States)

    Liu, Jianfei; Wu, Q Jackie; Kirkpatrick, John P; Yin, Fang-Fang; Yuan, Lulin; Ge, Yaorong

    2015-03-07

    vertebral bodies, and the ‘PTV’ is the involved segment of the vertebral body expanded uniformly by 2 mm but excluding the spinal cord volume expanded by 2 mm (Ref. RTOG 0631). These results suggested that the AOFM-based approach is a promising tool for predicting accurate spinal cord dose in clinical practice. In this work, we demonstrated the feasibility of using AOFM and ASM models derived from previously treated patients to estimate the achievable dose distributions for new patients.

  20. Toxicological benchmark for screening of potential contaminants of concern for effects on aquatic biota on the Oak Ridge Reservation, Oak Ridge, Tennessee; Environmental Restoration Program

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W. II; Futrell, M.A.; Kerchner, G.A.

    1992-09-01

    One of the initial stages in ecological risk assessment of hazardous waste sites is the screening of contaminants to determine which of them are worthy of further consideration. This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented here. The alternative benchmarks are based on different conceptual approaches to estimating concentrations causing significant effects. To the extent that toxicity data are available, this report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate the benchmarks, and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility.

  1. The results of the pantograph-catenary interaction benchmark

    Science.gov (United States)

    Bruni, Stefano; Ambrosio, Jorge; Carnicero, Alberto; Cho, Yong Hyeon; Finner, Lars; Ikeda, Mitsuru; Kwon, Sam Young; Massat, Jean-Pierre; Stichel, Sebastian; Tur, Manuel; Zhang, Weihua

    2015-03-01

    This paper describes the results of a voluntary benchmark initiative concerning the simulation of pantograph-catenary interaction, which was proposed and coordinated by Politecnico di Milano and participated by 10 research institutions established in 9 different countries across Europe and Asia. The aims of the benchmark are to assess the dispersion of results on the same simulation study cases, to demonstrate the accuracy of numerical methodologies and simulation models and to identify the best suited modelling approaches to study pantograph-catenary interaction. One static and three dynamic simulation cases were defined for a non-existing but realistic high-speed pantograph-catenary couple. These cases were run using 10 of the major simulation codes presently in use for the study of pantograph-catenary interaction, and the results are presented and critically discussed here. All input data required to run the study cases are also provided, allowing the use of this benchmark as a term of comparison for other simulation codes.

  2. Cervical intervertebral disc herniation treatment via radiofrequency combined with low-dose collagenase injection into the disc interior using an anterior cervical approach.

    Science.gov (United States)

    Wang, Zhi-Jian; Zhu, Meng-Ye; Liu, Xiao-Jian; Zhang, Xue-Xue; Zhang, Da-Ying; Wei, Jian-Mei

    2016-06-01

    This study aimed to determine the therapeutic effect of radiofrequency combined with low-dose collagenase injected into the disc interior via an anterior cervical approach for cervical intervertebral disc herniation. Forty-three patients (26-62 years old; male/female ratio: 31/12) with cervical intervertebral disc herniation received radiofrequency combined with 60 to 100 U of collagenase, injected via an anterior cervical approach. The degree of nerve function was assessed using the current Japanese Orthopaedic Association (JOA) scoring system at 3 and 12 months postoperation. A visual analogue scale (VAS) was used to evaluate the degree of pain preoperation and 7 days postoperation. The preoperative and 3-month postoperative protrusion areas were measured and compared via magnetic resonance imaging (MRI) and picture archiving and communication systems (PACS). Compared with the preoperative pain scores, the 7-day postoperative pain was significantly reduced (P < 0.01). The excellent and good rates of nerve function amelioration were 93.0% and 90.7% at 3 and 12 months postoperation, respectively, which was not significantly different. Twenty-seven cases exhibited a significantly reduced protrusion area (P < 0.01) at 3 months postoperation. No serious side effects were noted. To our knowledge, this is the first study to demonstrate that the use of radiofrequency combined with low-dose collagenase injection into the disc interior via an anterior cervical approach is effective and safe for the treatment of cervical intervertebral disc herniation.

  3. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  4. Benchmarking von Krankenhausinformationssystemen – eine vergleichende Analyse deutschsprachiger Benchmarkingcluster

    Directory of Open Access Journals (Sweden)

    Jahn, Franziska

    2015-08-01

    Full Text Available Benchmarking is a method of strategic information management used by many hospitals today. In recent years, several benchmarking clusters have been established within the German-speaking countries. They support hospitals in comparing and positioning their information system's and information management's costs, performance and efficiency against other hospitals. In order to differentiate between these benchmarking clusters and to provide decision support in selecting an appropriate benchmarking cluster, a classification scheme is developed. The classification scheme covers both the general conditions and the examined contents of the benchmarking clusters. It is applied to seven benchmarking clusters which have been active in the German-speaking countries in recent years. Currently, performance benchmarking is the most frequent benchmarking type, whereas the observed benchmarking clusters differ in the number of benchmarking partners and their cooperation forms. The benchmarking clusters also deal with different benchmarking subjects. The costs and quality of application systems, physical data-processing systems, organizational structures of information management, and IT service processes are the most frequent benchmarking subjects. There is still potential for further activities within the benchmarking clusters to measure strategic and tactical information management, IT governance and quality of data and data-processing processes. Based on the classification scheme and the comparison of the benchmarking clusters, we derive general recommendations for benchmarking of hospital information systems.

  5. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared performance of software-only to GPU-accelerated. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows

  6. Terminate lung cancer (TLC) study-A mixed-methods population approach to increase lung cancer screening awareness and low-dose computed tomography in Eastern Kentucky.

    Science.gov (United States)

    Cardarelli, Roberto; Reese, David; Roper, Karen L; Cardarelli, Kathryn; Feltner, Frances J; Studts, Jamie L; Knight, Jennifer R; Armstrong, Debra; Weaver, Anthony; Shaffer, Dana

    2017-02-01

    For low dose CT lung cancer screening to be effective in curbing disease mortality, efforts are needed to overcome barriers to awareness and facilitate uptake of the current evidence-based screening guidelines. A sequential mixed-methods approach was employed to design a screening campaign utilizing messages developed from community focus groups, followed by implementation of the outreach campaign intervention in two high-risk Kentucky regions. This study reports on rates of awareness and screening in intervention regions, as compared to a control region.

  7. Benchmark experiment to verify radiation transport calculations for dosimetry in radiation therapy; Benchmark-Experiment zur Verifikation von Strahlungstransportrechnungen fuer die Dosimetrie in der Strahlentherapie

    Energy Technology Data Exchange (ETDEWEB)

    Renner, Franziska [Physikalisch-Technische Bundesanstalt (PTB), Braunschweig (Germany)

    2016-11-01

    Monte Carlo simulations are regarded as the most accurate method of solving complex problems in the field of dosimetry and radiation transport. In (external) radiation therapy they are increasingly used for the calculation of dose distributions during treatment planning. In comparison to other algorithms for the calculation of dose distributions, Monte Carlo methods have the capability of improving the accuracy of dose calculations - especially under complex circumstances (e.g. consideration of inhomogeneities). However, there is a lack of knowledge of how accurate the results of Monte Carlo calculations are on an absolute basis. A practical verification of the calculations can be performed by direct comparison with the results of a benchmark experiment. This work presents such a benchmark experiment and compares its results (with detailed consideration of measurement uncertainty) with the results of Monte Carlo calculations using the well-established Monte Carlo code EGSnrc. The experiment was designed to have parallels to external beam radiation therapy with respect to the type and energy of the radiation, the materials used and the kind of dose measurement. Because the properties of the beam have to be well known in order to compare the results of the experiment and the simulation on an absolute basis, the benchmark experiment was performed using the research electron accelerator of the Physikalisch-Technische Bundesanstalt (PTB), whose beam was accurately characterized in advance. The benchmark experiment and the corresponding Monte Carlo simulations were carried out for two different types of ionization chambers and the results were compared. Considering the uncertainty, which is about 0.7 % for the experimental values and about 1.0 % for the Monte Carlo simulation, the results of the simulation and the experiment coincide.
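
    One standard way to state such an experiment-versus-simulation comparison quantitatively is a normalised difference using the combined standard uncertainty; the sketch below shows that calculation with placeholder dose values and the roughly 0.7 % and 1.0 % relative uncertainties quoted above. It is offered as an illustration, not as the metric actually used in the work.

```python
# Hedged sketch: normalised difference between simulated and measured values using
# the combined standard uncertainty. The dose values are placeholders.
def normalised_difference(sim, u_sim_rel, exp, u_exp_rel):
    """(sim - exp) / sqrt(u_sim^2 + u_exp^2); magnitudes up to ~2 indicate agreement."""
    u_sim = sim * u_sim_rel
    u_exp = exp * u_exp_rel
    return (sim - exp) / (u_sim ** 2 + u_exp ** 2) ** 0.5

# Placeholder dose values (arbitrary units) with ~1.0 % and ~0.7 % uncertainties
print(normalised_difference(sim=1.012, u_sim_rel=0.010, exp=1.005, u_exp_rel=0.007))
```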

  8. Full sphere hydrodynamic and dynamo benchmarks

    KAUST Repository

    Marti, P.

    2014-01-26

    Convection in planetary cores can generate fluid flow and magnetic fields, and a number of sophisticated codes exist to simulate the dynamic behaviour of such systems. We report on the first community activity to compare numerical results of computer codes designed to calculate fluid flow within a whole sphere. The flows are incompressible and rapidly rotating and the forcing of the flow is either due to thermal convection or due to moving boundaries. All problems defined have solutions that allow easy comparison, since they are either steady, slowly drifting or perfectly periodic. The first two benchmarks are defined based on uniform internal heating within the sphere under the Boussinesq approximation with boundary conditions that are uniform in temperature and stress-free for the flow. Benchmark 1 is purely hydrodynamic, and has a drifting solution. Benchmark 2 is a magnetohydrodynamic benchmark that can generate oscillatory, purely periodic, flows and magnetic fields. In contrast, Benchmark 3 is a hydrodynamic rotating bubble benchmark using no-slip boundary conditions that has a stationary solution. Results from a variety of types of code are reported, including codes that are fully spectral (based on spherical harmonic expansions in angular coordinates and polynomial expansions in radius), mixed spectral and finite difference, finite volume, finite element and also a mixed Fourier-finite element code. There is good agreement between codes. It is found that in Benchmarks 1 and 2, the approximation of a whole sphere problem by a domain that is a spherical shell (a sphere possessing an inner core) does not represent an adequate approximation to the system, since the results differ from whole sphere results. © The Authors 2014. Published by Oxford University Press on behalf of The Royal Astronomical Society.

  9. Fault Diagnosis of an Advanced Wind Turbine Benchmark using Interval-based ARRs and Observers

    DEFF Research Database (Denmark)

    Sardi, Hector Eloy Sanchez; Escobet, Teressa; Puig, Vicenc;

    2015-01-01

    This paper proposes a model-based fault diagnosis (FD) approach for wind turbines and its application to a realistic wind turbine FD benchmark. The proposed FD approach combines the use of analytical redundancy relations (ARRs) and interval observers. Interval observers consider an unknown...
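
    The abstract is truncated at this point. As a generic illustration of the interval-observer idea it refers to (not the paper's actual algorithm), a residual test under unknown-but-bounded uncertainty can be sketched as follows; all signal names and numbers are hypothetical.

        def interval_residual_test(measured, predicted_low, predicted_high):
            """Interval-observer style consistency test.

            A fault is indicated when the measured output falls outside the
            interval [predicted_low, predicted_high] that the observer predicts
            under bounded (unknown-but-bounded) disturbances and noise.
            """
            return not (predicted_low <= measured <= predicted_high)

        # Toy usage: a generator-speed measurement vs. an interval prediction.
        print(interval_residual_test(measured=162.3,
                                     predicted_low=158.0,
                                     predicted_high=161.5))  # True -> fault flag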

  10. Organizational and economic aspects of benchmarking innovative products at the automobile industry enterprises

    Directory of Open Access Journals (Sweden)

    L.M. Taraniuk

    2016-06-01

    Full Text Available The aim of the article. The aim of the article is to determine the nature and characteristics of the use of benchmarking in the activity of domestic automobile-industry enterprises under current economic conditions. The results of the analysis. The article defines the concept of benchmarking, examines the stages of benchmarking, and assesses the efficiency of benchmarking in the work of automakers. The historical aspects of the emergence of the benchmarking method in world economics are considered, and the economic aspects of benchmarking in the work of automobile-industry enterprises are determined. The stages of benchmarking of innovative products are analysed against the modern development of the productive forces and the impact of market factors on the economic activities of companies, including automobile-industry enterprises. Attention is focused on the specifics of implementing benchmarking at automobile-industry companies. Statistics on the number of electric-vehicle owners worldwide are considered, and the authors study the electric-vehicle market in Ukraine. The need to use benchmarking to improve the competitiveness of the national automobile industry, especially CJSC “Zaporizhia Automobile Building Plant”, is also considered, and the authors suggest reasonable steps for its improvement. The authors improve the methodical approach to selecting vehicles with the best technical parameters based on benchmarking; unlike existing approaches, it rests on the calculation of an integral factor of the technical specifications of the vehicles in order to identify the more competitive products among the automobile-industry companies evaluated. The main indicators of the national production of electric vehicles are shown. Attention is paid to important development directions for CJSC “Zaporizhia Automobile Building Plant”, where the authors establish the aspects that require attention in the management of the
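
    The article's integral-factor formula is not reproduced in this record. One common way such an indicator is constructed is as a weighted sum of technical parameters normalised against the best value in the comparison set; the sketch below follows that assumption, and all parameter names, weights, and values are hypothetical.

        def integral_factor(specs, weights, benchmarks):
            """Weighted integral indicator of a vehicle's technical specifications.

            specs      -- dict of parameter name -> value for one vehicle
            weights    -- dict of parameter name -> importance weight (summing to 1)
            benchmarks -- dict of parameter name -> best value among compared vehicles
            Each parameter is normalised against the best value, assuming
            "larger is better" for every parameter.
            """
            return sum(weights[p] * specs[p] / benchmarks[p] for p in weights)

        vehicles = {
            "model_A": {"range_km": 400, "power_kW": 150},
            "model_B": {"range_km": 320, "power_kW": 180},
        }
        weights = {"range_km": 0.6, "power_kW": 0.4}
        best = {p: max(v[p] for v in vehicles.values()) for p in weights}
        for name, specs in vehicles.items():
            print(name, round(integral_factor(specs, weights, best), 3))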

  11. Exposure to low-dose bisphenol A impairs meiosis in the rat seminiferous tubule culture model: a physiotoxicogenomic approach.

    Directory of Open Access Journals (Sweden)

    Sazan Ali

    Full Text Available BACKGROUND: Bisphenol A (BPA) is one of the most widespread chemicals in the world and is suspected of being responsible for male reproductive impairments. Nevertheless, its molecular mode of action on spermatogenesis is unclear. This work combines physiology and toxicogenomics to identify mechanisms by which BPA affects the timing of meiosis and induces germ-cell abnormalities. METHODS: We used a rat seminiferous tubule culture model mimicking the in vivo adult rat situation. BPA (1 nM and 10 nM) was added to the culture medium. Transcriptomic and meiotic studies were performed on the same cultures at the same exposure times (days 8, 14, and 21). Transcriptomics was performed using pangenomic rat microarrays. Immunocytochemistry was conducted with an anti-SCP3 antibody. RESULTS: The gene expression analysis showed that the total number of differentially expressed transcripts was time- but not dose-dependent. We focused on 120 genes directly involved in the first meiotic prophase, sustaining immunocytochemistry. Sixty-two genes were directly involved in pairing and recombination, some of them with high fold changes. Immunocytochemistry indicated alteration of meiotic progression in the presence of BPA, with increased leptotene and decreased diplotene spermatocyte percentages and partial meiotic arrest at the pachytene checkpoint. Morphological abnormalities were observed at all stages of the meiotic prophase. The prevalent abnormalities were total asynapsis and apoptosis. Transcriptomic analysis sustained immunocytological observations. CONCLUSION: We showed that low doses of BPA alter the expression of numerous genes, especially those involved in the reproductive system, and severely impair crucial events of the meiotic prophase leading to partial arrest of meiosis in rat seminiferous tubule cultures.

  12. Criteria of benchmark selection for efficient flexible multibody system formalisms

    Directory of Open Access Journals (Sweden)

    Valášek M.

    2007-10-01

    Full Text Available The paper deals with the selection process of benchmarks for testing and comparing efficient flexible multibody formalisms. The existing benchmarks are briefly summarized. The purposes for benchmark selection are investigated. The result of this analysis is the formulation of the criteria of benchmark selection for flexible multibody formalisms. Based on them the initial set of suitable benchmarks is described. Besides that the evaluation measures are revised and extended.

  13. Test Nationally, Benchmark Locally: Using Local DIBELS Benchmarks to Predict Performance on the Pssa

    Science.gov (United States)

    Ferchalk, Matthew R.

    2013-01-01

    The Dynamic Indicators of Basic Early Literacy Skills (DIBELS) benchmarks are frequently used to make important decisions regarding student performance. More information, however, is needed to understand if the nationally-derived benchmarks created by the DIBELS system provide the most accurate criterion for evaluating reading proficiency. The…

  14. Benchmarking local healthcare-associated infections: available benchmarks and interpretation challenges.

    Science.gov (United States)

    El-Saed, Aiman; Balkhy, Hanan H; Weber, David J

    2013-10-01

    Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI), which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude) HAI surveillance metrics without accounting for or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restrictions. The characteristics of recognized benchmarks worldwide, including their advantages and limitations, are described. The choice of the right benchmark for the data from the Gulf Cooperation Council (GCC) states is challenging. The chosen benchmark should have similar data collection and presentation methods. Additionally, differences in surveillance environments, including regulations, should be taken into account when selecting such a benchmark. The GCC center for infection control took some steps to unify HAI surveillance systems in the region. GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable health care workers and researchers to obtain more accurate and realistic comparisons.
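
    One of the risk-adjustment methods the abstract names, indirect standardization, is commonly reported as a standardized infection ratio: observed infections divided by the number expected if benchmark (e.g. national) rates applied to the local exposure. A minimal sketch with hypothetical numbers:

        def standardized_infection_ratio(observed, expected_per_stratum):
            """Indirect standardization for HAI surveillance.

            observed             -- number of infections observed locally
            expected_per_stratum -- list of (local_device_days,
                                    benchmark_rate_per_1000_device_days) pairs
            Returns observed / expected; values above 1 indicate more
            infections than the benchmark rates would predict.
            """
            expected = sum(days * rate / 1000.0 for days, rate in expected_per_stratum)
            return observed / expected if expected > 0 else float("nan")

        # Hypothetical ICU strata: (device-days, benchmark rate per 1000 device-days)
        strata = [(1200, 2.5), (800, 4.0)]
        print(round(standardized_infection_ratio(observed=7, expected_per_stratum=strata), 2))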

  15. The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example

    Science.gov (United States)

    Steyn, H. J.

    2015-01-01

    Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…

  16. Features and technology of enterprise internal benchmarking

    Directory of Open Access Journals (Sweden)

    A.V. Dubodelova

    2013-06-01

    Full Text Available The aim of the article. The aim of the article is to generalize characteristics, objectives, advantages of internal benchmarking. The stages sequence of internal benchmarking technology is formed. It is focused on continuous improvement of process of the enterprise by implementing existing best practices.The results of the analysis. Business activity of domestic enterprises in crisis business environment has to focus on the best success factors of their structural units by using standard research assessment of their performance and their innovative experience in practice. Modern method of those needs satisfying is internal benchmarking. According to Bain & Co internal benchmarking is one the three most common methods of business management.The features and benefits of benchmarking are defined in the article. The sequence and methodology of implementation of individual stages of benchmarking technology projects are formulated.The authors define benchmarking as a strategic orientation on the best achievement by comparing performance and working methods with the standard. It covers the processes of researching, organization of production and distribution, management and marketing methods to reference objects to identify innovative practices and its implementation in a particular business.Benchmarking development at domestic enterprises requires analysis of theoretical bases and practical experience. Choice best of experience helps to develop recommendations for their application in practice.Also it is essential to classificate species, identify characteristics, study appropriate areas of use and development methodology of implementation. The structure of internal benchmarking objectives includes: promoting research and establishment of minimum acceptable levels of efficiency processes and activities which are available at the enterprise; identification of current problems and areas that need improvement without involvement of foreign experience

  17. Benchmarks and statistics of entanglement dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Tiersch, Markus

    2009-09-04

    In the present thesis we investigate how the quantum entanglement of multicomponent systems evolves under realistic conditions. More specifically, we focus on open quantum systems coupled to the (uncontrolled) degrees of freedom of an environment. We identify key quantities that describe the entanglement dynamics, and provide efficient tools for its calculation. For quantum systems of high dimension, entanglement dynamics can be characterized with high precision. In the first part of this work, we derive evolution equations for entanglement. These formulas determine the entanglement after a given time in terms of a product of two distinct quantities: the initial amount of entanglement and a factor that merely contains the parameters that characterize the dynamics. The latter is given by the entanglement evolution of an initially maximally entangled state. A maximally entangled state thus benchmarks the dynamics, and hence allows for the immediate calculation or - under more general conditions - estimation of the change in entanglement. Thereafter, a statistical analysis supports that the derived (in-)equalities describe the entanglement dynamics of the majority of weakly mixed and thus experimentally highly relevant states with high precision. The second part of this work approaches entanglement dynamics from a topological perspective. This allows for a quantitative description with a minimum amount of assumptions about Hilbert space (sub-)structure and environment coupling. In particular, we investigate the limit of increasing system size and density of states, i.e. the macroscopic limit. In this limit, a universal behaviour of entanglement emerges following a "reference trajectory", similar to the central role of the entanglement dynamics of a maximally entangled state found in the first part of the present work. (orig.)
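
    The abstract does not reproduce the equation; in the simplest published form of such an evolution equation (two qubits, one subsystem sent through a channel $, with the concurrence C as the entanglement measure), the factorization it describes can be sketched in LaTeX notation as

        C\bigl[(\mathbb{1}\otimes\$)\,|\chi\rangle\langle\chi|\bigr]
          = C\bigl[(\mathbb{1}\otimes\$)\,|\Phi^{+}\rangle\langle\Phi^{+}|\bigr]
            \cdot C\bigl[\,|\chi\rangle\langle\chi|\,\bigr],

    where |\chi\rangle is the initial pure state and |\Phi^{+}\rangle a maximally entangled state; the first factor on the right-hand side is exactly the "benchmark" provided by the maximally entangled state, and the second is the initial amount of entanglement.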

  18. Toxicological benchmarks for wildlife: 1994 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Opresko, D.M.; Sample, B.E.; Suter, G.W. II

    1994-09-01

    The process by which ecological risks of environmental contaminants are evaluated is two-tiered. The first tier is a screening assessment where concentrations of contaminants in the environment are compared to toxicological benchmarks, which represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) that are presumed to be nonhazardous to the surrounding biota. The second tier is a baseline ecological risk assessment where toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. The report presents toxicological benchmarks for assessment of effects of 76 chemicals on 8 representative mammalian wildlife species and 31 chemicals on 9 avian wildlife species. The chemicals are some of those that occur at United States Department of Energy waste sites; the wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. Further descriptions of the chosen wildlife species and chemicals are provided in the report. The benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species. These benchmarks only consider contaminant exposure through oral ingestion of contaminated media; exposure through inhalation or direct dermal contact is not considered in this report.
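
    In screening assessments of this kind, the comparison of an estimated exposure with a toxicological benchmark is often expressed as a hazard quotient. A minimal sketch follows, with illustrative numbers rather than values from the report.

        def hazard_quotient(exposure_dose, benchmark_dose):
            """Screening-level comparison for one contaminant and one receptor.

            exposure_dose  -- estimated daily dose to the wildlife receptor
            benchmark_dose -- toxicological benchmark in the same units
                              (e.g. mg contaminant per kg body weight per day)
            A quotient above 1 flags the contaminant for further assessment.
            """
            return exposure_dose / benchmark_dose

        # Hypothetical screening value: estimated intake 0.8 vs. benchmark 0.5 mg/kg-day
        print(hazard_quotient(0.8, 0.5))  # 1.6 -> retain as a contaminant of concern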

  19. Exchange Rate Exposure Management: The Benchmarking Process of Industrial Companies

    DEFF Research Database (Denmark)

    Aabo, Tom

    Based on a cross-case study of Danish industrial companies the paper analyzes the benchmarking of the optimal hedging strategy. A stock market approach is pursued but a serious question mark is put on the validity of the obtained information seen from a corporate value-adding point of view... The conducted interviews show that empirical reasons behind actual hedging strategies vary considerably - some in accordance with mainstream finance theory, some resting on asymmetric information. The diversity of attitudes seems to be partly a result of different competitive environments, partly a result...

  20. Experiment vs simulation RT WFNDEC 2014 benchmark: CIVA results

    Science.gov (United States)

    Tisseur, D.; Costin, M.; Rattoni, B.; Vienne, C.; Vabre, A.; Cattiaux, G.; Sollier, T.

    2015-03-01

    The French Alternative Energies and Atomic Energy Commission (CEA) has for many years developed the CIVA software dedicated to the simulation of NDE techniques such as Radiographic Testing (RT). RT modelling is achieved in CIVA using a combination of a deterministic approach based on ray tracing for transmission beam simulation and a Monte Carlo model for the scattered beam computation. Furthermore, CIVA includes various detector models, in particular common X-ray films and photostimulable phosphor plates. This communication presents the results obtained with the configurations proposed in the World Federation of NDE Centers (WFNDEC) 2014 RT modelling benchmark with the RT models implemented in the CIVA software.
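
    CIVA's transmission model is not reproduced in this record; as a generic illustration of the deterministic part (attenuation of the direct beam along a traced ray through the traversed materials), a Beer-Lambert sketch is given below with hypothetical attenuation coefficients and path lengths.

        import math

        def transmitted_intensity(i0, segments):
            """Deterministic transmission along a single ray (Beer-Lambert law).

            i0       -- incident beam intensity
            segments -- list of (linear_attenuation_coefficient_per_mm, path_length_mm)
                        pairs for the materials the ray crosses
            """
            total_attenuation = sum(mu * length for mu, length in segments)
            return i0 * math.exp(-total_attenuation)

        # Hypothetical ray crossing 20 mm of steel and 5 mm of a less attenuating
        # flaw region (coefficients are illustrative, not material data).
        print(transmitted_intensity(1.0, [(0.06, 20.0), (0.02, 5.0)]))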