WorldWideScience

Sample records for benchmark criticality calculations

  1. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    Full Text Available The calculation benchmark problem Very High Temperature Reactor Critical (VHTRC), a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest version of the Nuclear Data Library based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which helps improve the accuracy of neutron transport calculations and may aid in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results, which in turn depend on the accuracy of the nuclear data libraries. Thus, evaluating the applicability of the libraries to VHTR modelling is an important subject. We compared the numerical results with experimental measurements using two versions of available nuclear data (ENDF-B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The calculated keff values show good agreement with each other and with the experimental data within the 1 σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we propose appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new Nuclear Data Libraries.
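
    The comparison described above reduces to checking calculated k-eff values against the experimental value within its quoted uncertainty. Below is a minimal Python sketch of that check; all k-eff values and the 1-sigma uncertainty are invented placeholders, not results from the paper.

    ```python
    # Hedged sketch: library-vs-experiment k_eff comparison. All numbers are
    # illustrative placeholders, not VHTRC results.
    k_exp, sigma_exp = 1.0000, 0.0045      # experimental k_eff and 1-sigma uncertainty
    k_calc = {"ENDF/B-VII.1": 1.0012, "JEFF-3.2": 0.9981}  # hypothetical MCB outputs

    for library, k in k_calc.items():
        diff_pcm = (k - k_exp) * 1e5           # deviation in pcm (1 pcm = 1e-5 dk)
        within = abs(k - k_exp) <= sigma_exp   # inside the experimental 1-sigma band?
        print(f"{library}: C-E = {diff_pcm:+.0f} pcm, within 1 sigma: {within}")
    ```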

  2. OECD/NEA burnup credit calculational criticality benchmark Phase I-B results

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.; Parks, C.V. [Oak Ridge National Lab., TN (United States); Brady, M.C. [Sandia National Labs., Las Vegas, NV (United States)

    1996-06-01

    In most countries, criticality analysis of LWR fuel stored in racks and casks has assumed that the fuel is fresh with the maximum allowable initial enrichment. This assumption has led to the design of widely spaced and/or highly poisoned storage and transport arrays. If credit is assumed for fuel burnup, initial enrichment limitations can be raised in existing systems, and more compact and economical arrays can be designed. Such reliance on the reduced reactivity of spent fuel for criticality control is referred to as burnup credit. The Burnup Credit Working Group, formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development, has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods agree to within 10% in their ability to estimate the spent fuel concentrations of most actinides. All methods agree to within 11% of the average for all fission products studied. Most deviations are less than 10%, and many are less than 5%. The exceptions are {sup 149}Sm, {sup 151}Sm, and {sup 155}Gd.
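
    The agreement figures quoted above come from measuring each participant's predicted nuclide concentration against the participant average. A minimal sketch of that arithmetic, assuming invented organization names and values:

    ```python
    import statistics

    # Relative deviation of each submission from the participant average for one
    # nuclide; organizations and concentrations are invented for illustration.
    submissions = {"org_A": 0.512, "org_B": 0.498, "org_C": 0.531}  # e.g. g per kgU

    mean = statistics.mean(submissions.values())
    for org, value in submissions.items():
        rel_dev = 100.0 * (value - mean) / mean
        print(f"{org}: {rel_dev:+.1f}% from the participant average")
    ```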

  3. OECD/NEA burnup credit criticality benchmarks phase IIIA: Criticality calculations of BWR spent fuel assemblies in storage and transport

    Energy Technology Data Exchange (ETDEWEB)

    Okuno, Hiroshi; Naito, Yoshitaka [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ando, Yoshihira [Toshiba Corp., Kawasaki, Kanagawa (Japan)

    2000-09-01

    The report describes the final results of the Phase IIIA benchmarks conducted by the Burnup Credit Criticality Calculation Working Group under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD/NEA). The benchmarks are intended to confirm the predictive capability of current computer code and data library combinations for the neutron multiplication factor (k{sub eff}) of a model of a layer of an array of irradiated BWR fuel assemblies. In total, 22 benchmark problems are proposed for calculations of k{sub eff}. The effects of the following parameters are investigated: cooling time, inclusion/exclusion of FP nuclides, axial burnup profile, and inclusion of an axial void fraction profile or constant void fractions during burnup. Axial profiles of fractional fission rates are further requested for five of the 22 problems. Twenty-one sets of results are presented, contributed by 17 institutes from 9 countries. The relative dispersion of the k{sub eff} values calculated by the participants from the mean value is almost within the band of {+-}1%{delta}k/k. The deviations from the averaged calculated fission rate profiles are found to be within {+-}5% for most cases. (author)

  4. OECD/NEA Burnup Credit Calculational Criticality Benchmark Phase I-B Results

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.

    1993-01-01

    Burnup credit is an ongoing technical concern for many countries that operate commercial nuclear power reactors. In a multinational cooperative effort to resolve burnup credit issues, a Burnup Credit Working Group has been formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development. This working group has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide, and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods are in agreement to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods are within 11% agreement about the average for all fission products studied. Furthermore, most deviations are less than 10%, and many are less than 5%. The exceptions are {sup 149}Sm, {sup 151}Sm, and {sup 155}Gd.

  5. Benchmark calculation of SCALE-PC 4.3 CSAS6 module and burnup credit criticality analysis

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Hee Sung; Ro, Seong Gy; Shin, Young Joon; Kim, Ik Soo [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-12-01

    Calculation biases of the SCALE-PC CSAS6 module for PWR spent fuel, metallized spent fuel, and solutions of nuclear materials have been determined on the basis of the benchmark to be 0.01100, 0.02650, and 0.00997, respectively. With the aid of the code system, a nuclear criticality safety analysis for the spent fuel storage pool has been carried out to determine the minimum burnup of spent fuel required for safe storage. The criticality safety analysis is performed using three types of isotopic composition of spent fuel: ORIGEN2-calculated isotopic compositions; the conservative inventory obtained from the multiplication of ORIGEN2-calculated isotopic compositions by isotopic correction factors; and the conservative inventory of only U, Pu, and {sup 241}Am. The results show that the minimum burnups for the three cases are 990, 6190, and 7270 MWd/tU, respectively, for spent fuel with an initial enrichment of 5.0 wt%. (author). 74 refs., 68 figs., 35 tabs.
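
    The minimum-burnup determination described above amounts to finding where the k-eff-versus-burnup curve crosses the safety limit. A minimal sketch, assuming an invented set of burnup points, k-eff values, and limit (the paper's limit and data are not reproduced here):

    ```python
    import numpy as np

    # Loading-curve sketch: interpolate the burnup at which k_eff falls to an
    # assumed upper subcritical limit. All values are invented.
    burnups = np.array([0.0, 5000.0, 10000.0, 15000.0])   # MWd/tU
    keffs   = np.array([0.9700, 0.9520, 0.9340, 0.9180])  # criticality-code results
    k_limit = 0.95                                        # assumed limit

    # k_eff decreases monotonically with burnup, so interpolate burnup vs k_eff
    min_burnup = np.interp(k_limit, keffs[::-1], burnups[::-1])
    print(f"minimum burnup for k_eff <= {k_limit}: {min_burnup:.0f} MWd/tU")
    ```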

  6. The MCNP6 Analytic Criticality Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Codes Group

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from that set of experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. Many of the remaining problems were reformulated to permit execution in either multigroup mode or the normal continuous-energy mode of MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance in MCNP verification methods.

  7. Benchmark of Different Electromagnetic Codes for the High Frequency Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Kai Tian, Haipeng Wang, Frank Marhauser, Guangfeng Cheng, Chuandong Zhou

    2009-05-01

    In this paper, we present benchmarking results for leading 3D electromagnetic (EM) codes used in designing RF cavities today. These codes include Omega3P [1], VORPAL [2], CST Microwave Studio [3], Ansoft HFSS [4], and ANSYS [5]. Two spherical cavities are selected as the benchmark models. We have compared not only the accuracy of resonant frequencies, but also that of surface EM fields, which are critical for superconducting RF cavities. After removing degenerate modes, we calculate all the resonant modes up to 10 GHz with similar mesh densities, so that the geometry approximation and field interpolation errors related to the wavelength can be observed.
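
    Spherical cavities are a natural benchmark model because their resonant modes are known analytically, so solver output can be checked against exact frequencies. A sketch of that reference calculation for TE modes of an ideal sphere, with an assumed radius (not one of the paper's cavities):

    ```python
    import numpy as np
    from scipy.optimize import brentq
    from scipy.special import spherical_jn

    # TE_n modes of an ideal spherical cavity of radius a resonate where the
    # spherical Bessel function j_n(k*a) = 0; this gives exact reference
    # frequencies for benchmarking an EM solver. Radius is an assumed example.
    c = 299792458.0   # speed of light, m/s
    a = 0.10          # assumed cavity radius, m

    # The first zero of j_1(x) lies between 4 and 5 (x ~ 4.4934)
    x0 = brentq(lambda x: spherical_jn(1, x), 4.0, 5.0)
    f_te1 = c * x0 / (2.0 * np.pi * a)
    print(f"lowest TE mode of the sphere: {f_te1 / 1e9:.3f} GHz")
    ```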

  8. Extended calculations of OECD/NEA phase II-C burnup credit criticality benchmark problem for PWR spent fuel transport cask by using MCNP-4B2 code and JENDL-3.2 library

    Energy Technology Data Exchange (ETDEWEB)

    Kuroishi, Takeshi; Hoang, Anh Tuan; Nomura, Yasushi; Okuno, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    The reactivity effect of the asymmetry of the axial burnup profile in burnup credit criticality safety is studied for a realistic PWR spent fuel transport cask proposed in the current OECD/NEA Phase II-C benchmark problem. The axial burnup profiles are simulated in 21 material zones based on in-core flux measurements, varying from strong asymmetry to more or less no asymmetry. Criticality calculations in a 3-D model have been performed using the continuous energy Monte Carlo code MCNP-4B2 and the nuclear data library JENDL-3.2. Calculation conditions are determined with consideration of the axial fission source convergence. Calculations are carried out not only for cases proposed in the benchmark but also for additional cases assuming a symmetric burnup profile. The actinide-only approach proposed for the first domestic introduction of burnup credit into criticality evaluation is also considered, in addition to the actinide-plus-fission-product approach adopted in the benchmark. The calculated results show that k{sub eff} and the end effect increase almost linearly with increasing burnup axial offset, which is defined as one of the typical parameters characterizing the intensity of axial burnup asymmetry. The end effect is more sensitive to the asymmetry of the burnup profile at higher burnup. For an axially distributed burnup, the axial fission source distribution becomes strongly asymmetric as its peak shifts toward the top end of the fuel's active zone, where the local burnup is less than that of the bottom end. The peak of the fission source distribution becomes higher with an increase of either the asymmetry of the burnup profile or the assembly-averaged burnup. The conservatism of the assumption of uniform axial burnup based on the actinide-only approach is estimated quantitatively by comparison with the k{sub eff} result calculated with the strongest experiment-based asymmetric axial burnup profile and the actinide-plus-fission-product approach. (author)
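
    Two of the quantities discussed above are simple to compute once the axial profile and the paired k-eff results are available: the burnup axial offset (a measure of profile asymmetry) and the end effect (distributed-profile k-eff minus uniform-profile k-eff). A sketch, assuming an invented 21-zone profile, an assumed bottom-half/top-half offset convention, and placeholder k-eff values:

    ```python
    import numpy as np

    # Invented 21-zone axial burnup profile, bottom to top, in GWd/tU.
    profile = np.array([14, 22, 28, 33, 36, 38, 39, 40, 40, 40, 40,
                        40, 40, 39, 38, 36, 33, 30, 26, 20, 12], float)

    # Assumed convention: offset of bottom-half vs top-half average burnup.
    bottom, top = profile[:10], profile[11:]
    axial_offset = (bottom.mean() - top.mean()) / profile.mean()
    print(f"burnup axial offset: {axial_offset:+.3f}")

    # End effect: distributed-profile k_eff minus uniform-burnup k_eff
    # (placeholder Monte Carlo results, not the benchmark's numbers).
    k_distributed, k_uniform = 0.9412, 0.9365
    print(f"end effect: {(k_distributed - k_uniform) * 1e5:+.0f} pcm")
    ```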

  9. RECENT ADDITIONS OF CRITICALITY SAFETY RELATED INTEGRAL BENCHMARK DATA TO THE ICSBEP AND IRPHEP HANDBOOKS

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Lori Scott; Yolanda Rugama; Enrico Sartori

    2009-09-01

    High-quality integral benchmark experiments have always been a priority for criticality safety. However, interest in integral benchmark data is increasing as efforts to quantify and reduce calculational uncertainties accelerate to meet the demands of next-generation reactor and advanced fuel cycle concepts. The importance of drawing upon existing benchmark data is becoming more apparent because of the dwindling availability of critical facilities worldwide and the high cost of performing new experiments. Integral benchmark data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the International Handbook of Reactor Physics Benchmark Experiments are widely used. Benchmark data have been added to these two handbooks since the last Nuclear Criticality Safety Division Topical Meeting in Knoxville, Tennessee (September 2005). This paper highlights these additions.

  10. Assessment of evaluated (n,d) energy-angle elastic scattering distributions using MCNP simulations of critical measurements and simplified calculation benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Kozier, K.S. [Atomic Energy of Canada Limited, Chalk River Laboratories, Chalk River, Ontario (Canada)

    2008-07-01

    Different evaluated (n,d) energy-angle elastic scattering distributions produce k-effective differences in MCNP5 simulations of critical experiments involving heavy water (D{sub 2}O) of sufficient magnitude to suggest a need for new (n,d) scattering measurements and/or distributions derived from modern theoretical nuclear models, especially at neutron energies below a few MeV. The present work focuses on the small reactivity change of < 1 mk that is observed in the MCNP5 D{sub 2}O coolant-void-reactivity (CVR) calculation bias for simulations of two pairs of critical experiments performed in the ZED-2 reactor at the Chalk River Laboratories when different nuclear data libraries are used for deuterium. The deuterium data libraries tested include Endf/B-VII.0, Endf/B-VI.4, JENDL-3.3 and a new evaluation, labelled Bonn-B, which is based on recent theoretical nuclear-model calculations. Comparison calculations were also performed for a simplified, two-region, spherical model having an inner, 250-cm-radius, homogeneous sphere of UO{sub 2}, without and with deuterium, and an outer 20-cm-thick deuterium reflector. A notable observation from this work is the reduction of about 0.4 mk in the MCNP5 ZED-2 CVR calculation bias that is obtained when the O-in-UO{sub 2} thermal scattering data come from Endf/B-VII.0. (author)
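
    The CVR bias discussed above is a difference of reactivity differences. A minimal sketch of the arithmetic, with all k-eff values invented (the record quotes only the ~0.4 mk bias change, not the underlying eigenvalues):

    ```python
    # Reactivity rho = (k - 1)/k; CVR = rho(voided) - rho(cooled), in mk
    # (1 mk = 0.001 dk/k). All k_eff values below are invented placeholders.
    def rho_mk(k):
        return 1000.0 * (k - 1.0) / k

    def cvr_mk(k_voided, k_cooled):
        return rho_mk(k_voided) - rho_mk(k_cooled)

    cvr_calc = cvr_mk(k_voided=1.0052, k_cooled=0.9987)  # simulated pair
    cvr_exp  = cvr_mk(k_voided=1.0049, k_cooled=0.9991)  # experiment-derived pair
    print(f"CVR bias (calc - exp): {cvr_calc - cvr_exp:+.2f} mk")
    ```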

  11. Standard Guide for Benchmark Testing of Light Water Reactor Calculations

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This guide covers general approaches for benchmarking neutron transport calculations in light water reactor systems. A companion guide (Guide E2005) covers use of benchmark fields for testing neutron transport calculations and cross sections in well controlled environments. This guide covers experimental benchmarking of neutron fluence calculations (or calculations of other exposure parameters such as dpa) in more complex geometries relevant to reactor surveillance. Particular sections of the guide discuss: the use of well-characterized benchmark neutron fields to provide an indication of the accuracy of the calculational methods and nuclear data when applied to typical cases; and the use of plant specific measurements to indicate bias in individual plant calculations. Use of these two benchmark techniques will serve to limit plant-specific calculational uncertainty, and, when combined with analytical uncertainty estimates for the calculations, will provide uncertainty estimates for reactor fluences with ...

  12. 42 CFR 422.258 - Calculation of benchmarks.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Calculation of benchmarks. 422.258 Section 422.258... and Plan Approval § 422.258 Calculation of benchmarks. (a) The term “MA area-specific non-drug monthly benchmark amount” means, for a month in a year: (1) For MA local plans with service areas entirely within...

  13. ICSBEP Criticality Benchmark Eigenvalues with ENDF/B-VII.1 Cross Sections

    Energy Technology Data Exchange (ETDEWEB)

    Kahler, Albert C. III [Los Alamos National Laboratory; MacFarlane, Robert [Los Alamos National Laboratory

    2012-06-28

    We review MCNP eigenvalue calculations from a suite of International Criticality Safety Benchmark Evaluation Project (ICSBEP) Handbook evaluations with the recently distributed ENDF/B-VII.1 cross section library.

  14. WIPP Benchmark calculations with the large strain SPECTROM codes

    Energy Technology Data Exchange (ETDEWEB)

    Callahan, G.D.; DeVries, K.L. [RE/SPEC, Inc., Rapid City, SD (United States)

    1995-08-01

    This report provides calculational results from the updated Lagrangian structural finite-element programs SPECTROM-32 and SPECTROM-333 for the purpose of qualifying these codes to perform analyses of structural situations in the Waste Isolation Pilot Plant (WIPP). Results are presented for the Second WIPP Benchmark (Benchmark II) problems and for a simplified heated room problem used in a parallel design calculation study. The Benchmark II problems consist of an isothermal room problem and a heated room problem. The stratigraphy involves 27 distinct geologic layers, including ten clay seams, of which four are modeled as frictionless sliding interfaces. The analyses of the Benchmark II problems consider a 10-year simulation period. The evaluation of nine structural codes used in the Benchmark II problems shows that inclusion of finite-strain effects is not as significant as observed for the simplified heated room problem, and a variety of finite-strain and small-strain formulations produced similar results. The simplified heated room problem provides stratigraphic complexity equivalent to the Benchmark II problems but neglects sliding along the clay seams. It does, however, provide a calculational check case in which the small-strain formulation produced room closures about 20 percent greater than those obtained using finite-strain formulations. A discussion is given of each of the solved problems, and the computational results are compared with available published results. In general, the results of the two SPECTROM large-strain codes compare favorably with results from other codes used to solve the problems.

  15. Benchmarking calculations of excitonic couplings between bacteriochlorophylls

    CERN Document Server

    Kenny, Elise P

    2015-01-01

    Excitonic couplings between (bacterio)chlorophyll molecules are necessary for simulating energy transport in photosynthetic complexes. Many techniques for calculating the couplings are in use, from the simple (but inaccurate) point-dipole approximation to fully quantum-chemical methods. We compared several approximations to determine their range of applicability, noting that the propagation of experimental uncertainties poses a fundamental limit on the achievable accuracy. In particular, the uncertainty in crystallographic coordinates yields an uncertainty of about 20% in the calculated couplings. Because quantum-chemical corrections are smaller than 20% in most biologically relevant cases, their considerable computational cost is rarely justified. We therefore recommend the electrostatic TrEsp method across the entire range of molecular separations and orientations because its cost is minimal and it generally agrees with quantum-chemical calculations to better than the geometric uncertainty. We also caution ...
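
    The point-dipole approximation mentioned above has a closed form: the coupling between two transition dipoles depends on their mutual orientation and falls off as the cube of the separation. A sketch, assuming Debye/Angstrom units and an invented, roughly BChl-like geometry (the factor 5034 converts D^2/A^3 to cm^-1):

    ```python
    import numpy as np

    def dipole_coupling_cm1(mu1, mu2, r1, r2):
        """Point-dipole excitonic coupling in cm^-1.

        mu1, mu2: transition dipoles in Debye; r1, r2: positions in Angstrom.
        """
        R = r2 - r1
        dist = np.linalg.norm(R)
        rhat = R / dist
        orientation = mu1 @ mu2 - 3.0 * (mu1 @ rhat) * (mu2 @ rhat)
        return 5034.12 * orientation / dist**3   # 1 D^2/A^3 ~ 5034 cm^-1

    # Invented example geometry with BChl-like dipole magnitudes.
    mu_a = np.array([6.3, 0.0, 0.0])
    mu_b = np.array([0.0, 6.3, 0.0])
    V = dipole_coupling_cm1(mu_a, mu_b, np.zeros(3), np.array([10.0, 10.0, 0.0]))
    print(f"point-dipole coupling: {V:.1f} cm^-1")
    ```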

  16. The ORSphere Benchmark Evaluation and Its Potential Impact on Nuclear Criticality Safety

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess; Margaret A. Marshall; J. Blair Briggs

    2013-10-01

    In the early 1970s, critical experiments using an unreflected metal sphere of highly enriched uranium (HEU) were performed with the focus of providing a “very accurate description…as an ideal benchmark for calculational methods and cross-section data files.” Two near-critical configurations of the Oak Ridge Sphere (ORSphere) were evaluated as acceptable benchmark experiments for inclusion in the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP Handbook). The results from those benchmark experiments were then compared with additional unmoderated and unreflected HEU metal benchmark experiment configurations currently found in the ICSBEP Handbook. For basic geometries (spheres, cylinders, and slabs) the eigenvalues calculated using MCNP5 and ENDF/B-VII.0 were within 3σ of their respective benchmark values. There appears to be generally good agreement between calculated and benchmark values for spherical and slab geometry systems. Cylindrical geometry configurations tended to calculate low, including the more complex bare HEU metal systems containing cylinders. The ORSphere experiments do not calculate within their 1σ uncertainty, and there is a possibility that the effect of the measured uncertainties for the GODIVA I benchmark may need to be reevaluated. There is significant scatter in the calculations for the highly correlated ORCEF cylinder experiments, which are constructed from close-fitting HEU discs and annuli. Selection of a nuclear data library can have a larger impact on calculated eigenvalue results than the variation found within calculations of a given experimental series, such as the ORCEF cylinders, using a single nuclear data set.

  17. INTEGRAL BENCHMARKS AVAILABLE THROUGH THE INTERNATIONAL REACTOR PHYSICS EXPERIMENT EVALUATION PROJECT AND THE INTERNATIONAL CRITICALITY SAFETY BENCHMARK EVALUATION PROJECT

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Lori Scott; Enrico Sartori; Yolanda Rugama

    2008-09-01

    Interest in high-quality integral benchmark data is increasing as efforts to quantify and reduce calculational uncertainties accelerate to meet the demands of next generation reactor and advanced fuel cycle concepts. The International Reactor Physics Experiment Evaluation Project (IRPhEP) and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) continue to expand their efforts and broaden their scope to identify, evaluate, and provide integral benchmark data for method and data validation. Benchmark model specifications provided by these two projects are used heavily by the international reactor physics, nuclear data, and criticality safety communities. Thus far, 14 countries have contributed to the IRPhEP, and 20 have contributed to the ICSBEP. The status of the IRPhEP and ICSBEP is discussed in this paper, selected benchmarks that have been added to the two handbooks since PHYSOR’06 are highlighted, and the future of the two projects is outlined and discussed.

  18. Providing Nuclear Criticality Safety Analysis Education through Benchmark Experiment Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess; J. Blair Briggs; David W. Nigg

    2009-11-01

    One of the challenges that today's new workforce of nuclear criticality safety engineers faces is being expected to assess nuclear systems and establish safety guidelines without having received significant experience or hands-on training prior to graduation. Participation in the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and/or the International Reactor Physics Experiment Evaluation Project (IRPhEP) provides students and young professionals the opportunity to gain experience and enhance critical engineering skills.

  19. Benchmark density functional theory calculations for nanoscale conductance

    DEFF Research Database (Denmark)

    Strange, Mikkel; Bækgaard, Iben Sig Buur; Thygesen, Kristian Sommer;

    2008-01-01

    We present a set of benchmark calculations for the Kohn-Sham elastic transmission function of five representative single-molecule junctions. The transmission functions are calculated using two different density functional theory methods, namely an ultrasoft pseudopotential plane-wave code ... in combination with maximally localized Wannier functions and the norm-conserving pseudopotential code SIESTA which applies an atomic orbital basis set. All calculations have been converged with respect to the supercell size and the number of k(parallel to) points in the surface plane. For all systems we find ...

  20. Criticality safety benchmark evaluation project: Recovering the past

    Energy Technology Data Exchange (ETDEWEB)

    Trumble, E.F.

    1997-06-01

    A very brief summary of the Criticality Safety Benchmark Evaluation Project of the Westinghouse Savannah River Company is provided in this paper. The purpose of the project is to provide a source of evaluated criticality safety experiments in an easily usable format. Another project goal is to search for any experiments that may have been lost or contain discrepancies, and to determine if they can be used. Results of evaluated experiments are being published as US DOE handbooks.

  1. VENUS-F: A fast lead critical core for benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Kochetkov, A.; Wagemans, J.; Vittiglio, G. [SCK.CEN, Boeretang 200, 2400 Mol (Belgium)

    2011-07-01

    The zero-power thermal-neutron water-moderated facility VENUS at SCK-CEN has been used extensively for benchmarking in the past. In accordance with GEN-IV design tasks (fast reactor systems and accelerator-driven systems), the VENUS facility was modified in 2007-2010 into the fast-neutron facility VENUS-F, with solid core components. This paper introduces the GUINEVERE and FREYA projects, which are being conducted at the VENUS-F facility, and it presents the measurement results obtained at the first critical core. Throughout the projects, other fast lead benchmarks will also be investigated. The measurement results of the different configurations can all be used as fast neutron benchmarks. (authors)

  2. Criticality benchmark guide for light-water-reactor fuel in transportation and storage packages

    Energy Technology Data Exchange (ETDEWEB)

    Lichtenwalter, J.J.; Bowman, S.M.; DeHart, M.D.; Hopper, C.M.

    1997-03-01

    This report is designed as a guide for performing criticality benchmark calculations for light-water-reactor (LWR) fuel applications. The guide provides documentation of 180 criticality experiments with geometries, materials, and neutron interaction characteristics representative of transportation packages containing LWR fuel or uranium oxide pellets or powder. These experiments should benefit the U.S. Nuclear Regulatory Commission (NRC) staff and licensees in validation of computational methods used in LWR fuel storage and transportation concerns. The experiments are classified by key parameters such as enrichment, water/fuel volume, hydrogen-to-fissile ratio (H/X), and lattice pitch. Groups of experiments with common features such as separator plates, shielding walls, and soluble boron are also identified. In addition, a sample validation using these experiments and a statistical analysis of the results are provided. Recommendations for selecting suitable experiments and determination of calculational bias and uncertainty are presented as part of this benchmark guide.
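
    The bias-and-uncertainty determination recommended above is, at its simplest, the mean miss of calculated k-eff against benchmark experiments, widened by the spread and an administrative margin. A deliberately simplified sketch with invented numbers; real guidance uses tolerance-interval statistics rather than a plain standard deviation:

    ```python
    import statistics

    # Calculated k_eff for benchmark experiments whose expected value is 1.0
    # (invented results). Bias = mean miss; only a negative bias is credited.
    k_calc = [0.9962, 0.9987, 1.0021, 0.9941, 0.9978]

    bias = statistics.mean(k_calc) - 1.0
    bias_uncert = statistics.stdev(k_calc)   # simplified stand-in for tolerance limits
    admin_margin = 0.05                      # assumed administrative margin

    usl = 1.0 + min(bias, 0.0) - bias_uncert - admin_margin
    print(f"bias = {bias:+.4f}, upper subcritical limit = {usl:.4f}")
    ```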

  3. AGING FACILITY CRITICALITY SAFETY CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    C.E. Sanders

    2004-09-10

    The purpose of this design calculation is to revise and update the previous criticality calculation for the Aging Facility (documented in BSC 2004a). This design calculation will also demonstrate and ensure that the storage and aging operations to be performed in the Aging Facility meet the criticality safety design criteria in the "Project Design Criteria Document" (Doraswamy 2004, Section 4.9.2.2), and the functional nuclear criticality safety requirement described in the "SNF Aging System Description Document" (BSC [Bechtel SAIC Company] 2004f, p. 3-12). The scope of this design calculation covers the systems and processes for aging commercial spent nuclear fuel (SNF) and staging Department of Energy (DOE) SNF/High-Level Waste (HLW) prior to its placement in the final waste package (WP) (BSC 2004f, p. 1-1). Aging commercial SNF is a thermal management strategy, while staging DOE SNF/HLW will make loading of WPs more efficient (note that aging DOE SNF/HLW is not needed, since these wastes are not expected to exceed the thermal limits for emplacement) (BSC 2004f, p. 1-2). The description of the changes in this revised document is as follows: (1) Include DOE SNF/HLW in addition to commercial SNF per the current "SNF Aging System Description Document" (BSC 2004f). (2) Update the evaluation of Category 1 and 2 event sequences for the Aging Facility as identified in the "Categorization of Event Sequences for License Application" (BSC 2004c, Section 7). (3) Further evaluate the design and criticality controls required for a storage/aging cask, referred to as the MGR Site-specific Cask (MSC), to accommodate commercial fuel outside the content specification in the Certificate of Compliance for the existing NRC-certified storage casks. In addition, evaluate the design required for the MSC that will accommodate DOE SNF/HLW. This design calculation will achieve the objective of providing the

  4. 47 CFR 54.805 - Zone and study area above benchmark revenues calculated by the Administrator.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Zone and study area above benchmark revenues... Mechanism § 54.805 Zone and study area above benchmark revenues calculated by the Administrator. (a) The following steps shall be performed by the Administrator to determine Zone Above Benchmark Revenues for...

  5. Benchmarking analytical calculations of proton doses in heterogeneous matter.

    Science.gov (United States)

    Ciangaru, George; Polf, Jerimy C; Bues, Martin; Smith, Alfred R

    2005-12-01

    A proton dose computational algorithm, performing an analytical superposition of infinitely narrow proton beamlets (ASPB) is introduced. The algorithm uses the standard pencil beam technique of laterally distributing the central axis broad beam doses according to the Moliere scattering theory extended to slablike varying density media. The purpose of this study was to determine the accuracy of our computational tool by comparing it with experimental and Monte Carlo (MC) simulation data as benchmarks. In the tests, parallel wide beams of protons were scattered in water phantoms containing embedded air and bone materials with simple geometrical forms and spatial dimensions of a few centimeters. For homogeneous water and bone phantoms, the proton doses we calculated with the ASPB algorithm were found very comparable to experimental and MC data. For layered bone slab inhomogeneity in water, the comparison between our analytical calculation and the MC simulation showed reasonable agreement, even when the inhomogeneity was placed at the Bragg peak depth. There also was reasonable agreement for the parallelepiped bone block inhomogeneity placed at various depths, except for cases in which the bone was located in the region of the Bragg peak, when discrepancies were as large as more than 10%. When the inhomogeneity was in the form of abutting air-bone slabs, discrepancies of as much as 8% occurred in the lateral dose profiles on the air cavity side of the phantom. Additionally, the analytical depth-dose calculations disagreed with the MC calculations within 3% of the Bragg peak dose, at the entry and midway depths in the phantom. The distal depth-dose 20%-80% fall-off widths and ranges calculated with our algorithm and the MC simulation were generally within 0.1 cm of agreement. The analytical lateral-dose profile calculations showed smaller (by less than 0.1 cm) 20%-80% penumbra widths and shorter fall-off tails than did those calculated by the MC simulations. Overall
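
    The pencil-beam construction described above spreads the central-axis dose laterally with a Gaussian whose width reflects multiple scattering; penumbra metrics such as the 20%-80% width then follow directly. A toy sketch with an invented width (illustrating the idea, not the ASPB algorithm itself):

    ```python
    import numpy as np

    # One depth slab: lateral dose = central-axis dose times a Gaussian whose
    # sigma stands in for the Moliere scattering width. Values are invented.
    def lateral_dose_profile(x, central_axis_dose=1.0, sigma=0.5):
        return central_axis_dose * np.exp(-x**2 / (2.0 * sigma**2))

    x = np.linspace(0.0, 3.0, 601)           # cm, one side of the beam axis
    dose = lateral_dose_profile(x)

    x80 = x[np.argmin(np.abs(dose - 0.8))]   # 80% dose position
    x20 = x[np.argmin(np.abs(dose - 0.2))]   # 20% dose position
    print(f"20%-80% penumbra width: {x20 - x80:.2f} cm")
    ```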

  6. Benchmarking criticality analysis of TRIGA fuel storage racks.

    Science.gov (United States)

    Robinson, Matthew Loren; DeBey, Timothy M; Higginbotham, Jack F

    2017-01-01

    A criticality analysis was benchmarked to sub-criticality measurements of the hexagonal fuel storage racks at the United States Geological Survey TRIGA MARK I reactor in Denver. These racks, which hold up to 19 fuel elements each, are arranged at 0.61 m (2 ft) spacings around the outer edge of the reactor. A 3-dimensional model of the racks was created using MCNP5, and the model was verified experimentally by comparison to measured subcritical multiplication data collected in an approach-to-critical loading of two of the racks. The validated model was then used to show that, in the extreme condition where the entire circumference of the pool is lined with racks loaded with used fuel, the storage array is subcritical with a k value of about 0.71, well below the regulatory limit of 0.8. A model was also constructed of the rectangular 2×10 fuel storage array used in many other TRIGA reactors to validate the technique against the original TRIGA licensing sub-critical analysis performed in 1966. The fuel used in this study was standard 20% enriched (LEU) aluminum- or stainless-steel-clad TRIGA fuel.
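
    The approach-to-critical verification mentioned above uses the classic inverse-multiplication technique: as fuel is loaded, 1/M shrinks, and a straight-line extrapolation of 1/M to zero predicts the critical loading. A sketch with invented count data (for a storage rack, the extrapolated point should lie far beyond any credible loading):

    ```python
    import numpy as np

    # Detector counts with a fixed source as fuel elements are loaded (invented).
    elements = np.array([0, 4, 8, 12, 16])
    counts   = np.array([100.0, 120.0, 155.0, 210.0, 305.0])

    inv_m = counts[0] / counts               # 1/M relative to the source-only count
    slope, intercept = np.polyfit(elements[1:], inv_m[1:], 1)
    critical_estimate = -intercept / slope   # loading where 1/M extrapolates to zero
    print(f"extrapolated critical loading: {critical_estimate:.1f} elements")
    ```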

  7. Benchmarks to supplant export FPDR (Floating Point Data Rate) calculations

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, D.; Brooks, E.; Dongarra, J.; Hayes, A.; Lyon, G.

    1988-06-01

    Because modern computer architectures render application of the FPDR (Floating Point Data Rate) increasingly difficult, there has been increased interest in export evaluation via actual system performance. The report discusses benchmarking of uniprocessor (usually vector) machines for scientific computation (SIMD array processors are not included), and parallel processing and its characterization for export control.

  8. Calculational Benchmark Problems for VVER-1000 Mixed Oxide Fuel Cycle

    Energy Technology Data Exchange (ETDEWEB)

    Emmett, M.B.

    2000-03-17

    Standard problems were created to test the ability of American and Russian computational methods and data regarding the analysis of the storage and handling of Russian pressurized water reactor (VVER) mixed oxide fuel. Criticality safety and radiation shielding problems were analyzed. American and Russian multiplication factors for fresh fuel storage for low-enriched uranium (UOX), weapons-grade (MOX-W), and reactor-grade (MOX-R) MOX differ by less than 2% for all variations of water density. For fresh-fuel shielding calculations, the ORNL results for the neutron source differ from the Russian results by less than 1% for UOX and MOX-R and by approximately 3% for MOX-W. For shielding calculations for fresh fuel assemblies, neutron dose rates at the surface of the assemblies differ from the Russian results by 5% to 9%; the level of agreement for gamma dose varies depending on the type of fuel, with UOX differing by the largest amount. The use of different gamma group structures and instantaneous versus asymptotic decay assumptions also complicates the comparison. For the calculation of dose rates from spent fuel in a shipping cask, the neutron source for UOX after 3-year cooling is within 1%, and for MOX-W within 5%, of one of the Russian results, while the MOX-R difference is the largest at over 10%. These studies are a portion of the documentation required by the Russian nuclear regulatory authority, GAN, in order to certify Russian programs and data as being acceptably accurate for the analysis of mixed oxide fuels.

  9. Standardizing Benchmark Dose Calculations to Improve Science-Based Decisions in Human Health Assessments

    Science.gov (United States)

    Wignall, Jessica A.; Shapiro, Andrew J.; Wright, Fred A.; Woodruff, Tracey J.; Chiu, Weihsueh A.; Guyton, Kathryn Z.

    2014-01-01

    Background: Benchmark dose (BMD) modeling computes the dose associated with a prespecified response level. While offering advantages over traditional points of departure (PODs), such as no-observed-adverse-effect-levels (NOAELs), BMD methods have lacked consistency and transparency in application, interpretation, and reporting in human health assessments of chemicals. Objectives: We aimed to apply a standardized process for conducting BMD modeling to reduce inconsistencies in model fitting and selection. Methods: We evaluated 880 dose–response data sets for 352 environmental chemicals with existing human health assessments. We calculated benchmark doses and their lower limits [10% extra risk, or change in the mean equal to 1 SD (BMD/L10/1SD)] for each chemical in a standardized way with prespecified criteria for model fit acceptance. We identified study design features associated with acceptable model fits. Results: We derived values for 255 (72%) of the chemicals. Batch-calculated BMD/L10/1SD values were significantly and highly correlated (R2 of 0.95 and 0.83, respectively, n = 42) with PODs previously used in human health assessments, with values similar to reported NOAELs. Specifically, the median ratio of BMDs10/1SD:NOAELs was 1.96, and the median ratio of BMDLs10/1SD:NOAELs was 0.89. We also observed a significant trend of increasing model viability with increasing number of dose groups. Conclusions: BMD/L10/1SD values can be calculated in a standardized way for use in health assessments on a large number of chemicals and critical effects. This facilitates the exploration of health effects across multiple studies of a given chemical or, when chemicals need to be compared, providing greater transparency and efficiency than current approaches. Citation: Wignall JA, Shapiro AJ, Wright FA, Woodruff TJ, Chiu WA, Guyton KZ, Rusyn I. 2014. Standardizing benchmark dose calculations to improve science-based decisions in human health assessments. Environ Health
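
    The core computation described above is finding the dose at which a fitted dose-response model reaches a prespecified extra risk. A minimal sketch, assuming an invented log-logistic fit (real workflows fit the model to study data and also report the lower confidence limit, the BMDL):

    ```python
    from scipy.optimize import brentq

    def response(dose, background=0.05, slope=1.8, ed50=12.0):
        """Assumed log-logistic probability of response (invented parameters)."""
        return background + (1.0 - background) * dose**slope / (dose**slope + ed50**slope)

    def extra_risk(dose):
        p0 = response(0.0)
        return (response(dose) - p0) / (1.0 - p0)

    # BMD10: dose producing 10% extra risk over background
    bmd10 = brentq(lambda d: extra_risk(d) - 0.10, 1e-9, 1e3)
    print(f"BMD10 = {bmd10:.2f} (dose units of the fitted model)")
    ```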

  10. Benchmark calculations for elastic fermion-dimer scattering

    CERN Document Server

    Bour, Shahin; Lee, Dean; Meißner, Ulf-G

    2012-01-01

    We present continuum and lattice calculations for elastic scattering between a fermion and a bound dimer in the shallow binding limit. For the continuum calculation we use the Skorniakov-Ter-Martirosian (STM) integral equation to determine the scattering length and effective range parameter to high precision. For the lattice calculation we use the finite-volume method of Lüscher. We take into account topological finite-volume corrections to the dimer binding energy which depend on the momentum of the dimer. After subtracting these effects, we find from the lattice calculation κ a_fd = 1.174(9) and κ r_fd = -0.029(13). These results agree well with the continuum values κ a_fd = 1.17907(1) and κ r_fd = -0.0383(3) obtained from the STM equation. We discuss applications to cold atomic Fermi gases, deuteron-neutron scattering in the spin-quartet channel, and lattice calculations of scattering for nuclei and hadronic molecules at finite volume.

  11. Benchmark calculations on residue production within the EURISOL DS project; Part I: thin targets

    CERN Document Server

    David, J.C; Boudard, A; Doré, D; Leray, S; Rapp, B; Ridikas, D; Thiollière, N

    Report on benchmark calculations of residue production in thin targets. Calculations were performed using MCNPX 2.5.0 coupled to a selection of reaction models. The results were compared to nuclide production cross-sections measured at GSI in inverse kinematics.

  12. Benchmark calculations on residue production within the EURISOL DS project; Part II: thick targets

    CERN Document Server

    David, J.-C; Boudard, A; Doré, D; Leray, S; Rapp, B; Ridikas, D; Thiollière, N

    Benchmark calculations on residue production using MCNPX 2.5.0. Calculations were compared to mass-distribution data for 5 different elements measured at ISOLDE, and to specific activities of 28 radionuclides at different locations along the thick target, measured at Dubna.

  13. Benchmarking Geant4 for spallation neutron source calculations

    Science.gov (United States)

    DiJulio, Douglas D.; Batkov, Konstantin; Stenander, John; Cherkashyna, Nataliia; Bentley, Phillip M.

    2016-09-01

    Geant4 is becoming increasingly used for radiation transport simulations of spallation neutron sources and related components. Historically, the code has seen little usage in this field, and it is of general interest to investigate the suitability of Geant4 for such applications. For this purpose, we carried out Geant4 calculations based on simple spallation source geometries and also with the European Spallation Source Technical Design Report target and moderator configuration. The results are compared to calculations performed with the Monte Carlo N-Particle eXtended code. The comparisons are carried out over the full spallation neutron source energy spectrum, from sub-eV energies up to thousands of MeV. Our preliminary results reveal that there is generally good agreement between the simulations using both codes. Additionally, we have also implemented a general weight-window generator for Geant4-based applications and present some results of the method applied to the ESS target model.

  14. Neutronics Benchmarks for the Utilization of Mixed-Oxide Fuel: Joint US/Russian Progress Report for Fiscal 1997. Volume 3 - Calculations Performed in the Russian Federation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    This volume of the progress report provides documentation of reactor physics and criticality safety studies conducted in the Russian Federation during fiscal year 1997 and sponsored by the Fissile Materials Disposition Program of the US Department of Energy. Descriptions of computational and experimental benchmarks for the verification and validation of computer programs for neutron physics analyses are included. All benchmarks include either plutonium, uranium, or mixed uranium and plutonium fuels. Calculated physics parameters are reported for all of the computational benchmarks that the United States and Russia mutually agreed in November 1996 were applicable to mixed-oxide fuel cycles for light-water reactors.

  15. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

    textabstractBenchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns. W

  16. BENCHMARKING UPGRADED HOTSPOT DOSE CALCULATIONS AGAINST MACCS2 RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Brotherton, Kevin

    2009-04-30

    The radiological consequence of interest for a documented safety analysis (DSA) is the centerline Total Effective Dose Equivalent (TEDE) incurred by the Maximally Exposed Offsite Individual (MOI) evaluated at the 95th percentile consequence level. An upgraded version of HotSpot (Version 2.07) has been developed with the capabilities to read site meteorological data and perform the necessary statistical calculations to determine the 95th percentile consequence result. These capabilities should allow HotSpot to join MACCS2 (Version 1.13.1) and GENII (Version 1.485) as radiological consequence toolbox codes in the Department of Energy (DOE) Safety Software Central Registry. Using the same meteorological data file, scenarios involving a one curie release of {sup 239}Pu were modeled in both HotSpot and MACCS2. Several sets of release conditions were modeled, and the results compared. In each case, input parameter specifications for each code were chosen to match one another as much as the codes would allow. The results from the two codes are in excellent agreement. Slight differences observed in results are explained by algorithm differences.
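
    The statistic of interest above is a percentile over weather variability: the same release is evaluated for every meteorological record, and the 95th percentile of the resulting centerline doses is reported. A sketch with randomly generated stand-in doses (a real run would take them from the plume model, one per met-data hour):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Stand-in centerline TEDE values, one per hour of a year of met data.
    doses_rem = rng.lognormal(mean=-4.0, sigma=1.0, size=8760)

    dose_95 = np.percentile(doses_rem, 95)
    print(f"95th percentile TEDE: {dose_95:.2e} rem")
    ```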

  17. A Critical Thinking Benchmark for a Department of Agricultural Education and Studies

    Science.gov (United States)

    Perry, Dustin K.; Retallick, Michael S.; Paulsen, Thomas H.

    2014-01-01

    Due to an ever changing world where technology seemingly provides endless answers, today's higher education students must master a new skill set reflecting an emphasis on critical thinking, problem solving, and communications. The purpose of this study was to establish a departmental benchmark for critical thinking abilities of students majoring…

  18. VENUS-2 MOX Core Benchmark: Results of ORNL Calculations Using HELIOS-1.4

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, RJ

    2001-02-02

    The Task Force on Reactor-Based Plutonium Disposition, now an Expert Group, was set up through the Organization for Economic Cooperation and Development/Nuclear Energy Agency to facilitate technical assessments of burning weapons-grade plutonium mixed-oxide (MOX) fuel in U.S. pressurized-water reactors and Russian VVER nuclear reactors. More than ten countries participated in a major initiative of the Task Force: a blind benchmark study comparing code calculations against experimental data for the VENUS-2 MOX core at SCK-CEN in Mol, Belgium. At the Oak Ridge National Laboratory, the HELIOS-1.4 code was used to perform a comprehensive study of pin-cell and core calculations for the VENUS-2 benchmark.

  19. Assessment of the uncertainties of MULTICELL calculations by the OECD NEA UAM PWR pin cell burnup benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Kereszturi, Andras [Hungarian Academy of Sciences, Budapest (Hungary). Centre for Energy Research; Panka, Istvan

    2015-09-15

    Precisely defining the burnup of nuclear fuel is important for core design calculations, safety analyses, criticality calculations (e.g. burnup credit calculations), etc. This paper deals with the uncertainties of MULTICELL calculations obtained by the solution of the OECD NEA UAM PWR pin cell burnup benchmark. In this assessment, Monte-Carlo-type statistical analyses are applied, and the energy-dependent covariance matrices of the cross-sections are taken into account. Additionally, the impact of the uncertainties of the fission yields is also considered. The target quantities are the burnup-dependent uncertainties of the infinite multiplication factor, the two-group cross-sections, the reaction rates, and the number densities of some isotopes up to a burnup of 60 MWd/kgU. In the paper, the burnup-dependent tendencies of the corresponding uncertainties and their sources are analyzed.
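
    The Monte-Carlo-type analysis described above samples cross-section perturbations from the evaluated covariance matrices and reads off the spread in the output quantities. A toy sketch in which a two-group k-infinity formula stands in for the lattice code; the covariance matrix and cross sections are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    sigma_nominal = np.array([0.010, 0.120])   # toy 2-group absorption xs (1/cm)
    cov = np.array([[1.0e-7, 2.0e-8],
                    [2.0e-8, 4.0e-7]])         # assumed covariance matrix

    def k_inf(sigma_a, nu_sigma_f=np.array([0.012, 0.150])):
        """Toy model: k_inf = production / absorption, summed over groups."""
        return nu_sigma_f.sum() / sigma_a.sum()

    samples = rng.multivariate_normal(sigma_nominal, cov, size=2000)
    ks = np.array([k_inf(s) for s in samples])
    print(f"k_inf = {ks.mean():.4f} +/- {ks.std():.4f} (1 sigma)")
    ```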

  20. Reactor Physics and Criticality Benchmark Evaluations for Advanced Nuclear Fuel - Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    William Anderson; James Tulenko; Bradley Rearden; Gary Harms

    2008-09-11

    The nuclear industry's interest in advanced fuel and reactor design often drives towards fuel with uranium enrichments greater than 5 wt% {sup 235}U. Unfortunately, little data exists, in the form of reactor physics and criticality benchmarks, for uranium enrichments ranging between 5 and 10 wt% {sup 235}U. The primary purpose of this project is to provide benchmarks for fuel similar to what may be required for advanced light water reactors (LWRs). These experiments will ultimately provide additional information for application to the criticality-safety bases for commercial fuel facilities handling greater than 5 wt% {sup 235}U fuel.

  1. Criticality benchmarks validation of the Monte Carlo code TRIPOLI-2

    Energy Technology Data Exchange (ETDEWEB)

    Maubert, L. (Commissariat a l' Energie Atomique, Inst. de Protection et de Surete Nucleaire, Service d' Etudes de Criticite, 92 - Fontenay-aux-Roses (France)); Nouri, A. (Commissariat a l' Energie Atomique, Inst. de Protection et de Surete Nucleaire, Service d' Etudes de Criticite, 92 - Fontenay-aux-Roses (France)); Vergnaud, T. (Commissariat a l' Energie Atomique, Direction des Reacteurs Nucleaires, Service d' Etudes des Reacteurs et de Mathematique Appliquees, 91 - Gif-sur-Yvette (France))

    1993-04-01

    The criticality benchmarks used to validate the three-dimensional, energy-pointwise Monte Carlo code TRIPOLI-2 include metallic spheres of uranium and plutonium, plutonium nitrate solutions, and square- and triangular-pitch assemblies of uranium oxide. Results show good agreement between experiments and calculations and provide a partial validation of the code and its ENDF-B4 library. (orig./DG)

  2. ENDF/B-VII.1 Neutron Cross Section Data Testing with Critical Assembly Benchmarks and Reactor Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kahler, A. [Los Alamos National Laboratory (LANL); Macfarlane, R E [Los Alamos National Laboratory (LANL); Mosteller, R D [Los Alamos National Laboratory (LANL); Kiedrowski, B C [Los Alamos National Laboratory (LANL); Frankle, S C [Los Alamos National Laboratory (LANL); Chadwick, M. B. [Los Alamos National Laboratory (LANL); Mcknight, R D [Argonne National Laboratory (ANL); Lell, R M [Argonne National Laboratory (ANL); Palmiotti, G [Idaho National Laboratory (INL); Hiruta, h [Idaho National Laboratory (INL); Herman, Micheal W [Brookhaven National Laboratory (BNL); Arcilla, r [Brookhaven National Laboratory (BNL); Mughabghab, S F [Brookhaven National Laboratory (BNL); Sublet, J C [Culham Science Center, Abington, UK; Trkov, A. [Jozef Stefan Institute, Slovenia; Trumbull, T H [Knolls Atomic Power Laboratory; Dunn, Michael E [ORNL

    2011-01-01

    The ENDF/B-VII.1 library is the latest revision to the United States' Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 being released in 2006. This revision expands upon that library, including the addition of new evaluated files (was 393 neutron files previously, now 423 including replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extension or updating of many existing neutron data files. Complete details are provided in the companion paper [1]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms such as metallic, oxide or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal reflected and water or other light element reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections, such as unmoderated and uranium reflected (235)U and (239)Pu assemblies, HEU solution systems and LEU oxide lattice systems that mimic commercial PWR configurations, continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten, are greatly reduced. Improvements are also confirmed for selected actinide reaction rates such as (236)U, (238,242)Pu and (241,243)Am capture in fast systems. Other deficiencies, such as the overprediction of Pu solution system critical

  3. ENDF/B-VII.1 Neutron Cross Section Data Testing with Critical Assembly Benchmarks and Reactor Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kahler, A.C.; Herman, M.; Kahler,A.C.; MacFarlane,R.E.; Mosteller,R.D.; Kiedrowski,B.C.; Frankle,S.C.; Chadwick,M.B.; McKnight,R.D.; Lell,R.M.; Palmiotti,G.; Hiruta,H.; Herman,M.; Arcilla,R.; Mughabghab,S.F.; Sublet,J.C.; Trkov,A.; Trumbull,T.H.; Dunn,M.

    2011-12-01

    The ENDF/B-VII.1 library is the latest revision to the United States Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 being released in 2006. This revision expands upon that library, including the addition of new evaluated files (was 393 neutron files previously, now 423 including replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extension or updating of many existing neutron data files. Complete details are provided in the companion paper [M. B. Chadwick et al., 'ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data,' Nuclear Data Sheets, 112, 2887 (2011)]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms such as metallic, oxide or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal reflected and water or other light element reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections such as unmoderated and uranium reflected {sup 235}U and {sup 239}Pu assemblies, HEU solution systems and LEU oxide lattice systems that mimic commercial PWR configurations continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten are greatly reduced. Improvements are also

  4. ENDF/B-VII.1 Neutron Cross Section Data Testing with Critical Assembly Benchmarks and Reactor Experiments

    Science.gov (United States)

    Kahler, A. C.; MacFarlane, R. E.; Mosteller, R. D.; Kiedrowski, B. C.; Frankle, S. C.; Chadwick, M. B.; McKnight, R. D.; Lell, R. M.; Palmiotti, G.; Hiruta, H.; Herman, M.; Arcilla, R.; Mughabghab, S. F.; Sublet, J. C.; Trkov, A.; Trumbull, T. H.; Dunn, M.

    2011-12-01

    The ENDF/B-VII.1 library is the latest revision to the United States' Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 being released in 2006. This revision expands upon that library, including the addition of new evaluated files (was 393 neutron files previously, now 423 including replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extension or updating of many existing neutron data files. Complete details are provided in the companion paper [M. B. Chadwick et al., "ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data," Nuclear Data Sheets, 112, 2887 (2011)]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms such as metallic, oxide or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal reflected and water or other light element reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections such as unmoderated and uranium reflected 235U and 239Pu assemblies, HEU solution systems and LEU oxide lattice systems that mimic commercial PWR configurations continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten are greatly reduced. Improvements are also confirmed for selected

  5. Implementation and qualification of MCNP 5 through the intercomparison with the benchmark for the calculation of critical systems Godiva and Jezebel; Implementacao e qualificacao do MCNP5 atraves da intercomparacao com o benchmark para o calculo dos sistemas criticos Godiva e Jezebel

    Energy Technology Data Exchange (ETDEWEB)

    Lara, Rafael G.; Maiorino, Jose R., E-mail: rafael.lara@aluno.ufabc.edu.br, E-mail: joserubens.maiorino@ufabc.edu.br [Universidade Federal do ABC (UFABC), Santo Andre, SP (Brazil). Centro de Engenharia, Modelagem e Ciencias Sociais Aplicadas

    2013-07-01

    This work aimed at the implementation and qualification of the MCNP code on a supercomputer of the Universidade Federal do ABC, so that a next-generation simulation tool may be made available for precise calculations of nuclear reactors and systems subject to radiation. The implementation of this tool will have multidisciplinary applications, covering various areas of engineering (nuclear, aerospace, biomedical), radiation physics and others.

  6. Benchmark Testing of a New 56Fe Evaluation for Criticality Safety Applications

    Energy Technology Data Exchange (ETDEWEB)

    Leal, Luiz C. [ORNL]; Ivanov, E. [Institut de Radioprotection et de Surete Nucleaire]

    2015-01-01

    The SAMMY code was used to evaluate resonance parameters of the 56Fe cross section in the resolved resonance energy range of 0–2 MeV using transmission data and capture, elastic, inelastic, and double-differential elastic cross sections. SAMMY fits R-matrix resonance parameters using the generalized least-squares technique (Bayes' theory). The evaluation yielded a set of resonance parameters that reproduced the experimental data very well, along with a resonance parameter covariance matrix for data uncertainty calculations. Benchmark tests were conducted to assess the performance of the evaluation in criticality calculations.

  7. Benchmark Calculations on the Atomization Enthalpy, Geometry and Vibrational Frequencies of UF6 with Relativistic DFT Methods

    Institute of Scientific and Technical Information of China (English)

    XIAO Hai; LI Jun

    2008-01-01

    Benchmark calculations on the molar atomization enthalpy, geometry, and vibrational frequencies of uranium hexafluoride (UF6) have been performed by using relativistic density functional theory (DFT) with various levels of relativistic effects, different types of basis sets, and exchange-correlation functionals. Scalar relativistic effects are shown to be critical for the structural properties. The spin-orbit coupling effects are important for the calculated energies, but are much less important for other calculated ground-state properties of closed-shell UF6. We conclude through systematic investigations that ZORA- and RECP-based relativistic DFT methods are both appropriate for incorporating relativistic effects. Comparisons of different types of basis sets (Slater, Gaussian, and plane-wave types) and various levels of theoretical approximation of the exchange-correlation functionals were also made.

  8. An integral nodal variational method for multigroup criticality calculations

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, E.E. [Northwestern Univ., Evanston, IL (United States). Dept. of Mechanical Engineering]. E-mail: e-lewis@northwestern.edu; Smith, M.A.; Palmiotti, G. [Argonne National Lab., IL (United States)]. E-mail: masmith@ra.anl.gov; gpalmiotti@ra.anl.gov; Tsoulfanidis, N. [Missouri Univ., Rolla, MO (United States). Dept. of Nuclear Engineering]. E-mail: tsoul@umr.edu

    2003-07-01

    An integral formulation of the variational nodal method is presented and applied to a series of benchmark criticality problems. The method combines an integral transport treatment of the even-parity flux within the spatial node with an odd-parity spherical harmonics expansion of the Lagrange multipliers at the node interfaces. The response matrices that result from this formulation are compatible with those in the VARIANT code at Argonne National Laboratory. Either homogeneous or heterogeneous nodes may be employed. In general, for calculations requiring higher-order angular approximations, the integral method yields solutions with comparable accuracy while requiring substantially less CPU time and memory than the standard spherical harmonics expansion using the same spatial approximations. (author)

  9. EA-MC Neutronic Calculations on IAEA ADS Benchmark 3.2

    Energy Technology Data Exchange (ETDEWEB)

    Dahlfors, Marcus [Uppsala Univ. (Sweden). Dept. of Radiation Sciences; Kadi, Yacine [CERN, Geneva (Switzerland). Emerging Energy Technologies

    2006-01-15

    The neutronics and the transmutation properties of the IAEA ADS benchmark 3.2 setup, the 'Yalina' experiment or ISTC project B-70, have been studied through an extensive amount of 3-D Monte Carlo calculations at CERN. The simulations were performed with the state-of-the-art computer code package EA-MC, developed at CERN. The calculational approach is outlined and the results are presented in accordance with the guidelines given in the benchmark description. A variety of experimental conditions and parameters are examined; three different fuel rod configurations and three types of neutron sources are applied to the system. Reactivity change effects introduced by removal of fuel rods in both central and peripheral positions are also computed. Irradiation samples located in a total of 8 geometrical positions are examined. Calculations of capture reaction rates in {sup 129}I, {sup 237}Np and {sup 243}Am samples and of fission reaction rates in {sup 235}U, {sup 237}Np and {sup 243}Am samples are presented. Simulated neutron flux densities and energy spectra as well as spectral indices inside experimental channels are also given according to benchmark specifications. Two different nuclear data libraries, JAR-95 and JENDL-3.2, are applied for the calculations.

  10. VENUS-2 MOX Core Benchmark: Results of ORNL Calculations Using HELIOS-1.4 - Revised Report

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, RJ

    2001-06-01

    The Task Force on Reactor-Based Plutonium Disposition (TFRPD) was formed by the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) to study reactor physics, fuel performance, and fuel cycle issues related to the disposition of weapons-grade (WG) plutonium as mixed-oxide (MOX) reactor fuel. To advance the goals of the TFRPD, 10 countries and 12 institutions participated in a major TFRPD activity: a blind benchmark study to compare code calculations to experimental data for the VENUS-2 MOX core at SCK-CEN in Mol, Belgium. At Oak Ridge National Laboratory, the HELIOS-1.4 code system was used to perform the comprehensive study of pin-cell and MOX core calculations for the VENUS-2 MOX core benchmark study.

  11. CANISTER HANDLING FACILITY CRITICALITY SAFETY CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    C.E. Sanders

    2005-04-07

    This design calculation revises and updates the previous criticality evaluation for the canister handling, transfer and staging operations to be performed in the Canister Handling Facility (CHF) documented in BSC [Bechtel SAIC Company] 2004 [DIRS 167614]. The purpose of the calculation is to demonstrate that the handling operations of canisters performed in the CHF meet the nuclear criticality safety design criteria specified in the "Project Design Criteria (PDC) Document" (BSC 2004 [DIRS 171599], Section 4.9.2.2), the nuclear facility safety requirement in the "Project Requirements Document" (Canori and Leitner 2003 [DIRS 166275], p. 4-206), the functional/operational nuclear safety requirement in the "Project Functional and Operational Requirements" document (Curry 2004 [DIRS 170557], p. 75), and the functional nuclear criticality safety requirements described in the "Canister Handling Facility Description Document" (BSC 2004 [DIRS 168992], Sections 3.1.1.3.4.13 and 3.2.3). Specific scope of work contained in this activity consists of updating the Category 1 and 2 event sequence evaluations as identified in the "Categorization of Event Sequences for License Application" (BSC 2004 [DIRS 167268], Section 7). The CHF is limited in throughput capacity to handling sealed U.S. Department of Energy (DOE) spent nuclear fuel (SNF) and high-level radioactive waste (HLW) canisters, defense high-level radioactive waste (DHLW), naval canisters, multicanister overpacks (MCOs), vertical dual-purpose canisters (DPCs), and multipurpose canisters (MPCs) (if and when they become available) (BSC 2004 [DIRS 168992], p. 1-1). It should be noted that the design and safety analyses of the naval canisters are the responsibility of the U.S. Department of the Navy (Naval Nuclear Propulsion Program) and will not be included in this document. In addition, this calculation is valid for

  12. Criticality experiments and benchmarks for cross section evaluation: the neptunium case

    Directory of Open Access Journals (Sweden)

    Duran I.

    2013-03-01

    The 237Np neutron-induced fission cross section has recently been measured over a large energy range (from eV to GeV) at the n_TOF facility at CERN. When compared to previous measurements, the n_TOF fission cross section appears to be higher by 5-7% beyond the fission threshold. To check the relevance of the n_TOF data, we apply them to a criticality experiment performed at Los Alamos with a 6 kg sphere of 237Np, surrounded by enriched uranium (235U) so as to approach criticality with fast neutrons. The calculated multiplication factor keff is in better agreement with the experiment (the deviation of 750 pcm is reduced to 250 pcm) when the ENDF/B-VII.0 evaluation of the 237Np fission cross section is replaced by the n_TOF data. We also explore the hypothesis, invoked by some authors to explain the deviation of 750 pcm, of deficiencies in the inelastic cross section of 235U; the large distortion of the inelastic cross section that this would require is incompatible with existing measurements. We show as well that the nu-bar (average neutron multiplicity) of 237Np can hardly be incriminated, because of the high accuracy of the existing data. Fission rate ratios or averaged fission cross sections measured in several fast neutron fields seem to give contradictory results on the validation of the 237Np cross section, but at least one of the benchmark experiments, where the active deposits have been well calibrated for the number of atoms, favors the n_TOF data set. These outcomes support the hypothesis of a higher fission cross section of 237Np.

  13. Criticality experiments and benchmarks for cross section evaluation: the neptunium case

    Science.gov (United States)

    Leong, L. S.; Tassan-Got, L.; Audouin, L.; Paradela, C.; Wilson, J. N.; Tarrio, D.; Berthier, B.; Duran, I.; Le Naour, C.; Stéphan, C.

    2013-03-01

    The 237Np neutron-induced fission cross section has recently been measured over a large energy range (from eV to GeV) at the n_TOF facility at CERN. When compared to previous measurements, the n_TOF fission cross section appears to be higher by 5-7% beyond the fission threshold. To check the relevance of the n_TOF data, we apply them to a criticality experiment performed at Los Alamos with a 6 kg sphere of 237Np, surrounded by enriched uranium (235U) so as to approach criticality with fast neutrons. The calculated multiplication factor keff is in better agreement with the experiment (the deviation of 750 pcm is reduced to 250 pcm) when the ENDF/B-VII.0 evaluation of the 237Np fission cross section is replaced by the n_TOF data. We also explore the hypothesis, invoked by some authors to explain the deviation of 750 pcm, of deficiencies in the inelastic cross section of 235U; the large distortion of the inelastic cross section that this would require is incompatible with existing measurements. We show as well that the nu-bar (average neutron multiplicity) of 237Np can hardly be incriminated, because of the high accuracy of the existing data. Fission rate ratios or averaged fission cross sections measured in several fast neutron fields seem to give contradictory results on the validation of the 237Np cross section, but at least one of the benchmark experiments, where the active deposits have been well calibrated for the number of atoms, favors the n_TOF data set. These outcomes support the hypothesis of a higher fission cross section of 237Np.

  14. Continuum discretization methods in a composite-particle scattering off a nucleus: the benchmark calculations

    CERN Document Server

    Rubtsova, O A; Moro, A M

    2008-01-01

    The direct comparison of two different continuum discretization methods towards the solution of a composite particle scattering off a nucleus is presented. The first approach -- the Continuum-Discretized Coupled Channel method -- is based on the differential equation formalism, while the second one -- the Wave-Packet Continuum Discretization method -- uses the integral equation formulation for the composite-particle scattering problem. As benchmark calculations we have chosen the deuteron off 58Ni target scattering at three different incident energies.

  15. Status of benchmark calculations of the neutron characteristics of the cascade molten salt ADS for the nuclear waste incineration

    Energy Technology Data Exchange (ETDEWEB)

    Dudnikov, A.A.; Alekseev, P.N.; Subbotin, S.A.; Vasiliev, A.V.; Abagyan, L.P.; Alexeyev, N.I.; Gomin, E.A.; Ponomarev, L.I.; Kolyaskin, O.E.; Men'shikov, L.I. [Russian Research Centre Kurchatov Inst., Moscow (Russian Federation); Kolesov, V.F.; Ivanin, I.A.; Zavialov, N.V. [Russian Federal Nuclear Center, RFNC-VNIIEF, Nizhnii Novgorod region (Russian Federation)

    2001-07-01

    The facility for incineration of long-lived minor actinides and some dangerous fission products should be an important feature of the future nuclear power (NP). For many reasons the accelerator-driven liquid-fuel reactor can be considered a promising reactor-burner for radioactive waste. The fuel of such a reactor is a fluoride molten salt composition with minor actinides (Np, Cm, Am) and some fission products ({sup 99}Tc, {sup 129}I, etc.). Preliminary analysis shows that the values of keff calculated with different codes and nuclear data differ by up to several percent for such fuel compositions. Reliable critical and subcritical benchmark experiments with molten salt fuel compositions containing significant quantities of minor actinides are absent. One of the main tasks for the numerical study of this problem is the estimation of nuclear data for such fuel compositions and the verification of the different numerical codes used for the calculation of keff, neutron spectra and reaction rates. This is especially important for the resonance region, where experimental data are poor or absent. A calculation benchmark of the cascade subcritical molten salt reactor is developed. For the chosen nuclear fuel composition, a comparison of the results obtained by three different Monte-Carlo codes (MCNP4A, MCU, and C95) using three different nuclear data libraries is presented. This report concerns the investigation of the main peculiarities of the subcritical molten salt reactor unit, carried out at the beginning of ISTC project 1486. (author)

  16. Criticality Calculations with MCNP6 - Practical Lectures

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications (XCP-3); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications (XCP-3); Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications (XCP-3)

    2016-11-29

    These slides are used to teach MCNP (Monte Carlo N-Particle) usage to nuclear criticality safety analysts. The following are the lecture topics: course information, introduction, MCNP basics, criticality calculations, advanced geometry, tallies, adjoint-weighted tallies and sensitivities, physics and nuclear data, parameter studies, NCS validation I, NCS validation II, NCS validation III, case study 1 - solution tanks, case study 2 - fuel vault, case study 3 - B&W core, case study 4 - simple TRIGA, case study 5 - fissile mat. vault, criticality accident alarm systems. After completion of this course, you should be able to: Develop an input model for MCNP; Describe how cross section data impact Monte Carlo and deterministic codes; Describe the importance of validation of computer codes and how it is accomplished; Describe the methodology supporting Monte Carlo codes and deterministic codes; Describe pitfalls of Monte Carlo calculations; Discuss the strengths and weaknesses of Monte Carlo and Discrete Ordinates codes; and, given that the diffusion theory model is not strictly valid for treating fissile systems in which neutron absorption, voids, and/or material boundaries are present, identify a fissile system for which a diffusion theory solution would be adequate.
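
    As a small, self-contained illustration of the source-convergence diagnostics such a course emphasizes, the Python sketch below computes the Shannon entropy of a binned fission-source distribution, the quantity MCNP itself reports for KCODE runs; the mesh, extents and sample source sites here are hypothetical.

    import numpy as np

    def shannon_entropy(source_xyz, bins=(8, 8, 8), extent=((-10.0, 10.0),) * 3):
        """Shannon entropy (bits) of fission-source sites binned on a mesh.

        A flat entropy trace versus cycle suggests the source has stopped
        drifting; a still-rising trace calls for more inactive cycles.
        """
        hist, _ = np.histogramdd(source_xyz, bins=bins, range=extent)
        p = hist.ravel() / hist.sum()
        p = p[p > 0]  # convention: 0 * log(0) = 0
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(1)
    sites = rng.normal(0.0, 3.0, size=(10_000, 3))  # hypothetical source sites
    print(f"H = {shannon_entropy(sites):.3f} bits")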

  17. 239Pu Prompt Fission Neutron Spectra Impact on a Set of Criticality and Experimental Reactor Benchmarks

    Science.gov (United States)

    Peneliau, Y.; Litaize, O.; Archier, P.; De Saint Jean, C.

    2014-04-01

    A large set of nuclear data is being investigated to improve the predictions of the new neutron transport simulation codes. With the next generation of nuclear power plants (GEN IV projects), one expects to reduce the calculated uncertainties, which mainly come from nuclear data and are still very significant, before taking integral information into account in the adjustment process. In France, future nuclear power plant concepts will probably use MOX fuel, either in Sodium Fast Reactors or in Gas Cooled Fast Reactors. Consequently, the knowledge of 239Pu cross sections and other nuclear data is a crucial issue for reducing these sources of uncertainty. The Prompt Fission Neutron Spectra (PFNS) of 239Pu are part of these relevant data (an IAEA working group is even dedicated to PFNS), and the work presented here deals with this particular topic. The main international data files (i.e., JEFF-3.1.1, ENDF/B-VII.0, JENDL-4.0, BRC-2009) have been considered and compared with two different spectra, coming from the works of Maslov and Kornilov respectively. The spectra are first compared by calculating their mathematical moments in order to characterize them. Then, a reference calculation using the whole JEFF-3.1.1 evaluation file is performed and compared with another calculation performed with a new evaluation file, in which the data block containing the fission spectra (MF=5, MT=18) is replaced by the investigated spectra (one for each evaluation). A set of benchmarks is used to analyze the effects of PFNS, covering criticality cases and mock-up cases in various neutron flux spectra (thermal, intermediate, and fast flux spectra). Data coming from many ICSBEP experiments are used (PU-SOL-THERM, PU-MET-FAST, PU-MET-INTER and PU-MET-MIXED) and French mock-up experiments are also investigated (EOLE for the thermal neutron flux spectrum and MASURCA for the fast neutron flux spectrum). This study shows that many experiments and neutron parameters are very sensitive to
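
    As a pointer to what the moment comparison above involves: for an idealized Maxwellian PFNS the moments are analytic, which makes a convenient numerical cross-check. A minimal sketch (the temperature below is an assumed illustration value, not an evaluated 239Pu parameter):

    import numpy as np
    from scipy.integrate import quad

    def maxwellian(E, T):
        """Maxwellian fission spectrum chi(E), unit-normalized (E, T in MeV)."""
        return 2.0 / np.sqrt(np.pi) * np.sqrt(E) * np.exp(-E / T) / T**1.5

    T = 1.4  # MeV, assumed effective temperature for illustration only
    norm = quad(maxwellian, 0.0, np.inf, args=(T,))[0]
    mean = quad(lambda E: E * maxwellian(E, T), 0.0, np.inf)[0]
    print(norm, mean, 1.5 * T)  # norm ~ 1; first moment ~ 3T/2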

  18. Uncertainties in Monte Carlo-based absorbed dose calculations for an experimental benchmark.

    Science.gov (United States)

    Renner, F; Wulff, J; Kapsch, R-P; Zink, K

    2015-10-01

    There is a need to verify the accuracy of general purpose Monte Carlo codes like EGSnrc, which are commonly employed for investigations of dosimetric problems in radiation therapy. A number of experimental benchmarks have been published to compare calculated values of absorbed dose to experimentally determined values. However, there is a lack of absolute benchmarks, i.e. benchmarks without involved normalization which may cause some quantities to be cancelled. Therefore, at the Physikalisch-Technische Bundesanstalt a benchmark experiment was performed, which aimed at the absolute verification of radiation transport calculations for dosimetry in radiation therapy. A thimble-type ionization chamber in a solid phantom was irradiated by high-energy bremsstrahlung and the mean absorbed dose in the sensitive volume was measured per incident electron of the target. The characteristics of the accelerator and experimental setup were precisely determined and the results of a corresponding Monte Carlo simulation with EGSnrc are presented within this study. For a meaningful comparison, an analysis of the uncertainty of the Monte Carlo simulation is necessary. In this study uncertainties with regard to the simulation geometry, the radiation source, transport options of the Monte Carlo code and specific interaction cross sections are investigated, applying the general methodology of the Guide to the expression of uncertainty in measurement. Besides studying the general influence of changes in transport options of the EGSnrc code, uncertainties are analyzed by estimating the sensitivity coefficients of various input quantities in a first step. Secondly, standard uncertainties are assigned to each quantity which are known from the experiment, e.g. uncertainties for geometric dimensions. Data for more fundamental quantities such as photon cross sections and the I-value of electron stopping powers are taken from literature. The significant uncertainty contributions are identified as
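
    For reference, the GUM-style combination applied in such an analysis: for a model $y = f(x_1, \dots, x_N)$ with uncorrelated inputs, the combined standard uncertainty is

    \[
      u_c^2(y) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i),
    \]

    where the partial derivatives are the sensitivity coefficients (estimated numerically in the study) and correlated inputs would add covariance cross terms.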

  19. FUEL HANDLING FACILITY CRITICALITY SAFETY CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    C.E. Sanders

    2005-06-30

    The purpose of this design calculation is to perform a criticality evaluation of the Fuel Handling Facility (FHF) and the operations and processes performed therein. The current intent of the FHF is to receive transportation casks whose contents will be unloaded and transferred to waste packages (WP) or MGR Specific Casks (MSC) in the fuel transfer bays. Further, the WPs will also be prepared in the FHF for transfer to the sub-surface facility (for disposal). The MSCs will be transferred to the Aging Facility for storage. The criticality evaluation of the FHF features the following: (I) Consider the types of waste to be received in the FHF as specified below: (1) Uncanistered commercial spent nuclear fuel (CSNF); (2) Canistered CSNF (with the exception of horizontal dual-purpose canisters (DPCs) and/or multi-purpose canisters (MPCs)); (3) Navy canistered SNF (long and short); (4) Department of Energy (DOE) canistered high-level waste (HLW); and (5) DOE canistered SNF (with the exception of MCOs). (II) Evaluate the criticality analyses previously performed for the existing Nuclear Regulatory Commission (NRC)-certified transportation casks (under 10 CFR 71) to be received in the FHF to ensure that these analyses address all FHF conditions including normal operations, and Category 1 and 2 event sequences. (III) Evaluate FHF criticality conditions resulting from various Category 1 and 2 event sequences. Note that there are currently no Category 1 and 2 event sequences identified for FHF. Consequently, potential hazards from a criticality point of view will be considered as identified in the "Internal Hazards Analysis for License Application" document (BSC 2004c, Section 6.6.4). (IV) Assess effects of potential moderator intrusion into the fuel transfer bay for defense in depth. The SNF/HLW waste transfer activity (i.e., assembly and canister transfer) that is being carried out in the FHF has been classified as safety category in the

  20. Neutronics Benchmarks for the Utilization of Mixed-Oxide Fuel: Joint U.S./Russian Progress Report for Fiscal Year 1997 Volume 2-Calculations Performed in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Primm III, RT

    2002-05-29

    This volume of the progress report provides documentation of reactor physics and criticality safety studies conducted in the US during fiscal year 1997 and sponsored by the Fissile Materials Disposition Program of the US Department of Energy. Descriptions of computational and experimental benchmarks for the verification and validation of computer programs for neutron physics analyses are included. All benchmarks include either plutonium, uranium, or mixed uranium and plutonium fuels. Calculated physics parameters are reported for all of the computational benchmarks and for those experimental benchmarks that the US and Russia mutually agreed in November 1996 were applicable to mixed-oxide fuel cycles for light-water reactors.

  1. Subgroup Benchmark Calculations for the Intra-Pellet Nonuniform Temperature Cases

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jung, Yeon Sang [Seoul National Univ. (Korea, Republic of); Liu, Yuxuan [Univ. of Michigan, Ann Arbor, MI (United States); Joo, Han Gyu [Seoul National Univ. (Korea, Republic of)

    2016-08-01

    A benchmark suite has been developed by Seoul National University (SNU) for intra-pellet nonuniform temperature distribution cases based on practical temperature profiles according to the thermal power levels. Though a new subgroup capability for nonuniform temperature distributions was implemented in MPACT, no validation calculation had been performed for the new capability. This study focuses on benchmarking the new capability through a code-to-code comparison. Two continuous-energy Monte Carlo codes, McCARD and CE-KENO, are engaged in obtaining reference solutions, and the MPACT results are compared to those of the SNU nTRACER code, which uses a similar cross section library and subgroup method to obtain self-shielded cross sections.

  2. Benchmarking density-functional-theory calculations of rotational g tensors and magnetizabilities using accurate coupled-cluster calculations

    Science.gov (United States)

    Lutnæs, Ola B.; Teale, Andrew M.; Helgaker, Trygve; Tozer, David J.; Ruud, Kenneth; Gauss, Jürgen

    2009-10-01

    An accurate set of benchmark rotational g tensors and magnetizabilities is calculated using coupled-cluster singles-doubles (CCSD) theory and coupled-cluster singles-doubles with perturbative triples [CCSD(T)] theory, in a variety of basis sets consisting of (rotational) London atomic orbitals. The accuracy of the results obtained is established for the rotational g tensors by careful comparison with experimental data, taking into account zero-point vibrational corrections. After an analysis of the basis sets employed, extrapolation techniques are used to provide estimates of the basis-set-limit quantities, thereby establishing an accurate benchmark data set. The utility of the data set is demonstrated by examining a wide variety of density functionals for the calculation of these properties. None of the density-functional methods are competitive with the CCSD or CCSD(T) methods. The need for a careful consideration of vibrational effects is clearly illustrated. Finally, the pure coupled-cluster results are compared with the results of density-functional calculations constrained to give the same electronic density. The importance of current dependence in exchange-correlation functionals is discussed in light of this comparison.

  3. Influence of the ab-initio nd cross sections in the critical heavy-water benchmarks

    CERN Document Server

    Morillon, B; Carbonell, J

    2013-01-01

    The n-d elastic and breakup cross sections are computed by solving the three-body Faddeev equations for realistic and semi-realistic nucleon-nucleon potentials. These cross sections are inserted in the Monte Carlo simulation of the nuclear processes considered in the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP). The results obtained using these ab initio n-d cross sections are compared with those provided by the most renowned international libraries.

  4. Calculations to an IAHR-benchmark test using the CFD-code CFX-4

    Energy Technology Data Exchange (ETDEWEB)

    Krepper, E.

    1998-10-01

    The calculation concerns a test which was defined as a benchmark for 3-D codes by the IAHR (International Association of Hydraulic Research) working group on advanced nuclear reactor types. The test is well documented and detailed measurement results are available. It aims at the investigation of phenomena which are important for heat removal under natural circulation conditions in a nuclear reactor. The task for the calculation was the modelling of the forced flow field of a single-phase incompressible fluid with consideration of heat transfer and the influence of gravity. These phenomena are also typical of other industrial processes, and the importance of modelling them correctly in other applications is a further motivation for performing these calculations. (orig.)

  5. Sensitivity of MCNP5 calculations for a spherical numerical benchmark problem to the angular scattering distributions for deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Kozier, K. S. [Atomic Energy of Canada Limited, Chalk River Laboratories, Chalk River, Ont. K0J 1J0 (Canada)

    2006-07-01

    This paper examines the sensitivity of MCNP5 k{sub eff} results to various deuterium data files for a simple benchmark problem consisting of an 8.4-cm radius sphere of uranium surrounded by an annulus of deuterium at the nuclide number density corresponding to heavy water. This study was performed to help clarify why {Delta}k{sub eff} values of about 10 mk are obtained when different ENDF/B deuterium data files are used in simulations of critical experiments involving solutions of high-enrichment uranyl fluoride in heavy water, while simulations of low-leakage, heterogeneous critical lattices of natural-uranium fuel rods in heavy water show differences of <1 mk. The benchmark calculations were performed as a function of deuterium reflector thickness for several uranium compositions using deuterium ACE files derived from ENDF/B-VII.b1 (release beta 1), ENDF/B-VI.4 and JENDL-3.3, which differ primarily in the energy/angle distributions for elastic scattering <3.2 MeV. Calculations were also performed using modified ACE files having equiprobable cosine bin values in the centre-of-mass reference frame in a progressive manner with increasing energy. It was found that the {Delta}k{sub eff} values increased with deuterium reflector thickness and uranium enrichment. The studies using modified ACE files indicate that most of the reactivity differences arise at energies <1 MeV; hence, this energy range should be given priority if new scattering distribution measurements are undertaken. (authors)
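
    Sampling from such equiprobable cosine bins is simple enough to sketch; the following is a generic illustration (not the MCNP implementation), with 32 equal bins reproducing isotropic centre-of-mass scattering:

    import numpy as np

    def sample_mu(bin_edges, rng):
        """Sample a CM scattering cosine from equiprobable cosine bins.

        bin_edges: K+1 cosines spanning [-1, 1]; each of the K bins
        carries probability 1/K by construction.
        """
        k = rng.integers(len(bin_edges) - 1)      # pick a bin uniformly
        lo, hi = bin_edges[k], bin_edges[k + 1]
        return lo + rng.random() * (hi - lo)      # uniform within the bin

    rng = np.random.default_rng(0)
    edges = np.linspace(-1.0, 1.0, 33)            # 32 equal bins: isotropic
    mus = np.array([sample_mu(edges, rng) for _ in range(100_000)])
    print(mus.mean())                             # ~ 0 for the isotropic case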

  6. Experimental Data from the Benchmark SuperCritical Wing Wind Tunnel Test on an Oscillating Turntable

    Science.gov (United States)

    Heeg, Jennifer; Piatak, David J.

    2013-01-01

    The Benchmark SuperCritical Wing (BSCW) wind tunnel model served as a semi-blind testcase for the 2012 AIAA Aeroelastic Prediction Workshop (AePW). The BSCW was chosen as a testcase due to its geometric simplicity and flow physics complexity. The data sets examined include unforced system information and forced pitching oscillations. The aerodynamic challenges presented by this AePW testcase include a strong shock that was observed to be unsteady for even the unforced system cases, shock-induced separation and trailing edge separation. The current paper quantifies these characteristics at the AePW test condition and at a suggested benchmarking test condition. General characteristics of the model's behavior are examined for the entire available data set.

  7. Benchmark experiment to verify radiation transport calculations for dosimetry in radiation therapy; Benchmark-Experiment zur Verifikation von Strahlungstransportrechnungen fuer die Dosimetrie in der Strahlentherapie

    Energy Technology Data Exchange (ETDEWEB)

    Renner, Franziska [Physikalisch-Technische Bundesanstalt (PTB), Braunschweig (Germany)

    2016-11-01

    Monte Carlo simulations are regarded as the most accurate method of solving complex problems in the field of dosimetry and radiation transport. In (external) radiation therapy they are increasingly used for the calculation of dose distributions during treatment planning. In comparison to other algorithms for the calculation of dose distributions, Monte Carlo methods have the capability of improving the accuracy of dose calculations - especially under complex circumstances (e.g. consideration of inhomogeneities). However, there is a lack of knowledge of how accurate the results of Monte Carlo calculations are on an absolute basis. A practical verification of the calculations can be performed by direct comparison with the results of a benchmark experiment. This work presents such a benchmark experiment and compares its results (with detailed consideration of measurement uncertainty) with the results of Monte Carlo calculations using the well-established Monte Carlo code EGSnrc. The experiment was designed to have parallels to external beam radiation therapy with respect to the type and energy of the radiation, the materials used and the kind of dose measurement. Because the properties of the beam have to be well known in order to compare the results of the experiment and the simulation on an absolute basis, the benchmark experiment was performed using the research electron accelerator of the Physikalisch-Technische Bundesanstalt (PTB), whose beam was accurately characterized in advance. The benchmark experiment and the corresponding Monte Carlo simulations were carried out for two different types of ionization chambers and the results were compared. Considering the uncertainty, which is about 0.7 % for the experimental values and about 1.0 % for the Monte Carlo simulation, the results of the simulation and the experiment coincide.

  8. Validation of new {sup 240}Pu cross section and covariance data via criticality calculation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Heon; Gil, Choong-Sup; Kim, Hyeong Il; Lee, Young-Ouk, E-mail: kimdh@kaeri.re.kr, E-mail: csgil@kaeri.re.kr, E-mail: hikim@kaeri.re.kr, E-mail: yolee@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Leal, Luiz C.; Dunn, Michael E., E-mail: leallc@ornl.gov, E-mail: dunnme@ornl.gov [Oak Ridge National Laboratory, TN (United States)

    2011-07-01

    Recent collaboration between KAERI and ORNL has completed an evaluation of the {sup 240}Pu neutron cross section with covariance data. The new {sup 240}Pu cross section data have been validated through 28 criticality safety benchmark problems taken from the ICSBEP and/or CSEWG specifications with MCNP calculations. The calculation results based on the new evaluation have been compared with those based on recent evaluations such as ENDF/B-VII.0, JEFF-3.1.1, and JENDL-4.0. In addition, the new {sup 240}Pu covariance data have been tested for some criticality benchmarks via the DANTSYS/SUSD3D-based nuclear data sensitivity and uncertainty analysis of k{sub eff}. The k{sub eff} uncertainty estimates from the new covariance data have been compared with those from JENDL-4.0, JENDL-3.3, and Low-Fidelity covariance data. (author)
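
    Sensitivity/uncertainty analyses of this kind rest on the first-order "sandwich rule": with $S$ the vector of relative sensitivities of $k_{\mathrm{eff}}$ to the multigroup cross sections and $C_{\alpha}$ their relative covariance matrix,

    \[
      \operatorname{var}\!\left(k_{\mathrm{eff}}\right) \approx S^{T} C_{\alpha}\, S ,
    \]

    so the quality of the estimated $k_{\mathrm{eff}}$ uncertainty rests directly on the quality of the covariance data being validated.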

  9. Transient void, pressure drop and critical power BFBT benchmark analysis and results with VIPRE-W / MEFISTO-T

    Energy Technology Data Exchange (ETDEWEB)

    Le Corre, J.M.; Adamsson, C.; Alvarez, P., E-mail: lecorrjm@westinghouse.com, E-mail: carl.adamsson@psi.ch, E-mail: alvarep@westinghouse.com [Westinghouse Electric Sweden AB (Sweden)

    2011-07-01

    A benchmark analysis of the transient BFBT data [1], measured in an 8x8 fuel assembly design under typical BWR transient conditions, was performed using the VIPRE-W/MEFISTO-T code package. This is a continuation of the BFBT steady-state benchmark activities documented in [2] and [3]. All available transient void and pressure drop experimental data were considered and the measurements were compared with the predictions of the VIPRE-W sub-channel analysis code using various modeling approaches, including the EPRI drift flux void correlation. Detailed analyses of the code results were performed and it was demonstrated that the VIPRE-W transient predictions are generally reliable over the tested conditions. Available transient dryout data were also considered and the measurements were compared with the predictions of the VIPRE-W/ MEFISTO-T film flow calculations. The code calculates the transient multi-film flowrate distributions in the BFBT bundle, including the effect of spacer grids on drop deposition enhancement, and the dryout criterion corresponds to the total liquid film disappearance. After calibration of the grid enhancement effect with a very small subset of the steady-state critical power database, the code could predict the time and location of transient dryout with very good accuracy. (author)

  10. Hybrid Numerical Solvers for Massively Parallel Eigenvalue Computation and Their Benchmark with Electronic Structure Calculations

    CERN Document Server

    Imachi, Hiroto

    2015-01-01

    Optimally hybrid numerical solvers were constructed for the massively parallel generalized eigenvalue problem (GEP). The strong scaling benchmark was carried out on the K computer and other supercomputers for electronic structure calculation problems with matrix sizes of M = 10^4-10^6 and up to 10^5 cores. The procedure of GEP is decomposed into the two subprocedures of the reducer to the standard eigenvalue problem (SEP) and the solver of SEP. A hybrid solver is constructed when a routine is chosen for each subprocedure from the three parallel solver libraries ScaLAPACK, ELPA and EigenExa. The hybrid solvers with the two newer libraries, ELPA and EigenExa, give better benchmark results than the conventional ScaLAPACK library. The detailed analysis of the results implies that the reducer can be a bottleneck in next-generation (exa-scale) supercomputers, which indicates guidance for future research. The code was developed as a middleware and a mini-application and will appear online.
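
    The "reducer" step is the standard Cholesky-based transformation of A x = lambda B x into a symmetric SEP. A serial sketch with SciPy (the paper's solvers operate on distributed matrices via ScaLAPACK, ELPA and EigenExa; this only shows the algebra):

    import numpy as np
    from scipy.linalg import cholesky, eigh, solve_triangular

    def gep_via_sep(A, B):
        """Solve A x = lambda B x via B = L L^T and C = L^-1 A L^-T."""
        L = cholesky(B, lower=True)
        Y = solve_triangular(L, A, lower=True)        # L^-1 A
        C = solve_triangular(L, Y.T, lower=True).T    # L^-1 A L^-T (symmetric)
        w, V = eigh(C)                                # standard symmetric SEP
        X = solve_triangular(L.T, V, lower=False)     # back-transform x = L^-T y
        return w, X

    rng = np.random.default_rng(0)
    M = 200
    A = rng.standard_normal((M, M)); A = (A + A.T) / 2.0
    B = rng.standard_normal((M, M)); B = B @ B.T + M * np.eye(M)  # SPD
    w, X = gep_via_sep(A, B)
    print(np.allclose(A @ X[:, 0], w[0] * (B @ X[:, 0])))         # True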

  11. An Analytical Benchmark for the Calculation of Current Distribution in Superconducting Cables

    CERN Document Server

    Bottura, L; Fabbri, M G

    2002-01-01

    The validation of numerical codes for the calculation of current distribution and AC loss in superconducting cables versus experimental results is essential, but could be affected by approximations in the electromagnetic model or uncertainty in the evaluation of the model parameters. A preliminary validation of the codes by means of a comparison with analytical results can therefore be very useful, in order to distinguish among different error sources. We provide here a benchmark analytical solution for current distribution that applies to the case of a cable described using a distributed-parameter electrical circuit model. The analytical solution of current distribution is valid for cables made of a generic number of strands, subjected to well defined symmetry and uniformity conditions in the electrical parameters. The closed form solution for the general case is rather complex to implement, and in this paper we give the analytical solutions for different simplified situations. In particular we examine the ...

  12. Continuum discretization methods in a composite particle scattering off a nucleus: Benchmark calculations

    Science.gov (United States)

    Rubtsova, O. A.; Kukulin, V. I.; Moro, A. M.

    2008-09-01

    The direct comparison of two different continuum discretization methods toward the solution of a composite particle scattering off a nucleus is presented. The first approach—the continuum-discretized coupled-channel method—is based on the differential equation formalism, while the second one—the wave-packet continuum discretization method—uses the integral equation formulation for the composite-particle scattering problem. As benchmark calculations, we have chosen the deuteron off Ni58 target scattering (as a realistic illustrative example) at three different incident energies: high, middle, and low. Clear nonvanishing effects of closed inelastic channels at small and intermediate energies are established. The elastic cross sections found in both approaches are very close to each other for all three considered energies.

  13. IAEA GT-MHR Benchmark Calculations Using the HELIOS/MASTER Two-Step Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Hoon; Kim, Kang Seog; Cho, Jin Young; Song, Jae Seung; Noh, Jae Man; Lee, Chung Chan; Zee, Sung Quun

    2007-05-15

    A new two-step procedure based on the HELIOS/MASTER code system has been developed for the prismatic VHTR physics analysis. This procedure employs the HELIOS code for the transport lattice calculation to generate few-group constants, and the MASTER code for the 3-dimensional core calculation to perform the reactor physics analysis. The double heterogeneity effect due to the random distribution of the particulate fuel could be dealt with by the recently developed reactivity-equivalent physical transformation (RPT) method. The strong spectral effects of the graphite moderated reactor core could be solved both by optimizing the number of energy groups and group boundaries, and by employing a partial core model instead of a single block one to generate few-group cross sections. Burnable poisons in the inner reflector and the asymmetrically located large control rod can be treated by adopting the equivalence theory applied for the multi-block models to generate surface dependent discontinuity factors. Effective reflector cross sections were generated by using a simple mini-core model and an equivalence theory. In this study the IAEA GT-MHR benchmark problems with a plutonium fuel were analyzed by using the HELIOS/MASTER code package and the Monte Carlo code MCNP. Benchmark problems include pin, block and core models. The computational results of the HELIOS/MASTER code system were compared with those of MCNP and other participants. The results show that the 2-step procedure using HELIOS/MASTER can be applied to the reactor physics analysis for the prismatic VHTR with good accuracy.

  14. Benchmarking the Calculation of Stochastic Heating and Emissivity of Dust Grains in the Context of Radiative Transfer Simulations

    CERN Document Server

    Camps, Peter; Bianchi, Simone; Lunttila, Tuomas; Pinte, Christophe; Natale, Giovanni; Juvela, Mika; Fischera, Joerg; Fitzgerald, Michael P; Gordon, Karl; Baes, Maarten; Steinacker, Juergen

    2015-01-01

    We define an appropriate problem for benchmarking dust emissivity calculations in the context of radiative transfer (RT) simulations, specifically including the emission from stochastically heated dust grains. Our aim is to provide a self-contained guide for implementors of such functionality, and to offer insights in the effects of the various approximations and heuristics implemented by the participating codes to accelerate the calculations. The benchmark problem definition includes the optical and calorimetric material properties, and the grain size distributions, for a typical astronomical dust mixture with silicate, graphite and PAH components; a series of analytically defined radiation fields to which the dust population is to be exposed; and instructions for the desired output. We process this problem using six RT codes participating in this benchmark effort, and compare the results to a reference solution computed with the publicly available dust emission code DustEM. The participating codes implement...

  15. Quantum computing applied to calculations of molecular energies: CH2 benchmark.

    Science.gov (United States)

    Veis, Libor; Pittner, Jiří

    2010-11-21

    Quantum computers are appealing for their ability to solve some tasks much faster than their classical counterparts. It was shown in [Aspuru-Guzik et al., Science 309, 1704 (2005)] that they, if available, would be able to perform the full configuration interaction (FCI) energy calculations with a polynomial scaling. This is in contrast to conventional computers where FCI scales exponentially. We have developed a code for simulation of quantum computers and implemented our version of the quantum FCI algorithm. We provide a detailed description of this algorithm and the results of the assessment of its performance on the four lowest lying electronic states of CH(2) molecule. This molecule was chosen as a benchmark, since its two lowest lying (1)A(1) states exhibit a multireference character at the equilibrium geometry. It has been shown that with a suitably chosen initial state of the quantum register, one is able to achieve the probability amplification regime of the iterative phase estimation algorithm even in this case.
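
    The energy readout in this family of algorithms rests on the eigenphase relation (sign and normalization conventions vary between papers): propagating an eigenstate under $U = e^{-iH\tau}$ gives

    \[
      U\,|\psi_n\rangle = e^{-iE_n\tau}\,|\psi_n\rangle = e^{2\pi i \phi_n}\,|\psi_n\rangle ,
      \qquad E_n = -\frac{2\pi\phi_n}{\tau},
    \]

    so estimating $\phi_n$ to $m$ bits fixes $E_n$ to a precision of order $2\pi/(\tau\,2^m)$, and the success probability is governed by the overlap $|\langle\psi_{\mathrm{init}}|\psi_n\rangle|^2$, which is why the multireference character of the lowest $^1A_1$ states makes the choice of initial state critical.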

  16. A proposal of benchmark calculation on reactor physics for metallic fueled and MOX fueled LMFBR based upon mock-up experiment at FCA

    Energy Technology Data Exchange (ETDEWEB)

    Oigawa, Hiroyuki; Iijima, Susumu; Sakurai, Takeshi; Okajima, Shigeaki; Andoh, Masaki; Nemoto, Tatsuo; Kato, Yuichi; Osugi, Toshitaka [Dept. of Nuclear Energy System, Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan)

    2000-02-01

    In order to assess the validity of the cross section library for fast reactor physics, a set of benchmark calculations is proposed. The benchmark calculation is based upon mock-up experiments at three FCA cores with various compositions of the central test regions, two of which were mock-ups of metallic fueled LMFBRs, and the other was a mock-up of a mixed oxide fueled LMFBR. One of the metallic cores included enriched uranium in the test region, while the others did not. Physics parameters to be calculated are criticality, reaction rate ratios, plutonium and B{sub 4}C sample worths, sodium void reactivity worth, and the Doppler reactivity worth of {sup 238}U; the worth quantities are reactivity differences between perturbed and reference configurations, as recalled below. Homogenized atomic number densities and various correction factors are given so that anyone can easily perform a diffusion calculation in a two-dimensional RZ model and compare the results with the experiments. The validity of the correction factors is demonstrated by varying the calculation method and the nuclear data file used. (author)
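
    For orientation, a sample or void worth is obtained from the two eigenvalues as

    \[
      \Delta\rho = \rho_{\mathrm{pert}} - \rho_{\mathrm{ref}}
                 = \frac{1}{k_{\mathrm{ref}}} - \frac{1}{k_{\mathrm{pert}}} ,
    \]

    usually quoted in pcm (1 pcm = $10^{-5}$).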

  17. Uncertainties of the KIKO3D-ATHLET calculations using the Kalinin-3 benchmark (Phase II) data

    Energy Technology Data Exchange (ETDEWEB)

    Panka, Istvan; Hegyi, Gyoergy; Maraczy, Csaba; Kereszturi, Andras [Hungarian Academy of Sciences, Centre for Energy Research, Budapest (Hungary). Reactor Analysis Dept.

    2016-09-15

    The best estimate simulation of three-dimensional phenomena in nuclear reactor cores requires the use of coupled neutron physics and thermal-hydraulics calculations. However, these analyses should be supplemented by a survey of the corresponding uncertainties. In this paper the uncertainties of the coupled KIKO3D-ATHLET calculations are presented for a VVER-1000 type core using the OECD NEA Kalinin-3 (Phase II) benchmark data, although only the neutronic uncertainties are considered; further simplifications are applied and discussed. Additionally, this study has been performed in conjunction with the OECD NEA UAM benchmark as well. In the first part of the paper, the uncertainties of the effective multiplication factor, the assembly-wise radial power distribution, the axial power distribution, the rod worth, etc. are presented at steady state. After that, some uncertainties of the transient calculations are discussed for the considered transient, the switch-off of one Main Circulation Pump (MCP).

  18. The calculational VVER burnup Credit Benchmark No.3 results with the ENDF/B-VI rev.5 (1999)

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Gual, Maritza [Centro de Tecnologia Nuclear, La Habana (Cuba). E-mail: mrgual@ctn.isctn.edu.cu

    2000-07-01

    The purpose of this paper is to present the results of the CB3 phase of the VVER calculational benchmark with the recently evaluated nuclear data library ENDF/B-VI Rev. 5 (1999). These results are compared with those obtained by the other participants in the calculations (Czech Republic, Finland, Hungary, Slovakia, Spain and the United Kingdom). The CB3 phase of the VVER calculational benchmark is similar to Phase II-A of the OECD/NEA/INSC BUC Working Group benchmark for PWR. The cases without a burnup profile (BP) were performed with the WIMS/D-4 code. The rest of the cases were carried out with the DOTIII discrete ordinates code. The neutron library used was ENDF/B-VI Rev. 5 (1999). WIMS/D-4 (69 groups) is used to collapse cross sections from ENDF/B-VI Rev. 5 (1999) to a 36-group working library for the 2-D calculations. This work also comprises the results of CB1 (also obtained with ENDF/B-VI Rev. 5 (1999)) and of CB3 for the cases with a burnup of 30 MWd/TU and cooling times of 1 and 5 years, and for the case with a burnup of 40 MWd/TU and a cooling time of 1 year. (author)
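
    The 69-to-36-group condensation mentioned here is a flux-weighted collapse; a minimal sketch of the operation (the six-group numbers below are hypothetical, not the WIMS group structure):

    import numpy as np

    def collapse(sigma_fine, flux_fine, coarse_map):
        """Flux-weighted group collapse:
        sigma_G = sum_{g in G} sigma_g phi_g / sum_{g in G} phi_g."""
        n_coarse = coarse_map.max() + 1
        num = np.bincount(coarse_map, weights=sigma_fine * flux_fine,
                          minlength=n_coarse)
        den = np.bincount(coarse_map, weights=flux_fine, minlength=n_coarse)
        return num / den

    sigma = np.array([1.2, 1.5, 2.0, 8.0, 20.0, 45.0])      # barns (hypothetical)
    phi   = np.array([0.30, 0.40, 0.20, 0.05, 0.03, 0.02])  # relative flux
    gmap  = np.array([0, 0, 0, 1, 1, 1])                    # fine -> coarse group
    print(collapse(sigma, phi, gmap))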

  19. Comparison of Two Approaches for Nuclear Data Uncertainty Propagation in MCNPX for Selected Fast Spectrum Critical Benchmarks

    Science.gov (United States)

    Zhu, T.; Rochman, D.; Vasiliev, A.; Ferroukhi, H.; Wieselquist, W.; Pautz, A.

    2014-04-01

    Nuclear data uncertainty propagation based on stochastic sampling (SS) is becoming more attractive while leveraging modern computer power. Two variants of the SS approach are compared in this paper. The Total Monte Carlo (TMC) method by the Nuclear Research and Consultancy Group (NRG) generates perturbed ENDF-6-formatted nuclear data by varying nuclear reaction model parameters. At Paul Scherrer Institute (PSI) the Nuclear data Uncertainty Stochastic Sampling (NUSS) system generates perturbed ACE-formatted nuclear data files by applying multigroup nuclear data covariances onto pointwise ACE-formatted nuclear data. Uncertainties of 239Pu and 235U from ENDF/B-VII.1, ZZ-SCALE6/COVA-44G and TENDL covariance libraries are considered in NUSS and propagated in MCNPX calculations for well-studied Jezebel and Godiva fast spectrum critical benchmarks. The corresponding uncertainty results obtained by TMC are compared with NUSS results and the deterministic Sensitivity/Uncertainty method of TSUNAMI-3D from SCALE6 package is also applied to serve as a separate verification. The discrepancies in the propagated 239Pu and 235U uncertainties due to method and covariance differences are discussed.
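
    In the TMC variant, each random library is run through the transport code, so the observed spread in keff contains both the nuclear-data component and the Monte Carlo statistical component; the two separate in quadrature. A sketch with synthetic numbers (all values invented for illustration):

    import numpy as np

    rng = np.random.default_rng(42)
    sigma_nd_true = 0.0070   # nuclear-data spread injected by hand
    sigma_stat = 0.0010      # per-run MC statistical sigma
    k_runs = (1.0000
              + rng.normal(0.0, sigma_nd_true, 300)
              + rng.normal(0.0, sigma_stat, 300))

    sigma_obs = k_runs.std(ddof=1)
    sigma_nd = np.sqrt(sigma_obs**2 - sigma_stat**2)  # quadrature subtraction
    print(f"observed {sigma_obs:.5f}, nuclear-data part {sigma_nd:.5f}")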

  20. Criticality safety calculations of the Soreq research reactor storage pool

    Energy Technology Data Exchange (ETDEWEB)

    Caner, M.; Hirshfeld, H.; Nagler, A.; Silverman, I.; Bettan, M. [Soreq Nuclear Research Center, Yavne 81800 (Israel); Levine, S.H. [Penn State University, University Park 16802 (United States)

    2001-07-01

    The IRR-1 spent fuel is to be relocated in a storage pool. The present paper describes the actual facility and summarizes the Monte Carlo criticality safety calculations. The fuel elements are to be placed inside cadmium boxes to reduce their reactivity. The fuel element is 7.6 cm by 8.0 cm in the horizontal plane. The cadmium box is effectively 9.7 cm by 9.7 cm, providing a significant amount of water between the cadmium and the fuel element. The present calculations show that the spent fuel storage pool is criticality safe even for fresh fuel elements. (author)

  1. Iterative acceleration methods for Monte Carlo and deterministic criticality calculations

    Energy Technology Data Exchange (ETDEWEB)

    Urbatsch, T.J.

    1995-11-01

    If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors.
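
    The idea behind the fission matrix acceleration is compact enough to sketch: tally F[i, j], the expected number of next-generation fission neutrons born in region i per fission neutron born in region j, and take the dominant eigenpair as the source and k estimate. A toy illustration with an invented five-region matrix:

    import numpy as np

    def dominant_eig(F, tol=1e-10, max_iters=10_000):
        """Power iteration for the dominant eigenpair of a fission matrix."""
        s = np.ones(F.shape[0]) / F.shape[0]
        k = 0.0
        for _ in range(max_iters):
            s_new = F @ s
            k_new = s_new.sum()      # eigenvalue estimate (s sums to 1)
            s_new /= k_new
            if abs(k_new - k) < tol:
                break
            s, k = s_new, k_new
        return k_new, s_new

    n = 5
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    F = 0.55 * np.exp(-np.abs(i - j) / 1.5)   # invented exchange kernel
    k, source = dominant_eig(F)
    print(f"k = {k:.4f}, source = {np.round(source, 3)}")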

  2. Benchmark calculation of p-3H and n-3He scattering

    CERN Document Server

    Viviani, M; Lazauskas, R; Fonseca, A C; Kievsky, A; Marcucci, L E

    2016-01-01

    p-3H and n-3He scattering in the energy range above the n-3He but below the d-d thresholds is studied by solving the 4-nucleon problem with a realistic nucleon-nucleon interaction. Three different methods -- Alt, Grassberger and Sandhas, Hyperspherical Harmonics, and Faddeev-Yakubovsky -- have been employed and their results for both elastic and charge-exchange processes are compared. We observe a good agreement between the three different methods, thus the obtained results may serve as a benchmark. A comparison with the available experimental data is also reported and discussed.

  3. Nuclear data uncertainty propagation for a lead-cooled fast reactor: Combining TMC with criticality benchmarks for improved accuracy

    OpenAIRE

    Alhassan, Erwin

    2014-01-01

    For the successful deployment of advanced nuclear systems and for optimization of current reactor designs, high quality and accurate nuclear data are required. Before nuclear data can be used in applications, they are first evaluated, benchmarked against integral experiments and then converted into formats usable for applications. The evaluation process in the past was usually done by using differential experimental data which was then complemented with nuclear model calculations. This trend ...

  4. Criticality calculations with MCNP™: A primer

    Energy Technology Data Exchange (ETDEWEB)

    Mendius, P.W. [ed.; Harmon, C.D. II; Busch, R.D.; Briesmeister, J.F.; Forster, R.A.

    1994-08-01

    The purpose of this Primer is to assist the nuclear criticality safety analyst in performing computer calculations using the Monte Carlo code MCNP. Because of the closure of many experimental facilities, reliance on computer simulation is increasing. Often the analyst has little experience with the specific codes available at his/her facility. This Primer helps the analyst understand and use the MCNP Monte Carlo code for nuclear criticality analyses. It assumes no knowledge of or particular experience with Monte Carlo codes in general or with MCNP in particular. The document begins with a Quickstart chapter that introduces the basic concepts of using MCNP. The following chapters expand on those ideas, presenting a range of problems from simple cylinders to 3-dimensional lattices for calculating keff confidence intervals. Input files and results for all problems are included. The Primer can be used alone, but it is best used in conjunction with the MCNP4A manual. After completing the Primer, a criticality analyst should be capable of performing and understanding a majority of the calculations that arise in the field of nuclear criticality safety.
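
    As a minimal illustration of the keff confidence-interval arithmetic the Primer teaches (the cycle values below are synthetic, not MCNP output; real cycle estimates are also serially correlated, which MCNP accounts for):

        import numpy as np

        # Synthetic active-cycle k-eff estimates; inactive (source-settling)
        # cycles are assumed to have been discarded already.
        k_cycles = np.random.default_rng(1).normal(0.997, 0.003, size=200)

        k_mean = k_cycles.mean()
        std_err = k_cycles.std(ddof=1) / np.sqrt(k_cycles.size)
        print(f"k-eff = {k_mean:.5f} +/- {1.96 * std_err:.5f} (approx. 95% interval)")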

  5. Calculation of Critical Values for Somerville's FDR Procedures

    Directory of Open Access Journals (Sweden)

    Paul N. Somerville

    2007-04-01

    A Fortran 95 program has been written to calculate critical values for the step-up and step-down FDR procedures developed by Somerville (2004). The program allows for arbitrary selection of the number of hypotheses, FDR rate, one- or two-sided hypotheses, common correlation coefficient of the test statistics, and degrees of freedom. An MCV (minimum critical value) may be specified, or the program will calculate a specified number of critical values or steps in an FDR procedure. The program can also be used to efficiently ascertain an upper bound on the number of hypotheses which the procedure will reject, given either the values of the test statistics or their p values. Limiting the number of steps in an FDR procedure can be used to control the number or proportion of false discoveries (Somerville and Hemmelmann 2007). Using the program to calculate the largest critical values makes possible efficient use of the FDR procedures for very large numbers of hypotheses.
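
    Somerville's correlation-aware critical values require the Fortran program itself, but the step-up mechanics they plug into can be sketched with the simpler Benjamini-Hochberg constants c_i = i*alpha/m (an illustrative stand-in, not Somerville's values):

        import numpy as np

        def step_up_reject(p_values, alpha=0.05):
            """Return the number of hypotheses rejected by a step-up FDR rule:
            the largest r such that the r-th smallest p value is <= c_r."""
            p = np.sort(np.asarray(p_values, dtype=float))
            m = p.size
            crit = alpha * np.arange(1, m + 1) / m      # BH critical values c_i
            hits = np.nonzero(p <= crit)[0]
            return 0 if hits.size == 0 else int(hits[-1]) + 1

        print(step_up_reject([0.001, 0.008, 0.039, 0.041, 0.20, 0.70]))  # prints 2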

  6. Benchmark calculation for radioactivity inventory using MAXS library based on JENDL-4.0 and JEFF-3.0/A for decommissioning BWR plants

    Directory of Open Access Journals (Sweden)

    Tanaka Ken-ichi

    2016-01-01

    We performed a benchmark calculation for the radioactivity induced in the Primary Containment Vessel (PCV) of a Boiling Water Reactor (BWR) using the MAXS library, which was developed by collapsing cross sections with neutron energy spectra in the PCV of the BWR. Radioactivities due to neutron irradiation were measured using activation foils of gold (Au) and nickel (Ni) at thirty locations in the PCV. As the benchmark calculation, we performed activation calculations of the foils with the SCALE5.1/ORIGEN-S code using the irradiation conditions of each foil location. We compared calculations with measurements to assess the effectiveness of the MAXS library.

  7. Evaluation of the HTTR criticality and burnup calculations with continuous-energy and multigroup cross sections

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Min-Han; Wang, Jui-Yu [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China); Sheu, Rong-Jiun, E-mail: rjsheu@mx.nthu.edu.tw [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China); Department of Engineering System and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China); Liu, Yen-Wan Hsueh [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China); Department of Engineering System and Science, National Tsing Hua University, 101, Section 2, Kung-Fu Road, Hsinchu 30013, Taiwan (China)

    2014-05-01

    The High Temperature Engineering Test Reactor (HTTR) in Japan is a helium-cooled graphite-moderated reactor designed and operated for the future development of high-temperature gas-cooled reactors. Two detailed full-core models of HTTR have been established by using SCALE6 and MCNP5/X, respectively, to study its neutronic properties. Several benchmark problems were repeated first to validate the calculation models. Careful code-to-code comparisons were made to ensure that two calculation models are both correct and equivalent. Compared with experimental data, the two models show a consistent bias of approximately 20–30 mk overestimation in effective multiplication factor for a wide range of core states. Most of the bias could be related to the ENDF/B-VII.0 cross-section library or incomplete modeling of impurities in graphite. After that, a series of systematic analyses was performed to investigate the effects of cross sections on the HTTR criticality and burnup calculations, with special interest in the comparison between continuous-energy and multigroup results. Multigroup calculations in this study were carried out in 238-group structure and adopted the SCALE double-heterogeneity treatment for resonance self-shielding. The results show that multigroup calculations tend to underestimate the system eigenvalue by a constant amount of ∼5 mk compared to their continuous-energy counterparts. Further sensitivity studies suggest the differences between multigroup and continuous-energy results appear to be temperature independent and also insensitive to burnup effects.

  8. Criticality calculations with MCNP™: A primer

    Energy Technology Data Exchange (ETDEWEB)

    Harmon, C.D. II; Busch, R.D.; Briesmeister, J.F.; Forster, R.A. [New Mexico Univ., Albuquerque, NM (United States)

    1994-06-06

    With the closure of many experimental facilities, the nuclear criticality safety analyst increasingly is required to rely on computer calculations to identify safe limits for the handling and storage of fissile materials. However, in many cases, the analyst has little experience with the specific codes available at his/her facility. This primer will help you, the analyst, understand and use the MCNP Monte Carlo code for nuclear criticality safety analyses. It assumes that you have a college education in a technical field. There is no assumption of familiarity with Monte Carlo codes in general or with MCNP in particular. Appendix A gives an introduction to Monte Carlo techniques. The primer is designed to teach by example, with each example illustrating two or three features of MCNP that are useful in criticality analyses. Beginning with a Quickstart chapter, the primer gives an overview of the basic requirements for MCNP input and allows you to run a simple criticality problem with MCNP. This chapter is not designed to explain either the input or the MCNP options in detail; but rather it introduces basic concepts that are further explained in following chapters. Each chapter begins with a list of basic objectives that identify the goal of the chapter, and a list of the individual MCNP features that are covered in detail in the unique chapter example problems. It is expected that on completion of the primer you will be comfortable using MCNP in criticality calculations and will be capable of handling 80 to 90 percent of the situations that normally arise in a facility. The primer provides a set of basic input files that you can selectively modify to fit the particular problem at hand.

  9. A benchmark on the calculation of kinetic parameters based on reactivity effect experiments in the CROCUS reactor

    Energy Technology Data Exchange (ETDEWEB)

    Paratte, J.M. [Laboratory for Reactor Physics and Systems Behaviour (LRS), Paul Scherrer Institute, CH-5232 Villigen PSI (Switzerland); Frueh, R. [Ecole Polytechnique Federale de Lausanne (EPFL), CH-1015 Lausanne (Switzerland); Kasemeyer, U. [Laboratory for Reactor Physics and Systems Behaviour (LRS), Paul Scherrer Institute, CH-5232 Villigen PSI (Switzerland); Kalugin, M.A. [Kurchatov Institute, 123182 Moscow (Russian Federation); Timm, W. [Framatome-ANP, D-91050 Erlangen (Germany); Chawla, R. [Laboratory for Reactor Physics and Systems Behaviour (LRS), Paul Scherrer Institute, CH-5232 Villigen PSI (Switzerland); Ecole Polytechnique Federale de Lausanne (EPFL), CH-1015 Lausanne (Switzerland)

    2006-05-15

    Measurements in the CROCUS reactor at EPFL, Lausanne, are reported for the critical water level and the inverse reactor period for several different sets of delayed supercritical conditions. The experimental configurations were also calculated by four different calculation methods. For each of the supercritical configurations, the absolute reactivity value has been determined in two different ways, viz.: (i) through direct comparison of the multiplication factor obtained employing a given calculation method with the corresponding value for the critical case (calculated reactivity: ρ_calc); (ii) by application of the inhour equation using the kinetic parameters obtained for the critical configuration and the measured inverse reactor period (measured reactivity: ρ_meas). The calculated multiplication factors for the reference critical configuration, as well as ρ_calc for the supercritical cases, are found to be in good agreement. However, the values of ρ_meas produced by two of the applied calculation methods differ appreciably from the corresponding ρ_calc values, clearly indicating deficiencies in the kinetic parameters obtained from these methods.
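
    Determination (ii) above maps a measured inverse period to reactivity through the inhour equation. A sketch with generic six-group U-235 delayed-neutron constants (the CROCUS kinetic parameters themselves are what the benchmark compares; the generation time below is an assumed value):

        # rho = Lambda*omega + sum_i beta_i*omega/(omega + lambda_i),
        # with omega the measured inverse asymptotic period in 1/s.
        beta = [0.000247, 0.001385, 0.001222, 0.002645, 0.000832, 0.000169]
        lam = [0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01]   # decay constants, 1/s
        LAMBDA = 5.0e-5                                    # generation time, s (assumed)

        def inhour_reactivity(omega):
            return LAMBDA * omega + sum(b * omega / (omega + l) for b, l in zip(beta, lam))

        print(f"rho = {inhour_reactivity(1.0 / 50.0):.5f}")  # for a measured 50 s period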

  10. A Benchmark Calculation for the Nd Scattering with a Model Three-Body Force

    CERN Document Server

    Phyu, Aye Mya; Golak, Jacek; Oo, Htun Htun; Witala, Henryk; Gloeckle, Walter

    2012-01-01

    Using the complex energy method, the problem of nucleon-deuteron scattering is solved with a simple three-body force having a separable form. Our results are compared with the results of modern direct two-variable calculations and a good agreement is found. This forms a firm base for other applications of the complex energy method.

  11. Benchmark calculation of no-core Monte Carlo shell model in light nuclei

    CERN Document Server

    Abe, T; Otsuka, T; Shimizu, N; Utsuno, Y; Vary, J P; 10.1063/1.3584062

    2011-01-01

    The Monte Carlo shell model is applied for the first time to no-core shell-model calculations in light nuclei. The results are compared with those of the full configuration interaction; the agreement between them is within a few percent at most.

  12. A critical assembly designed to measure neutronic benchmarks in support of the space nuclear thermal propulsion program

    Science.gov (United States)

    Parma, Edward J.; Ball, Russell M.; Hoovler, Gary S.; Selcow, Elizabeth C.; Cerbone, Ralph J.

    1993-01-01

    A reactor designed to perform criticality experiments in support of the Space Nuclear Thermal Propulsion program is currently in operation at the Sandia National Laboratories' reactor facility. The reactor is a small, water-moderated system that uses highly enriched uranium particle fuel in a 19-element configuration. Its purpose is to obtain neutronic measurements under a variety of experimental conditions that are subsequently used to benchmark reactor-design computer codes. Brookhaven National Laboratory, Babcock & Wilcox, and Sandia National Laboratories participated in determining the reactor's performance requirements, design, follow-on experimentation, and in obtaining the licensing approvals. Brookhaven National Laboratory is primarily responsible for the analytical support, Babcock & Wilcox the hardware design, and Sandia National Laboratories the operational safety. All of the team members participate in determining the experimentation requirements, performance, and data reduction. Initial criticality was achieved in October 1989. An overall description of the reactor is presented along with key design features and safety-related aspects.

  13. 500-MeV electron beam bench-mark experiments and calculations

    Energy Technology Data Exchange (ETDEWEB)

    Farley, E.; Crase, K.; Selway, D.

    1979-12-01

    Experiments measuring the energy deposited by electron beams were performed to provide benchmarks against which to evaluate our HANDYL76 electron beam computer code. The experiments, done at Stanford's Mk III accelerator, measured dose-versus-depth and dose-versus-radius profiles induced in layered aluminum targets by 500-MeV electrons. The dose was measured by passive thermoluminescence and photographic film placed between aluminum plates. The calculations predict a dose-versus-radius profile that is forward-peaked on axis after the beam passes through a 200-cm air gap; the experimental measurements do not show this peak. This discrepancy indicates there may be a problem in using HANDYL76 to calculate deep penetration of a target with a large gap.

  14. Benchmarking DFT methods with small basis sets for the calculation of halogen-bond strengths.

    Science.gov (United States)

    Siiskonen, Antti; Priimagi, Arri

    2017-02-01

    In recent years, halogen bonding has become an important design tool in crystal engineering, supramolecular chemistry and biosciences. The fundamentals of halogen bonding have been studied extensively with high-accuracy computational methods. Due to its non-covalency, the use of triple-zeta (or larger) basis sets is often recommended when studying halogen bonding. However, in the large systems often encountered in supramolecular chemistry and biosciences, large basis sets can make the calculations far too slow. Therefore, small basis sets, which would combine high computational speed and high accuracy, are in great demand. This study focuses on comparing how well density functional theory (DFT) methods employing small, double-zeta basis sets can estimate halogen-bond strengths. Several methods with triple-zeta basis sets are included for comparison. Altogether, 46 DFT methods were tested using two data sets of 18 and 33 halogen-bonded complexes for which the complexation energies have been previously calculated with the high-accuracy CCSD(T)/CBS method. The DGDZVP basis set performed far better than other double-zeta basis sets, and it even outperformed the triple-zeta basis sets. Due to its small size, it is well-suited to studying halogen bonding in large systems.

  15. Benchmark calculation of n-3H and p-3He scattering

    CERN Document Server

    Viviani, M; Lazauskas, R; Carbonell, J; Fonseca, A C; Kievsky, A; Marcucci, L E; Rosati, S

    2011-01-01

    The n-3H and p-3He elastic phase-shifts below the trinucleon disintegration thresholds are calculated by solving the 4-nucleon problem with three different realistic nucleon-nucleon interactions (the I-N3LO model by Entem and Machleidt, the Argonne v18 potential model, and a low-k model derived from the CD-Bonn potential). Three different methods -- Alt, Grassberger and Sandhas, Hyperspherical Harmonics, and Faddeev-Yakubovsky -- have been used and their respective results are compared. For both n-3H and p-3He we observe a rather good agreement between the three different theoretical methods. We also compare the theoretical predictions with the available experimental data, confirming the large underprediction of the p-3He analyzing power.

  16. Simplified generalized-gradient approximation and anharmonicity: Benchmark calculations on molecules

    Science.gov (United States)

    Patton, David C.; Porezag, Dirk V.; Pederson, Mark R.

    1997-03-01

    Recent implementational improvements of the generalized-gradient approximation (GGA) have led to a simplified version which is parametrized entirely from fundamental constants, easier to use, and possibly easier to improve. We have performed detailed calculations of the geometries, atomization energies, vibrational energies, and infrared and Raman spectra of many first- and second-row dimers as well as some polyatomic molecules. For atomization and vibrational energies, we find that the simplified version of the GGA leads to results similar to the original version. We comment on the fact that GGA-induced changes of hydrogenic bonding are different from those for the other atoms in the periodic table, but still an improvement over the local approximations to density-functional theory. In addition to a harmonic treatment of the vibrational modes, we include the contributions of anharmonicity as well. With the exception of light hydrogen-containing molecules, the anharmonic corrections are quite small.

  17. Total molecular photoionization cross-sections by algebraic diagrammatic construction-Stieltjes-Lanczos method: Benchmark calculations

    Science.gov (United States)

    Ruberti, M.; Yun, R.; Gokhberg, K.; Kopelke, S.; Cederbaum, L. S.; Tarantelli, F.; Averbukh, V.

    2013-10-01

    In [K. Gokhberg, V. Vysotskiy, L. S. Cederbaum, L. Storchi, F. Tarantelli, and V. Averbukh, J. Chem. Phys. 130, 064104 (2009)] we introduced a new L² ab initio method for the calculation of total molecular photoionization cross-sections. The method is based on the ab initio description of discretized photoionized molecular states within the many-electron Green's function approach, known as algebraic diagrammatic construction (ADC), and on the application of Stieltjes-Chebyshev moment theory to Lanczos pseudospectra of the ADC electronic Hamiltonian. Here we establish the accuracy of the new technique by comparing the ADC-Lanczos-Stieltjes cross-sections in the valence ionization region to the experimental ones for a series of eight molecules of first-row elements: HF, NH3, H2O, CO2, H2CO, CH4, C2H2, and C2H4. We find that the use of the second-order ADC technique [ADC(2)] that includes double electronic excitations leads to a substantial systematic improvement over the first-order method [ADC(1)] and to a good agreement with experiment for photon energies below 80 eV. The use of extended second-order ADC theory [ADC(2)x] leads to a smaller further improvement. Above 80 eV photon energy all three methods lead to significant deviations from the experimental values which we attribute to the use of Gaussian single-electron bases. Our calculations show that the ADC(2)-Lanczos-Stieltjes technique is a reliable and efficient ab initio tool for theoretical prediction of total molecular photoionization cross-sections in the valence region.

  18. Benchmarking the DFT+U method for thermochemical calculations of uranium molecular compounds and solids.

    Science.gov (United States)

    Beridze, George; Kowalski, Piotr M

    2014-12-18

    The ability to perform feasible and reliable computations of the thermochemical properties of chemically complex actinide-bearing materials would be of great importance for nuclear engineering. Unfortunately, density functional theory (DFT), which in many instances is the only affordable ab initio method, often fails for actinides. Among various shortcomings, it leads to wrong estimates of the enthalpies of reactions between actinide-bearing compounds, putting the applicability of the DFT approach to the modeling of thermochemical properties of actinide-bearing materials into question. Here we test the performance of the DFT+U method, a computationally affordable extension of DFT that explicitly accounts for the correlations between f-electrons, for the prediction of the thermochemical properties of simple uranium-bearing molecular compounds and solids. We demonstrate that the DFT+U approach significantly improves the description of reaction enthalpies for the uranium-bearing gas-phase molecular compounds and solids, and the deviations from the experimental values are comparable to those obtained with much more computationally demanding methods. Good results are obtained with Hubbard U parameter values derived using the linear response method of Cococcioni and de Gironcoli. We find that the value of the on-site Coulomb repulsion, represented by the Hubbard U parameter, strongly depends on the oxidation state of the uranium atom. Last but not least, we demonstrate that thermochemistry data can be successfully used to estimate the value of the Hubbard U parameter needed for DFT+U calculations.

  19. Validation and Benchmarking of a Practical Free Magnetic Energy and Relative Magnetic Helicity Budget Calculation in Solar Magnetic Structures

    CERN Document Server

    Moraitis, K; Georgoulis, M K; Archontis, V

    2014-01-01

    In earlier works we introduced and tested a nonlinear force-free (NLFF) method designed to self-consistently calculate the free magnetic energy and the relative magnetic helicity budgets of the corona of observed solar magnetic structures. The method requires, in principle, only a single, photospheric or low-chromospheric, vector magnetogram of a quiet-Sun patch or an active region and performs calculations in the absence of three-dimensional magnetic and velocity-field information. In this work we strictly validate this method using three-dimensional coronal magnetic fields. Benchmarking employs both synthetic, three-dimensional magnetohydrodynamic simulations and nonlinear force-free field extrapolations of the active-region solar corona. We find that our time-efficient NLFF method provides budgets that differ from those of more demanding semi-analytical methods by a factor of ~3, at most. This difference is expected from the physical concept and the construction of the method. Temporal correlations show mo...

  20. Mars/master coupled system calculation of the OECD MSLB benchmark exercise 3 with refined core thermal-hydraulic nodalization

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, J.J.; Joo, H.G.; Cho, B.O.; Zee, S.Q.; Lee, W.J. [Korea Atomic Energy Research Inst., Daejeon (Korea, Republic of)

    2001-07-01

    To assess the performance of the KAERI coupled multi-dimensional system thermal-hydraulics (T/H) and three-dimensional (3-D) kinetics code, MARS/MASTER, Exercise III of the OECD main steam line break benchmark problem is solved. The coupled code is capable of employing an individual flow channel for each fuel assembly as well as lumped ones. The basic analysis model of the reference plant consists of four major components: a 3-D core neutronics model, a 3-D thermal-hydraulic model of the reactor vessel employing lumped flow channels, a refined core T/H model, and a 1-D T/H model of the coolant system. Calculations were performed with and without the refined core T/H model. The results of the basic calculation performed without the refined core T/H model show that the core power distribution evolves to a highly localized shape due to the presence of a stuck rod, as well as an asymmetric flow distribution in the reactor core. The results of the refined core T/H model indicate that the local peaking factor can be reduced by as much as 22% through accurate representation of the local T/H feedback effects. Nonetheless, the global transient behaviors are not significantly affected. (author)

  1. Preliminary assessment of Geant4 HP models and cross section libraries by reactor criticality benchmark calculations

    DEFF Research Database (Denmark)

    Cai, Xiao-Xiao; Llamas-Jansa, Isabel; Mullet, Steven

    2013-01-01

    ... to reactor modelling. Before version 9.5, the Geant4 HP thermal scattering model (i.e. the S(α,β) model) supported only three bound isotopes, namely H in water and polyethylene, and C in graphite. Newly supported materials include D in heavy water, O and Be in beryllium oxide, H and Zr in zirconium hydride, U and O in uranium dioxide, Al metal, Be metal, and Fe metal. The native HP cross-section library G4NDL does not include data for elements with atomic number larger than 92. Therefore, transuranic elements, which have an impact for a realistic reactor, cannot be simulated by the combination of the HP ...

  2. Stationarity and source convergence in monte carlo criticality calculation.

    Energy Technology Data Exchange (ETDEWEB)

    Ueki, T. (Taro); Brown, F. B. (Forrest B.)

    2002-01-01

    In Monte Carlo (MC) criticality calculations, source error propagation through the stationary cycles and source convergence in the settling (inactive) cycles are both dominated by the dominance ratio (DR) of the fission kernel, i.e., the ratio of the second largest to the largest eigenvalue. For symmetric two-fissile-component systems with DR close to unity, the extinction of fission source sites can occur in one of the components even when the initial source is symmetric and the number of histories per cycle is larger than one thousand. When such a system is made slightly asymmetric, the neutron effective multiplication factor (keff) during the inactive cycles does not reflect the convergence to the stationary source distribution. To overcome this problem, relative entropy (Kullback-Leibler distance) is applied to a slightly asymmetric two-fissile-component problem with a dominance ratio of 0.9925. Numerical results show that relative entropy is effective as a posterior diagnostic tool.
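
    The diagnostic can be sketched in a few lines (the source data below are faked; in practice p and q are binned fission-source distributions from successive cycles): the relative entropy D(p||q) between each cycle's source distribution and a stationary reference should settle near zero once the source has converged.

        import numpy as np

        def relative_entropy(p, q, eps=1e-12):
            """Kullback-Leibler distance D(p||q) between two binned distributions."""
            p = np.asarray(p, dtype=float) + eps
            q = np.asarray(q, dtype=float) + eps
            p, q = p / p.sum(), q / q.sum()
            return float(np.sum(p * np.log(p / q)))

        # Fake cycle-wise source fractions in a two-fissile-component system,
        # drifting from a symmetric guess toward a slightly asymmetric split.
        rng = np.random.default_rng(0)
        cycles = [rng.multinomial(10_000, [0.5 - d, 0.5 + d]) / 10_000
                  for d in np.linspace(0.0, 0.05, 30)]

        reference = cycles[-1]
        for i in (0, 10, 20, 29):
            print(f"cycle {i:2d}: D = {relative_entropy(cycles[i], reference):.2e}")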

  3. Modeling coupled blast/structure interaction with Zapotec, benchmark calculations for the Conventional Weapon Effects Backfill (CONWEB) tests.

    Energy Technology Data Exchange (ETDEWEB)

    Bessette, Gregory Carl

    2004-09-01

    Modeling the response of buried reinforced concrete structures subjected to close-in detonations of conventional high explosives poses a challenge for a number of reasons. Foremost, there is the potential for coupled interaction between the blast and structure. Coupling enters the problem whenever the structure deformation affects the stress state in the neighboring soil, which in turn, affects the loading on the structure. Additional challenges for numerical modeling include handling disparate degrees of material deformation encountered in the structure and surrounding soil, modeling the structure details (e.g., modeling the concrete with embedded reinforcement, jointed connections, etc.), providing adequate mesh resolution, and characterizing the soil response under blast loading. There are numerous numerical approaches for modeling this class of problem (e.g., coupled finite element/smooth particle hydrodynamics, arbitrary Lagrange-Eulerian methods, etc.). The focus of this work will be the use of a coupled Euler-Lagrange (CEL) solution approach. In particular, the development and application of a CEL capability within the Zapotec code is described. Zapotec links two production codes, CTH and Pronto3D. CTH, an Eulerian shock physics code, performs the Eulerian portion of the calculation, while Pronto3D, an explicit finite element code, performs the Lagrangian portion. The two codes are run concurrently with the appropriate portions of a problem solved on their respective computational domains. Zapotec handles the coupling between the two domains. The application of the CEL methodology within Zapotec for modeling coupled blast/structure interaction will be investigated by a series of benchmark calculations. These benchmarks rely on data from the Conventional Weapons Effects Backfill (CONWEB) test series. In these tests, a 15.4-lb pipe-encased C-4 charge was detonated in soil at a 5-foot standoff from a buried test structure. The test structure was composed of a

  4. A BENCHMARKING ANALYSIS FOR FIVE RADIONUCLIDE VADOSE ZONE MODELS (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, AND CHAIN 2D) IN SOIL SCREENING LEVEL CALCULATIONS

    Science.gov (United States)

    Five radionuclide vadose zone models with different degrees of complexity (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, and CHAIN 2D) were selected for use in soil screening level (SSL) calculations. A benchmarking analysis between the models was conducted for a radionuclide (99Tc) rele...

  5. Evaluation of PWR and BWR assembly benchmark calculations. Status report of EPRI computational benchmark results, performed in the framework of the Netherlands` PINK programme (Joint project of ECN, IRI, KEMA and GKN)

    Energy Technology Data Exchange (ETDEWEB)

    Gruppelaar, H. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Klippel, H.T. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Kloosterman, J.L. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Hoogenboom, J.E. [Technische Univ. Delft (Netherlands). Interfacultair Reactor Instituut; Leege, P.F.A. de [Technische Univ. Delft (Netherlands). Interfacultair Reactor Instituut; Verhagen, F.C.M. [Keuring van Electrotechnische Materialen NV, Arnhem (Netherlands); Bruggink, J.C. [Gemeenschappelijke Kernenergiecentrale Nederland N.V., Dodewaard (Netherlands)

    1993-11-01

    Benchmark results of the Dutch PINK working group on calculational benchmarks for single pin cells and multipin assemblies as defined by EPRI are presented and evaluated. First, a short update of the methods used by the various institutes involved is given, as well as an update of the status with respect to previously performed pin-cell calculations. Problems detected in previous pin-cell calculations are inspected more closely. A detailed discussion of the results of the multipin assembly calculations is given. The assembly consists of 9 pins in a multicell square lattice in which the central pin is filled differently, i.e., a Gd pin for the BWR assembly and a control rod/guide tube for the PWR assembly. The results for pin cells showed rather good overall agreement between the four participants, although BWR pins with a high void fraction turned out to be difficult to calculate. With respect to burnup calculations, good overall agreement for the reactivity swing was obtained, provided that a fine time grid is used. (orig.)

  6. Benchmark Calculations of Energetic Properties of Groups 4 and 6 Transition Metal Oxide Nanoclusters Including Comparison to Density Functional Theory

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Zongtang; Both, Johan; Li, Shenggang; Yue, Shuwen; Aprà, Edoardo; Keçeli, Murat; Wagner, Albert F.; Dixon, David A.

    2016-08-09

    The heats of formation and the normalized clustering energies (NCEs) for the group 4 and group 6 transition metal oxide (TMO) trimers and tetramers have been calculated by the Feller-Peterson-Dixon (FPD) method. The heats of formation predicted by the FPD method do not differ much from those previously derived from the NCEs at the CCSD(T)/aT level except for the CrO3 nanoclusters. New and improved heats of formation for Cr3O9 and Cr4O12 were obtained using PW91 orbitals instead of Hartree-Fock (HF) orbitals. Diffuse functions are necessary to predict accurate heats of formation. The fluoride affinities (FAs) are calculated with the CCSD(T) method. The relative energies (REs) of different isomers, NCEs, electron affinities (EAs), and FAs of (MO2)n (M = Ti, Zr, Hf; n = 1-4) and (MO3)n (M = Cr, Mo, W; n = 1-3) clusters have been benchmarked with 55 exchange-correlation DFT functionals including both pure and hybrid types. The absolute errors of the DFT results are mostly less than ±10 kcal/mol for the NCEs and the EAs, and less than ±15 kcal/mol for the FAs. Hybrid functionals usually perform better than the pure functionals for the REs and NCEs. The performance of the two types of functionals in predicting EAs and FAs is comparable. The B1B95 and PBE1PBE functionals provide reliable energetic properties for most isomers. Long-range-corrected pure functionals usually give poor FAs. The standard deviation of the absolute error is always close to the mean error, and the probability distributions of the DFT errors are often not Gaussian (normal). The breadth of the distribution of errors and the maximum probability are dependent on the energy property and the isomer.

  7. Qualification of coupled 3D neutron kinetic/thermal hydraulic code systems by the calculation of a VVER-440 benchmark. Re-connection of an isolated loop

    Energy Technology Data Exchange (ETDEWEB)

    Kotsarev, Alexander; Lizorkin, Mikhail [National Research Centre ' Kurchatov Institute' , Moscow (Russian Federation); Bencik, Marek; Hadek, Jan [UJV Rez, a.s., Rez (Czech Republic); Kozmenkov, Yaroslav; Kliem, Soeren [Helmholtz-Zentrum Dresden-Rossendorf (HZDR) e.V., Dresden (Germany)

    2016-09-15

    The 7th AER dynamic benchmark is a continuation of the efforts to validate the codes systematically for the estimation of the transient behavior of VVER-type nuclear power plants. The main part of the benchmark is the simulation of the re-connection of an isolated circulation loop with low temperature in a VVER-440 plant. This benchmark was calculated by the National Research Centre "Kurchatov Institute" (with the code ATHLET/BIPR-VVER), UJV Rez (with the code RELAP5-3D©) and HZDR (with the code DYN3D/ATHLET). The paper gives an overview of the behavior of the main thermal hydraulic and neutron kinetic parameters in the provided solutions.

  8. Reactivity impact of ¹⁶O thermal elastic-scattering nuclear data for some numerical and critical benchmark systems

    Energy Technology Data Exchange (ETDEWEB)

    Kozier, K. S.; Roubtsov, D. [AECL, Chalk River Laboratories, Chalk River, ON (Canada); Plompen, A. J. M.; Kopecky, S. [EC-JRC, Inst. for Reference Materials and Measurements, Retieseweg 111, 2440 Geel (Belgium)

    2012-07-01

    The thermal neutron elastic-scattering cross-section data for ¹⁶O used in various modern evaluated-nuclear-data libraries were reviewed and found to be generally too high compared with the best available experimental measurements. Some of the proposed revisions to the ENDF/B-VII.0 ¹⁶O data library and recent results from the TENDL system increase this discrepancy further. The reactivity impact of revising the ¹⁶O data downward to be consistent with the best measurements was tested using the JENDL-3.3 ¹⁶O cross-section values and was found to be very small in MCNP5 simulations of the UO₂ and reactor-recycle MOX-fuel cases of the ANS Doppler-defect numerical benchmark. However, large reactivity differences of up to about 14 mk (1400 pcm) were observed using ¹⁶O data files from several evaluated-nuclear-data libraries in MCNP5 simulations of the Los Alamos National Laboratory HEU heavy-water solution thermal critical experiments, which were performed in the 1950s. The latter result suggests that new measurements using HEU in a heavy-water-moderated critical facility, such as the ZED-2 zero-power reactor at the Chalk River Laboratories, might help to resolve the discrepancy between the ¹⁶O thermal elastic-scattering cross-section values and thereby reduce or better define its uncertainty, although additional assessment work would be needed to confirm this. (authors)

  9. Growth and Expansion of the International Criticality Safety Benchmark Evaluation Project and the Newly Organized International Reactor Physics Experiment Evaluation Project

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Lori Scott; Yolanda Rugama; Enrico Satori

    2007-05-01

    Since ICNC 2003, the International Criticality Safety Benchmark Evaluation Project (ICSBEP) has continued to expand its efforts and broaden its scope. Criticality-alarm / shielding type benchmarks and fundamental physics measurements that are relevant to criticality safety applications are not only included in the scope of the project, but benchmark data are also included in the latest version of the handbook. A considerable number of improvements have been made to the searchable database, DICE, and the criticality-alarm / shielding benchmarks and fundamental physics measurements have been included in the database. There were 12 countries participating in the ICSBEP in 2003. That number has increased to 18 with recent contributions of data and/or resources from Brazil, Czech Republic, Poland, India, Canada, and China. South Africa, Germany, Argentina, and Australia have been invited to participate. Since ICNC 2003, the contents of the “International Handbook of Evaluated Criticality Safety Benchmark Experiments” have increased from 350 evaluations (28,000 pages) containing benchmark specifications for 3070 critical or subcritical configurations to 442 evaluations (over 38,000 pages) containing benchmark specifications for 3957 critical or subcritical configurations, 23 criticality-alarm-placement / shielding configurations with multiple dose points for each, and 20 configurations that have been categorized as fundamental physics measurements that are relevant to criticality safety applications in the 2006 Edition of the ICSBEP Handbook. Approximately 30 new evaluations and 250 additional configurations are expected to be added to the 2007 Edition of the Handbook. Since ICNC 2003, a reactor physics counterpart to the ICSBEP, the International Reactor Physics Experiment Evaluation Project (IRPhEP), was initiated. Beginning in 1999, the IRPhEP was conducted as a pilot activity by the Organization of Economic Cooperation and Development (OECD) Nuclear Energy

  10. A benchmark-problem specification and calculation using SENSIBL, a one- and two-dimensional sensitivity and uncertainty analysis code of the AARE system

    Energy Technology Data Exchange (ETDEWEB)

    Muir, D.W.; Davidson, J.W.; Dudziak, D.J.; Davierwalla, D.M.; Higgs, C.E.; Stepanek, J.

    1988-01-01

    The lack of suitable benchmark problems makes it difficult to test sensitivity codes with a covariance library. A benchmark problem has therefore been defined for one- and two-dimensional sensitivity and uncertainty analysis codes and code systems. The problem, representative of a fusion reactor blanket, has a simple, three-zone τ-z geometry containing a D-T fusion neutron source distributed in a central void region surrounded by a thick ⁶LiH annulus. The response of interest is the ⁶Li tritium production per source neutron, T₆. The calculation has been performed with SENSIBL using other codes from the AARE code system as a test of both SENSIBL and the linked, modular system. The calculation was performed using the code system in the standard manner with a covariance data library in the COVFILS-2 format, modified to contain specifically tailored covariance data for H and ⁶Li (Path A). The calculation was also performed by a second method which uses specially perturbed H and Li cross sections (Path B). This method bypasses SENSIBL and allows a hand calculation of the benchmark T₆ uncertainties. The results of Path A and Path B were total uncertainties in T₆ of 0.21% and 0.19%, respectively. The closeness of the results for this challenging test gives confidence that SENSIBL and the AARE system will perform well for realistic sensitivity and uncertainty analyses
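
    The first-order uncertainty propagation behind such codes is the "sandwich rule": the relative variance of the response is S^T C S, with S the vector of relative sensitivities of T₆ to the cross sections and C their relative covariance matrix. A sketch with placeholder numbers (not COVFILS-2 data):

        import numpy as np

        S = np.array([0.30, -0.05, 0.80])           # d(ln T6)/d(ln sigma) per reaction (toy)
        C = np.array([[4.0e-4, 0.0,    0.0],
                      [0.0,    9.0e-4, 0.0],
                      [0.0,    0.0,    1.0e-4]])    # relative covariance matrix (toy)

        rel_var = S @ C @ S
        print(f"relative uncertainty in T6: {100.0 * np.sqrt(rel_var):.2f}%")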

  11. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates.

  12. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Science.gov (United States)

    Shao, Kan; Gift, Jeffrey S; Setzer, R Woodrow

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose-response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean±standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the "hybrid" method and relative deviation approach, we first evaluate six representative continuous dose-response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates.
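
    The relative-deviation BMD definition used in both records above can be made concrete with a sketch (the fitted model and its parameters are invented for illustration; the papers' hybrid method additionally works with tail probabilities of the response distribution):

        import numpy as np
        from scipy.optimize import brentq

        a, b = 100.0, 0.02                     # assumed fitted model m(d) = a*exp(-b*d)

        def mean_response(dose):
            return a * np.exp(-b * dose)

        bmr = 0.10                             # benchmark response: 10% relative deviation
        target = mean_response(0.0) * (1.0 - bmr)
        bmd = brentq(lambda d: mean_response(d) - target, 0.0, 1.0e3)
        print(f"BMD at {bmr:.0%} relative deviation: {bmd:.2f} dose units")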

  13. Monte Carlo Simulation Calculation of Critical Coupling Constant for Continuum \\phi^4_2

    OpenAIRE

    Loinaz, Will; Willey, R. S.

    1997-01-01

    We perform a Monte Carlo simulation calculation of the critical coupling constant for the continuum (λ/4)φ⁴ theory in two dimensions. The critical coupling constant we obtain is [λ/μ²]_crit = 10.24(3).
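
    A hedged sketch of the kind of lattice Metropolis update underlying such a calculation, for the 2D action S = Σ_x [ (1/2) Σ_μ (φ(x+μ)-φ(x))² + (μ₀²/2) φ(x)² + (λ/4) φ(x)⁴ ]; the lattice size, parameters, and sweep count are toys, and extracting the critical coupling additionally requires scans in μ₀², fits, and a continuum extrapolation:

        import numpy as np

        L, mu0sq, lam = 16, -0.5, 1.0
        rng = np.random.default_rng(2)
        phi = rng.normal(size=(L, L))

        def local_action(value, nbr_sum):
            # Terms of S that involve this site's field value (kinetic part
            # reduces to 2*value**2 - value*nbr_sum plus site-independent pieces).
            return (2.0 * value**2 - value * nbr_sum
                    + 0.5 * mu0sq * value**2 + 0.25 * lam * value**4)

        for sweep in range(100):
            for i in range(L):
                for j in range(L):
                    nbr = (phi[(i + 1) % L, j] + phi[(i - 1) % L, j]
                           + phi[i, (j + 1) % L] + phi[i, (j - 1) % L])
                    old, new = phi[i, j], phi[i, j] + rng.normal(scale=0.5)
                    dS = local_action(new, nbr) - local_action(old, nbr)
                    if dS < 0.0 or rng.random() < np.exp(-dS):
                        phi[i, j] = new   # Metropolis accept

        print("|<phi>| after 100 sweeps:", abs(phi.mean()))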

  14. Benchmarking ENDF/B-VII.0

    Science.gov (United States)

    van der Marck, Steven C.

    2006-12-01

    The new major release VII.0 of the ENDF/B nuclear data library has been tested extensively using benchmark calculations. These were based upon MCNP-4C3 continuous-energy Monte Carlo neutronics simulations, together with nuclear data processed using the code NJOY. Three types of benchmarks were used, viz., criticality safety benchmarks, (fusion) shielding benchmarks, and reference systems for which the effective delayed neutron fraction is reported. For criticality safety, more than 700 benchmarks from the International Handbook of Criticality Safety Benchmark Experiments were used. Benchmarks from all categories were used, ranging from low-enriched uranium, compound fuel, thermal spectrum ones (LEU-COMP-THERM), to mixed uranium-plutonium, metallic fuel, fast spectrum ones (MIX-MET-FAST). For fusion shielding many benchmarks were based on IAEA specifications for the Oktavian experiments (for Al, Co, Cr, Cu, LiF, Mn, Mo, Si, Ti, W, Zr), Fusion Neutronics Source in Japan (for Be, C, N, O, Fe, Pb), and Pulsed Sphere experiments at Lawrence Livermore National Laboratory (for 6Li, 7Li, Be, C, N, O, Mg, Al, Ti, Fe, Pb, D2O, H2O, concrete, polyethylene and teflon). For testing delayed neutron data more than thirty measurements in widely varying systems were used. Among these were measurements in the Tank Critical Assembly (TCA, Japan) and IPEN/MB-01 (Brazil), both with a thermal spectrum, and two cores in Masurca (France) and three cores in the Fast Critical Assembly (FCA, Japan), all with fast spectra. In criticality safety, many benchmarks were chosen from the category with a thermal spectrum, low-enriched uranium, compound fuel (LEU-COMP-THERM), because this is typical of most current-day reactors, and because these benchmarks were previously underpredicted by as much as 0.5% by most nuclear data libraries (such as ENDF/B-VI.8, JEFF-3.0). The calculated results presented here show that this underprediction is no longer there for ENDF/B-VII.0. The average over 257

  15. Global phase equilibrium calculations: Critical lines, critical end points and liquid-liquid-vapour equilibrium in binary mixtures

    DEFF Research Database (Denmark)

    Cismondi, Martin; Michelsen, Michael Locht

    2007-01-01

    ... of critical lines. Each calculated point is analysed for stability by means of the tangent plane distance, and the occurrence of an unstable point is used to determine a critical endpoint (CEP). The critical endpoint, in turn, is used as the starting point for constructing the three-phase line. The equations for the critical endpoint, as well as for points on the three-phase line, are also solved using Newton's method with temperature, molar volume and composition as the independent variables. The different calculations are integrated into a general procedure that allows us to automatically trace critical lines, critical endpoints and three-phase lines for binary mixtures with phase diagrams of types from I to V without advance knowledge of the type of phase diagram. The procedure requires a thermodynamic model in the form of a pressure-explicit EOS but is not specific to a particular equation of state. (C) 2006
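
    The Newton scheme mentioned above can be illustrated on the simplest possible case, the critical point of a pure van der Waals fluid, where criticality reduces to (∂P/∂v)_T = 0 and (∂²P/∂v²)_T = 0 (the paper solves the analogous but more involved conditions for binary mixtures; the constants below are roughly those of nitrogen):

        import numpy as np

        R, a, b = 8.314, 0.1370, 3.87e-5      # van der Waals constants, SI units

        def residuals(T, v):
            dP = -R * T / (v - b)**2 + 2.0 * a / v**3        # (dP/dv)_T
            d2P = 2.0 * R * T / (v - b)**3 - 6.0 * a / v**4  # (d2P/dv2)_T
            return np.array([dP, d2P])

        def jacobian(T, v):
            return np.array([
                [-R / (v - b)**2,       2.0 * R * T / (v - b)**3 - 6.0 * a / v**4],
                [2.0 * R / (v - b)**3, -6.0 * R * T / (v - b)**4 + 24.0 * a / v**5],
            ])

        T, v = 130.0, 1.2e-4                  # initial guess near the solution
        for _ in range(20):
            dT, dv = np.linalg.solve(jacobian(T, v), -residuals(T, v))
            T, v = T + dT, v + dv

        print(f"Newton: T_c = {T:.2f} K, v_c = {v:.3e} m^3/mol")
        print(f"exact : T_c = {8*a/(27*R*b):.2f} K, v_c = {3*b:.3e} m^3/mol")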

  16. Assessment of CTF Boiling Transition and Critical Heat Flux Modeling Capabilities Using the OECD/NRC BFBT and PSBT Benchmark Databases

    Directory of Open Access Journals (Sweden)

    Maria Avramova

    2013-01-01

    Over the last few years, the Pennsylvania State University (PSU), under the sponsorship of the US Nuclear Regulatory Commission (NRC), has prepared, organized, conducted, and summarized two international benchmarks based on the NUPEC data—the OECD/NRC Full-Size Fine-Mesh Bundle Test (BFBT) Benchmark and the OECD/NRC PWR Sub-Channel and Bundle Test (PSBT) Benchmark. The benchmarks' activities have been conducted in cooperation with the Nuclear Energy Agency/Organization for Economic Co-operation and Development (NEA/OECD) and the Japan Nuclear Energy Safety (JNES) Organization. This paper presents an application of the joint Penn State University/Technical University of Madrid (UPM) version of the well-known sub-channel code COBRA-TF (Coolant Boiling in Rod Array-Two Fluid), namely CTF, to the steady-state critical power and departure from nucleate boiling (DNB) exercises of the OECD/NRC BFBT and PSBT benchmarks. The goal is two-fold: firstly, to assess these models and to examine their strengths and weaknesses; and secondly, to identify the areas for improvement.

  17. Tensor RG calculations and quantum simulations near criticality

    CERN Document Server

    Meurice, Y; Tsai, Shan-Wen; Unmuth-Yockey, J; Yang, Li-Ping; Zhang, Jin

    2016-01-01

    We discuss the reformulation of the O(2) model with a chemical potential and the Abelian Higgs model on a 1+1 dimensional space-time lattice using the Tensor Renormalization Group (TRG) method. The TRG allows exact blocking and connects smoothly the classical Lagrangian approach to the quantum Hamiltonian approach. We calculate the entanglement entropy in the superfluid phase of the O(2) model and show that it approximately obeys the logarithmic Calabrese-Cardy scaling obtained from Conformal Field Theory (CFT). We calculate the Polyakov loop in the Abelian Higgs model and discuss the possibility of a deconfinement transition at finite volume. We propose Bose-Hubbard Hamiltonians implementable on optical lattices as quantum simulators for CFT models.
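
    For reference, the logarithmic scaling invoked above is the standard Calabrese-Cardy finite-size form for a block of length ℓ in a periodic chain of total length L with central charge c (the additive constant c₁ is non-universal; for the superfluid phase of the O(2) model one expects c = 1):

        S(\ell, L) = \frac{c}{3} \ln\left[ \frac{L}{\pi} \sin\left( \frac{\pi \ell}{L} \right) \right] + c_1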

  18. Critical evaluation of German regulatory specifications for calculating radiological exposure

    Energy Technology Data Exchange (ETDEWEB)

    Koenig, Claudia; Walther, Clemens [Hannover Univ. (Germany). Inst. of Radioecology; Smeddinck, Ulrich [Technische Univ. Braunschweig (Germany). Inst. of Law

    2015-07-01

    The assessment of radiological exposure of the public is an issue at the interface between scientific findings, juridical standard setting and political decision. The present work revisits the German regulatory specifications for calculating radiological exposure, like the already existing calculation model General Administrative Provision (AVV) for planning and monitoring nuclear facilities. We address the calculation models for the recent risk assessment regarding the final disposal of radioactive waste in Germany. To do so, a two-pronged approach is pursued. One part deals with radiological examinations of the groundwater-soil-transfer path of radionuclides into the biosphere. Processes at the so-called geosphere-biosphere-interface are examined, especially migration of I-129 in the unsaturated zone. This is necessary, since the German General Administrative Provision does not consider radionuclide transport via groundwater from an underground disposal facility yet. Especially data with regard to processes in the vadose zone are scarce. Therefore, using I-125 as a tracer, immobilization and mobilization of iodine is investigated in two reference soils from the German Federal Environment Agency. The second part of this study examines how scientific findings but also measures and activities of stakeholders and concerned parties influence juridical standard setting, which is necessary for risk management. Risk assessment, which is a scientific task, includes identification and investigation of relevant sources of radiation, possible pathways to humans, and maximum extent and duration of exposure based on dose-response functions. Risk characterization identifies probability and severity of health effects. These findings have to be communicated to authorities, who have to deal with the risk management. Risk management includes, for instance, taking into account acceptability of the risk, actions to reduce, mitigate, substitute or monitor the hazard, the setting of

  19. Research on GPU Acceleration for Monte Carlo Criticality Calculation

    Science.gov (United States)

    Xu, Qi; Yu, Ganglin; Wang, Kan

    2014-06-01

    The Monte Carlo neutron transport method can be naturally parallelized on multi-core architectures due to the independence between particles during the simulation. The GPU+CPU heterogeneous parallel mode has become an increasingly popular way of parallelism in the field of scientific supercomputing. Thus, this work focuses on the GPU acceleration method for the Monte Carlo criticality simulation, as well as the computational efficiency that GPUs can bring. The "neutron transport step" is introduced to increase the GPU thread occupancy. To test the sensitivity to MC code complexity, a 1D one-group code and a 3D multi-group general-purpose code were each ported to GPUs, and the acceleration effects are compared. The results of the numerical experiments show a considerable acceleration effect from the "neutron transport step" strategy. However, the performance comparison between the 1D code and the 3D code indicates the poor scalability of MC codes on GPUs.
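
    The "neutron transport step" idea can be sketched without a GPU: instead of one thread following one history to completion (lanes idle as histories end at different times), all live neutrons advance one flight per step so the lanes stay full. Here numpy's vectorization stands in for the GPU threads, and the 1D one-group physics is a toy, not the paper's code:

        import numpy as np

        rng = np.random.default_rng(3)
        sigma_t, absorb_prob = 1.0, 0.3          # toy total XS and absorption probability
        pos = np.zeros(100_000)                  # one batch of neutron positions
        alive = np.ones(pos.size, dtype=bool)

        while alive.any():                       # one "transport step" per pass
            n = int(alive.sum())
            flight = rng.exponential(1.0 / sigma_t, n) * rng.choice([-1.0, 1.0], n)
            pos[alive] += flight                 # every live neutron advances one flight
            absorbed = rng.random(n) < absorb_prob
            idx = np.nonzero(alive)[0]
            alive[idx[absorbed]] = False         # absorbed neutrons leave the batch

        print("mean flights per history ~", 1.0 / absorb_prob)
        print("RMS final position:", float(np.sqrt((pos**2).mean())))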

  20. The Establishment, Calculation and Application of a Benchmarking Housing Price System

    Institute of Scientific and Technical Information of China (English)

    李妍; 汪友结

    2013-01-01

    In order to improve the authority and sufficiency of benchmark housing prices in the Chinese housing market, and drawing on advanced practice at home and abroad, we extend the concept of the benchmark housing price from a district-level price to a unit-level ("one price per dwelling") price and define its connotation. On this basis, we establish a multi-level benchmark housing price system running from unit to building to community to district. In this context, we build a practical calculation model for benchmark housing prices by utilizing mass appraisal and full-sample statistical techniques. Moreover, GIS technology is introduced into the price calculation and used to build the calculation and application platform. This provides a theoretical framework and practical reference for other Chinese cities seeking to construct a benchmark housing price system.

  1. Influence of active magnetic bearing sensor location on the calculated critical speeds of turbomachinery

    OpenAIRE

    1989-01-01

    The calculation of critical speeds for turbomachinery with active magnetic bearings (AMBs) is of great interest due to the increasing number of applications of this new technology. The potential for increased turbomachine performance through improved AMB design has created the need for a more accurate method for predicting the system's undamped critical speeds. This thesis documents the development of a method which improves the accuracy of critical speed calculation by properl...

  2. Validation of CENDL and JEFF evaluated nuclear data files for TRIGA calculations through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors

    Energy Technology Data Exchange (ETDEWEB)

    Uddin, M.N. [Department of Physics, Jahangirnagar University, Dhaka (Bangladesh); Sarker, M.M. [Reactor Physics and Engineering Division, Institute of Nuclear Science and Technology, Atomic Energy Research Establishment, Savar, GPO Box 3787, Dhaka 1000 (Bangladesh); Khan, M.J.H. [Reactor Physics and Engineering Division, Institute of Nuclear Science and Technology, Atomic Energy Research Establishment, Savar, GPO Box 3787, Dhaka 1000 (Bangladesh)], E-mail: jahirulkhan@yahoo.com; Islam, S.M.A. [Department of Physics, Jahangirnagar University, Dhaka (Bangladesh)

    2009-10-15

    The aim of this paper is to present the validation of the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through the analysis of the integral parameters of the TRX and BAPL benchmark lattices of thermal reactors, in support of the neutronics analysis of the TRIGA Mark-II research reactor at AERE, Bangladesh. In this process, the 69-group cross-section library for the lattice code WIMS was generated from the basic evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 with the help of the nuclear data processing code NJOY99.0. Integral measurements on the thermal reactor lattices TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 serve as standard benchmarks for testing nuclear data files and were selected for this analysis. The integral parameters of the said lattices were calculated using the lattice transport code WIMSD-5B based on the generated 69-group cross-section library. The calculated integral parameters were compared with the measured values as well as with the results of the Monte Carlo code MCNP. In most cases, the integral parameters show good agreement with the experiment and the MCNP results. In addition, the group constants in WIMS format for the isotopes U-235 and U-238 from the two data files were compared using the WIMS library utility code WILLIE and were found to be nearly identical, with insignificant differences. Therefore, this analysis validates the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through benchmarking of the integral parameters of the TRX and BAPL lattices, and is also essential for further neutronics analysis of the TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh.

  3. Criticality Calculations for a Typical Nuclear Fuel Fabrication Plant with Low Enriched Uranium

    Energy Technology Data Exchange (ETDEWEB)

    Elsayed, Hade; Nagy, Mohamed; Agamy, Said; Shaat, Mohmaed [Egyptian Atomic Energy Authority, Cairo (Egypt)

    2013-07-01

    Operations with fissile materials such as U{sup 235} introduce the risk of a criticality accident that may be lethal to nearby personnel and can force the facility to shut down. Therefore, the prevention of a nuclear criticality accident should play a major role in the design of a nuclear facility. The objectives of criticality safety are to prevent a self-sustained nuclear chain reaction and to minimize the consequences if one occurs. Sixty criticality accidents have occurred worldwide, divided into two categories: 22 in process facilities and 38 during critical experiments or operations with research reactors. About 21 criticality accidents, including the Japan Nuclear Fuel Conversion Co. (JCO) accident, took place with fuel solution or slurry, and only one occurred with metal fuel. In this study, nuclear criticality calculations have been performed for a typical nuclear fuel fabrication plant producing fuel elements for nuclear research reactors with uranium enriched up to 20%. The calculations were performed for both normal and abnormal operating conditions. The effective multiplication factor (k{sub eff}) during the nuclear fuel fabrication process (uranium hexafluoride to ammonium diuranate conversion) was determined. Several accident scenarios were postulated and their criticality was evaluated. The computer code MCNP-4B, which is based on the Monte Carlo method, was used to calculate the neutron multiplication factor. The criticality calculations were performed for changes of the moderator-to-fuel ratio, the solution density, and the concentration of the solute in order to prevent or mitigate criticality accidents during the nuclear fuel fabrication process. The calculation results are analyzed and discussed.
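
    The parametric study described above (k_eff as a function of moderator-to-fuel ratio and solution density) can be organized as a simple scan around the transport code. In the sketch below, toy_keff is a made-up smooth surrogate standing in for an MCNP run, and the 0.95 upper subcritical limit is a common illustrative choice, not a value from the paper.

        import math

        def toy_keff(h_to_u: float, density: float) -> float:
            """Toy surrogate for a transport-code k-eff (NOT physics): k rises with
            moderation, saturates, and falls off when strongly over-moderated."""
            return 0.55 * density * (h_to_u / (0.5 + h_to_u) + 0.3) * math.exp(-0.03 * h_to_u)

        USL = 0.95  # illustrative upper subcritical limit

        for h_to_u in (0.0, 0.5, 1.0, 2.0, 5.0, 10.0):
            for density in (1.0, 1.5, 2.0):  # g/cm^3, illustrative
                k = toy_keff(h_to_u, density)
                status = "ok" if k < USL else "EXCEEDS LIMIT"
                print(f"H/U = {h_to_u:5.1f}  rho = {density:3.1f}  k_eff = {k:.4f}  {status}")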

  4. Multi-Loop Calculations of Anomalous Exponents in the Models of Critical Dynamics

    Directory of Open Access Journals (Sweden)

    Adzhemyan L. Ts.

    2016-01-01

    The renormalization group (RG) method is applied to the investigation of the E model of critical dynamics, which describes the transition from the normal to the superfluid phase in He4. The “sector decomposition” technique with the R’ operation is used for the calculation of the Feynman diagrams. The RG functions, the critical exponents, and the critical dynamical exponent z, which determines the growth of the relaxation time near the critical point, have been calculated in the two-loop approximation in the framework of the ε-expansion. The relevance of a fixed point for helium, where the dynamic scaling is weakly violated, is briefly discussed.
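
    For orientation, results of this kind take the form of a truncated series in ε about the upper critical dimension; the coefficients below are symbolic placeholders, not the values computed in the paper:

        z = z_0 + c_1 \epsilon + c_2 \epsilon^2 + O(\epsilon^3), \qquad \epsilon = 4 - d,

    with z_0 the zero-loop (mean-field) value and the physical answer read off at ε = 1 for d = 3.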

  5. Comparative Neutronics Analysis of DIMPLE S06 Criticality Benchmark with Contemporary Reactor Core Analysis Computer Code Systems

    Directory of Open Access Journals (Sweden)

    Wonkyeong Kim

    2015-01-01

    A high-leakage core has been known to be a challenging problem not only for a two-step homogenization approach but also for a direct heterogeneous approach. In this paper the DIMPLE S06 core, which is a small high-leakage core, has been analyzed by a direct heterogeneous modeling approach and by a two-step homogenization modeling approach, using contemporary code systems developed for reactor core analysis. The focus of this work is a comprehensive comparative analysis of the conventional approaches and codes against a small core design, the DIMPLE S06 critical experiment. The calculation procedure for the two approaches is explicitly presented in this paper. The comprehensive comparison covers the neutronics parameters: multiplication factor and assembly power distribution. Comparison of two-group homogenized cross sections from each lattice physics code shows that the generated transport cross sections differ significantly according to the transport approximation used to treat the anisotropic scattering effect. The necessity of assembly discontinuity factors (ADFs) to correct the discontinuity at the assembly interfaces is clearly demonstrated by the flux distributions and the results of the two-step approach. Finally, the two approaches show consistent results for all codes, while the comparison with the reference generated by MCNP shows significant error except for another Monte Carlo code, SERPENT2.

  6. Benchmarking a modified version of the civ3 nonrelativistic atomic-structure code within Na-like-tungsten R-matrix calculations

    Science.gov (United States)

    Turkington, M. D.; Ballance, C. P.; Hibbert, A.; Ramsbottom, C. A.

    2016-08-01

    In this work we explore the validity of employing a modified version of the nonrelativistic structure code civ3 for heavy, highly charged systems, using Na-like tungsten as a simple benchmark. Consequently, we present radiative and subsequent collisional atomic data compared with corresponding results from a fully relativistic structure and collisional model. Our motivation for this line of study is to benchmark civ3 against the relativistic grasp0 structure code. This is an important study as civ3 wave functions in nonrelativistic R-matrix calculations are computationally less expensive than their Dirac counterparts. There are very few existing data for the W LXIV ion in the literature with which we can compare except for an incomplete set of energy levels available from the NIST database. The overall accuracy of the present results is thus determined by the comparison between the civ3 and grasp0 structure codes alongside collisional atomic data computed by the R-matrix Breit-Pauli and Dirac codes. It is found that the electron-impact collision strengths and effective collision strengths computed by these differing methods are in good general agreement for the majority of the transitions considered, across a broad range of electron temperatures.

  7. Financial Benchmarking

    OpenAIRE

    2012-01-01

    This bachelor's thesis is focused on the financial benchmarking of TULIPA PRAHA s.r.o. The aim of this work is to evaluate the financial situation of the company, identify its strengths and weaknesses, and find out how efficiently the company performs in comparison with top companies in the same field, using the INFA benchmarking diagnostic system of financial indicators. The theoretical part includes the characteristics of financial analysis, which financial benchmarking is based on a...

  8. Evaluation of the concrete shield compositions from the 2010 criticality accident alarm system benchmark experiments at the CEA Valduc SILENE facility

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Thomas Martin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Celik, Cihangir [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dunn, Michael E [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wagner, John C [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); McMahan, Kimberly L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Authier, Nicolas [French Atomic Energy Commission (CEA), Centre de Valduc, Is-sur-Tille (France); Jacquet, Xavier [French Atomic Energy Commission (CEA), Centre de Valduc, Is-sur-Tille (France); Rousseau, Guillaume [French Atomic Energy Commission (CEA), Centre de Valduc, Is-sur-Tille (France); Wolff, Herve [French Atomic Energy Commission (CEA), Centre de Valduc, Is-sur-Tille (France); Savanier, Laurence [French Atomic Energy Commission (CEA), Centre de Valduc, Is-sur-Tille (France); Baclet, Nathalie [French Atomic Energy Commission (CEA), Centre de Valduc, Is-sur-Tille (France); Lee, Yi-kang [French Atomic Energy Commission (CEA), Centre de Saclay, Gif sur Yvette (France); Trama, Jean-Christophe [French Atomic Energy Commission (CEA), Centre de Saclay, Gif sur Yvette (France); Masse, Veronique [French Atomic Energy Commission (CEA), Centre de Saclay, Gif sur Yvette (France); Gagnier, Emmanuel [French Atomic Energy Commission (CEA), Centre de Saclay, Gif sur Yvette (France); Naury, Sylvie [French Atomic Energy Commission (CEA), Centre de Saclay, Gif sur Yvette (France); Blanc-Tranchant, Patrick [French Atomic Energy Commission (CEA), Centre de Saclay, Gif sur Yvette (France); Hunter, Richard [Babcock International Group (United Kingdom); Kim, Soon [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dulik, George Michael [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, Kevin H. [Y-12 National Security Complex, Oak Ridge, TN (United States)

    2015-01-01

    In October 2010, a series of benchmark experiments were conducted at the French Commissariat a l'Energie Atomique et aux Energies Alternatives (CEA) Valduc SILENE facility. These experiments were a joint effort between the United States Department of Energy Nuclear Criticality Safety Program and the CEA. The purpose of these experiments was to create three benchmarks for the verification and validation of radiation transport codes and evaluated nuclear data used in the analysis of criticality accident alarm systems. This series of experiments consisted of three single-pulsed experiments with the SILENE reactor. For the first experiment, the reactor was bare (unshielded), whereas in the second and third experiments, it was shielded by lead and polyethylene, respectively. The polyethylene shield of the third experiment had a cadmium liner on its internal and external surfaces, which was located vertically near the fuel region of SILENE. During each experiment, several neutron activation foils and thermoluminescent dosimeters (TLDs) were placed around the reactor. Nearly half of the foils and TLDs had additional high-density magnetite concrete, high-density barite concrete, standard concrete, and/or BoroBond shields. CEA Saclay provided all the concrete, and the US Y-12 National Security Complex provided the BoroBond. Measurement data from the experiments were published at the 2011 International Conference on Nuclear Criticality (ICNC 2011) and the 2013 Nuclear Criticality Safety Division (NCSD 2013) topical meeting. Preliminary computational results for the first experiment were presented in the ICNC 2011 paper, which showed poor agreement between the computational results and the measured values of the foils shielded by concrete. Recently the hydrogen content, boron content, and density of these concrete shields were further investigated within the constraints of the previously available data. New computational results for the first experiment are now available.

  9. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  10. Benchmark quantum-chemical calculations on a complete set of rotameric families of the DNA sugar-phosphate backbone and their comparison with modern density functional theory.

    Science.gov (United States)

    Mládek, Arnošt; Krepl, Miroslav; Svozil, Daniel; Cech, Petr; Otyepka, Michal; Banáš, Pavel; Zgarbová, Marie; Jurečka, Petr; Sponer, Jiří

    2013-05-21

    The DNA sugar-phosphate backbone has a substantial influence on the DNA structural dynamics. Structural biology and bioinformatics studies revealed that the DNA backbone in experimental structures samples a wide range of distinct conformational substates, known as rotameric DNA backbone conformational families. Their correct description is essential for methods used to model nucleic acids and is known to be the Achilles heel of force field computations. In this study we report the benchmark database of MP2 calculations extrapolated to the complete basis set of atomic orbitals with aug-cc-pVTZ and aug-cc-pVQZ basis sets, MP2(T,Q), augmented by ΔCCSD(T)/aug-cc-pVDZ corrections. The calculations are performed in the gas phase as well as using a COSMO solvent model. This study includes a complete set of 18 established and biochemically most important families of DNA backbone conformations and several other salient conformations that we identified in experimental structures. We utilize an electronically sufficiently complete DNA sugar-phosphate-sugar (SPS) backbone model system truncated to prevent undesired intramolecular interactions. The calculations are then compared with other QM methods. The BLYP and TPSS functionals supplemented with Grimme's D3(BJ) dispersion term provide the best tradeoff between computational demands and accuracy and can be recommended for preliminary conformational searches as well as calculations on large model systems. Among the tested methods, the best agreement with the benchmark database has been obtained for the double-hybrid DSD-BLYP functional in combination with a quadruple-ζ basis set, which is, however, computationally very demanding. The new hybrid density functionals PW6B95-D3 and MPW1B95-D3 yield outstanding results and even slightly outperform the computationally more demanding PWPB95 double-hybrid functional. B3LYP-D3 is somewhat less accurate compared to the other hybrids. Extrapolated MP2(D,T) calculations are not as
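
    The MP2(T,Q) label denotes a standard two-point extrapolation of the correlation energy to the complete-basis-set limit. A common form (the Helgaker-type X^{-3} formula, quoted here as background rather than as the paper's exact recipe) is

        E_X^{\mathrm{corr}} = E_{\mathrm{CBS}}^{\mathrm{corr}} + A X^{-3}
        \;\Rightarrow\;
        E_{\mathrm{CBS}}^{\mathrm{corr}} = \frac{Y^3 E_Y^{\mathrm{corr}} - X^3 E_X^{\mathrm{corr}}}{Y^3 - X^3},

    with cardinal numbers X = 3 (aug-cc-pVTZ) and Y = 4 (aug-cc-pVQZ).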

  11. Shielding calculation and criticality safety analysis of spent fuel transportation cask in research reactors.

    Science.gov (United States)

    Mohammadi, A; Hassanzadeh, M; Gharib, M

    2016-02-01

    In this study, shielding calculations and a criticality safety analysis were carried out for a generic material testing reactor (MTR) research reactor interim storage and the associated transportation cask. Three major steps were involved: source term, shielding, and criticality calculations. The Monte Carlo transport code MCNP5 was used for the shielding calculation and criticality safety analysis, and the ORIGEN2.1 code for the source term calculation. According to the results obtained, a cylindrical cask with body, top, and bottom thicknesses of 18, 13, and 13 cm, respectively, was accepted as the dual-purpose cask. Furthermore, it is shown that the total dose rates are below the criteria for normal transport conditions specified by the standards.
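
    The shielding side of such an analysis can be illustrated with a point-kernel estimate: a point source attenuated exponentially through the cask wall, with a buildup factor for scattered photons. Every number below is an illustrative placeholder, not a value from the study.

        import math

        S = 1.0e12     # photon source strength (photons/s), placeholder
        mu = 0.47      # linear attenuation coefficient of the shield (1/cm), placeholder
        t = 18.0       # shield thickness (cm), the cask body wall
        r = 100.0      # source-to-detector distance (cm), placeholder
        B = 3.0        # buildup factor for this mu*t, placeholder
        k = 2.0e-6     # flux-to-dose conversion ((uSv/h) per (photons/cm^2/s)), placeholder

        flux = S * B * math.exp(-mu * t) / (4.0 * math.pi * r ** 2)  # photons/cm^2/s
        dose_rate = k * flux                                         # uSv/h
        print(f"flux = {flux:.3e} /cm^2/s, dose rate = {dose_rate:.3e} uSv/h")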

  12. Calculation of mixture critical diagrams using an equation of state based on the lattice fluid theory

    Directory of Open Access Journals (Sweden)

    S. Mattedi

    2000-12-01

    A modified form of the Hicks and Young algorithm was used with the Mattedi-Tavares-Castier lattice equation of state (MTC lattice EOS) to calculate critical points of binary mixtures that exhibit several types of critical behavior. Several qualitative aspects of the critical curves, such as maxima and minima in critical pressure, and minima in critical temperature, could be predicted using the MTC lattice EOS. These results were in agreement with experimental information available in the literature, illustrating the flexibility of the functional form of the MTC lattice EOS. We observed, however, that the MTC lattice EOS failed to predict maxima in pressure for two of the studied systems: ethane + ethanol and methane + n-hexane. We also observed that the agreement between the calculated and experimental critical properties was at most semi-quantitative in some examples. Despite these limitations, in many ways similar to those of other EOS in common use when applied to critical point calculations, we can conclude that the MTC lattice EOS has the ability to predict several types of critical curves of complex shape.
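
    The pure-component limit of such a critical-point search solves (dP/dV)_T = (d^2P/dV^2)_T = 0. As a minimal stand-in for the MTC lattice EOS (whose working equations are not reproduced here), the sketch below applies those conditions to the van der Waals equation and recovers the textbook result.

        import sympy as sp

        V, T, a, b, R = sp.symbols("V T a b R", positive=True)

        P = R * T / (V - b) - a / V**2   # van der Waals EOS, stand-in for the MTC EOS

        # Critical point: first and second volume derivatives vanish simultaneously
        crit = sp.solve([sp.diff(P, V), sp.diff(P, V, 2)], [V, T], dict=True)[0]

        print(crit)                        # {V: 3*b, T: 8*a/(27*R*b)}
        print(sp.simplify(P.subs(crit)))   # a/(27*b**2)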

  13. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...

  14. Interactive benchmarking

    DEFF Research Database (Denmark)

    Lawson, Lartey; Nielsen, Kurt

    2005-01-01

    We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional... in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence the technical efficiency.

  15. Electronic couplings for molecular charge transfer: Benchmarking CDFT, FODFT, and FODFTB against high-level ab initio calculations

    Energy Technology Data Exchange (ETDEWEB)

    Kubas, Adam; Blumberger, Jochen, E-mail: j.blumberger@ucl.ac.uk [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Hoffmann, Felix [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Lehrstuhl für Theoretische Chemie, Ruhr-Universität Bochum, Universitätsstr. 150, 44801 Bochum (Germany); Heck, Alexander; Elstner, Marcus [Institute of Physical Chemistry, Karlsruhe Institute of Technology, Fritz-Haber-Weg 6, 76131 Karlsruhe (Germany); Oberhofer, Harald [Department of Chemistry, Technical University of Munich, Lichtenbergstr. 4, 85747 Garching (Germany)

    2014-03-14

    We introduce a database (HAB11) of electronic coupling matrix elements (H{sub ab}) for electron transfer in 11 π-conjugated organic homo-dimer cations. High-level ab initio calculations at the multireference configuration interaction MRCI+Q level of theory, n-electron valence state perturbation theory NEVPT2, and (spin-component scaled) approximate coupled cluster model (SCS)-CC2 are reported for this database to assess the performance of three DFT methods of decreasing computational cost, including constrained density functional theory (CDFT), fragment-orbital DFT (FODFT), and self-consistent charge density functional tight-binding (FODFTB). We find that the CDFT approach in combination with a modified PBE functional containing 50% Hartree-Fock exchange gives the best results for absolute H{sub ab} values (mean relative unsigned error = 5.3%) and exponential distance decay constants β (4.3%). CDFT in combination with pure PBE overestimates couplings by 38.7% due to a too diffuse excess charge distribution, whereas the economic FODFT and highly cost-effective FODFTB methods underestimate couplings by 37.6% and 42.4%, respectively, due to neglect of interaction between donor and acceptor. The errors are systematic, however, and can be significantly reduced by applying a uniform scaling factor for each method. Applications to dimers outside the database, specifically rotated thiophene dimers and larger acenes up to pentacene, suggest that the same scaling procedure significantly improves the FODFT and FODFTB results for larger π-conjugated systems relevant to organic semiconductors and DNA.
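
    The uniform scaling correction mentioned above is just a one-parameter least-squares fit between reference and DFT couplings. A minimal sketch, with placeholder numbers standing in for the HAB11 values:

        import numpy as np

        H_ref = np.array([45.0, 30.1, 12.3, 8.7])   # reference couplings (meV), placeholders
        H_dft = np.array([28.0, 19.5, 7.6, 5.2])    # FODFT couplings (meV), placeholders

        scale = np.sum(H_ref * H_dft) / np.sum(H_dft ** 2)  # least-squares scaling factor

        mrue = 100 * np.mean(np.abs(H_dft - H_ref) / H_ref)
        mrue_scaled = 100 * np.mean(np.abs(scale * H_dft - H_ref) / H_ref)
        print(f"scale = {scale:.2f}, MRUE raw = {mrue:.1f}%, scaled = {mrue_scaled:.1f}%")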

  16. Benchmark calculations of nonconservative charged-particle swarms in dc electric and magnetic fields crossed at arbitrary angles.

    Science.gov (United States)

    Dujko, S; White, R D; Petrović, Z Lj; Robson, R E

    2010-04-01

    A multiterm solution of the Boltzmann equation has been developed and used to calculate transport coefficients of charged-particle swarms in gases under the influence of electric and magnetic fields crossed at arbitrary angles when nonconservative collisions are present. The hierarchy resulting from a spherical-harmonic decomposition of the Boltzmann equation in the hydrodynamic regime is solved numerically by representing the speed dependence of the phase-space distribution function in terms of an expansion in Sonine polynomials about a Maxwellian velocity distribution at an internally determined temperature. Results are given for electron swarms in certain collisional models for ionization and attachment over a range of angles between the fields and field strengths. The implicit and explicit effects of ionization and attachment on the electron-transport coefficients are considered using physical arguments. It is found that the difference between the two sets of transport coefficients, bulk and flux, resulting from the explicit effects of nonconservative collisions, can be controlled either by the variation in the magnetic field strengths or by the angles between the fields. In addition, it is shown that the phenomena of ionization cooling and/or attachment cooling/heating previously reported for dc electric fields carry over directly to the crossed electric and magnetic fields. The results of the Boltzmann equation analysis are compared with those obtained by a Monte Carlo simulation technique. The comparison confirms the theoretical basis and numerical integrity of the moment method for solving the Boltzmann equation and gives a set of well-established data that can be used to test future codes and plasma models.
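
    The bulk/flux distinction invoked above has a compact definition. Writing angle brackets for an average over the swarm, the two drift velocities are

        W_{\mathrm{bulk}} = \frac{d}{dt}\langle \mathbf{r} \rangle, \qquad W_{\mathrm{flux}} = \langle \mathbf{v} \rangle,

    and they coincide only when nonconservative collisions (ionization, attachment) are absent, since particle gain and loss shift the centre of mass of the swarm independently of the mean velocity.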

  17. Calculation of the critical exponents by a renormalization of the Ornstein-Zernike equation

    Science.gov (United States)

    Zhang, Q.; Badiali, J. P.

    1991-09-01

    We calculate the critical exponents at the liquid-vapor critical point by using the classical ingredients of the liquid-state theory. Two coupling constants are defined at a microscopic level. The closure of the Ornstein-Zernike equation is given by the Callan-Symanzik equation from which we determine the position of the fixed point. The role of the three-body direct-correlation function is emphasized. A comparison between this work and the standard theory of critical phenomena based on the Landau-Ginzburg-Wilson Hamiltonian is presented.
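
    For reference, the Ornstein-Zernike equation whose closure is discussed above relates the total correlation function h to the direct correlation function c through

        h(r_{12}) = c(r_{12}) + \rho \int c(r_{13}) \, h(r_{32}) \, d\mathbf{r}_3 ,

    so that specifying a closure for c fixes the theory; here that role is played by the Callan-Symanzik equation.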

  18. Criticality Safety Code Validation with LWBR’s SB Cores

    Energy Technology Data Exchange (ETDEWEB)

    Putman, Valerie Lee

    2003-01-01

    The first set of critical experiments from the Shippingport Light Water Breeder Reactor Program included eight simple-geometry critical cores built with {sup 233}UO{sub 2}-ZrO{sub 2}, {sup 235}UO{sub 2}-ZrO{sub 2}, ThO{sub 2}, and ThO{sub 2}-{sup 233}UO{sub 2} nuclear materials. These cores are evaluated, described, and modeled to provide benchmarks and validation information for INEEL criticality safety calculation methodology. In addition to consistency with INEEL methodology, benchmark development and nuclear data are consistent with International Criticality Safety Benchmark Evaluation Project methodology. Section 1 of this report introduces the experiments and the reason they are useful for validating some INEEL criticality safety calculations. Section 2 provides detailed experiment descriptions based on currently available experiment reports. Section 3 identifies criticality safety validation requirement sources and summarizes requirements that most affect this report. Section 4 identifies relevant hand calculation and computer code calculation methodologies used in the experiment evaluation, benchmark development, and validation calculations. Section 5 provides a detailed experiment evaluation. This section identifies resolutions for currently unavailable and discrepant information. Section 5 also reports calculated experiment uncertainty effects. Section 6 describes the developed benchmarks. Section 6 includes calculated sensitivities to various benchmark features and parameters. Section 7 summarizes validation results. Appendices describe various assumptions and their bases, list experimenter calculation results for items that were independently calculated for this validation work, report other information gathered and developed by SCIENTEC personnel while evaluating these same experiments, and list benchmark sample input and miscellaneous supplementary data.

  19. Benchmark Evaluation of the NRAD Reactor LEU Core Startup Measurements

    Energy Technology Data Exchange (ETDEWEB)

    J. D. Bess; T. L. Maddock; M. A. Marshall

    2011-09-01

    The Neutron Radiography (NRAD) reactor is a 250-kW TRIGA-(Training, Research, Isotope Production, General Atomics)-conversion-type reactor at the Idaho National Laboratory; it is primarily used for neutron radiography analysis of irradiated and unirradiated fuels and materials. The NRAD reactor was converted from HEU to LEU fuel with 60 fuel elements and brought critical on March 31, 2010. This configuration of the NRAD reactor has been evaluated as an acceptable benchmark experiment and is available in the 2011 editions of the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP Handbook) and the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). Significant effort went into precisely characterizing all aspects of the reactor core dimensions and material properties; detailed analyses of reactor parameters minimized experimental uncertainties. The largest contributors to the total benchmark uncertainty were the 234U, 236U, Er, and Hf content in the fuel; the manganese content in the stainless steel cladding; and the unknown level of water saturation in the graphite reflector blocks. A simplified benchmark model of the NRAD reactor was prepared with a keff of 1.0012 {+-} 0.0029 (1σ). Monte Carlo calculations with MCNP5 and KENO-VI and various neutron cross section libraries were performed and compared with the benchmark eigenvalue for the 60-fuel-element core configuration; all calculated eigenvalues are between 0.3 and 0.8% greater than the benchmark value. Benchmark evaluations of the NRAD reactor are beneficial in understanding biases and uncertainties affecting criticality safety analyses of storage, handling, or transportation applications with LEU-Er-Zr-H fuel.

  20. Criticality coefficient calculation for a small PWR using Monte Carlo Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Trombetta, Debora M.; Su, Jian, E-mail: dtrombetta@nuclear.ufrj.br, E-mail: sujian@nuclear.ufrj.br [Coordenacao dos Programas de Pos-Graduacao em Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil); Chirayath, Sunil S., E-mail: sunilsc@tamu.edu [Department of Nuclear Engineering and Nuclear Security Science and Policy Institute, Texas A and M University, TX (United States)

    2015-07-01

    Computational models of reactors are increasingly used to predict the nuclear reactor physics parameters responsible for reactivity changes which could lead to accidents and losses. In this work, preliminary results of criticality coefficient calculations using the Monte Carlo transport code MCNPX are presented for a small PWR. The computational model developed consists of the core, with fuel elements, radial reflectors, and control rods, inside a pressure vessel. Three different geometries were simulated, a single fuel pin, a fuel assembly, and the full core, with the aim of comparing the criticality coefficients among them. The criticality coefficients calculated were: Doppler Temperature Coefficient, Coolant Temperature Coefficient, Coolant Void Coefficient, Power Coefficient, and Control Rod Worth. The coefficient values calculated by the MCNP code were compared with literature results, showing good agreement with reference data, which validates the computational model developed and allows it to be used for more complex studies. The criticality coefficient values from the three simulations showed little discrepancy for almost all coefficients investigated; the only exception was the Power Coefficient. These preliminary results show that a model as simple as a single fuel assembly can reproduce almost all of the criticality coefficients, avoiding the need for a complex core simulation. (author)
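
    Each coefficient comes from pairs of k_eff calculations at perturbed and unperturbed conditions. For example, a Doppler temperature coefficient in pcm/K can be extracted as below; the k values and temperatures are illustrative, not results from the paper.

        def reactivity_pcm(k: float) -> float:
            """Reactivity rho = (k - 1)/k, expressed in pcm (1e-5)."""
            return (k - 1.0) / k * 1.0e5

        # Illustrative Monte Carlo eigenvalues at two fuel temperatures
        k_cold, T_cold = 1.02500, 600.0   # K
        k_hot, T_hot = 1.02210, 900.0     # K

        alpha = (reactivity_pcm(k_hot) - reactivity_pcm(k_cold)) / (T_hot - T_cold)
        print(f"Doppler coefficient: {alpha:.2f} pcm/K")   # negative, as expected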

  1. A Direct Calculation of Critical Exponents of Two-Dimensional Anisotropic Ising Model

    Institute of Scientific and Technical Information of China (English)

    XIONG Gang; WANG Xiang-Rong

    2006-01-01

    Using an exact solution of the one-dimensional quantum transverse-field Ising model, we calculate the critical exponents of the two-dimensional anisotropic classical Ising model (IM). We verify that the exponents are the same as those of the isotropic classical IM. Our approach provides an alternative means of obtaining and verifying these well-known results.
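
    For comparison, the exactly known critical exponents of the two-dimensional Ising model, which the anisotropic calculation must reproduce, are

        \alpha = 0, \quad \beta = \tfrac{1}{8}, \quad \gamma = \tfrac{7}{4}, \quad \delta = 15, \quad \nu = 1, \quad \eta = \tfrac{1}{4}.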

  2. Benchmark for Strategic Performance Improvement.

    Science.gov (United States)

    Gohlke, Annette

    1997-01-01

    Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)

  3. Evaluation of dose equivalent rate distribution in JCO critical accident by radiation transport calculation

    CERN Document Server

    Sakamoto, Y

    2002-01-01

    In the prevention of nuclear disasters, information is needed on the dose equivalent rate distribution inside and outside the site, as well as on energy spectra. A three-dimensional radiation transport calculation code is a useful tool for site-specific detailed analysis that takes facility structures into consideration. For future countermeasures that must predict individual doses, it is important to confirm the reliability of methods that evaluate dose equivalent rate distributions and energy spectra with Monte Carlo radiation transport codes, and to identify the factors which influence the dose equivalent rate distribution outside the site. The reliability of the radiation transport calculation code and the factors influencing the dose equivalent rate distribution were examined through analyses of the criticality accident at JCO's uranium processing plant that occurred on September 30, 1999. The radiation transport calculations, including the burn-up calculations, were done by using the structural info...

  4. Nuclear criticality safety experiments, calculations, and analyses: 1958 to 1982. Volume 1. Lookup tables

    Energy Technology Data Exchange (ETDEWEB)

    Koponen, B.L.; Hampel, V.E.

    1982-10-21

    This compilation contains 688 complete summaries of papers on nuclear criticality safety as presented at meetings of the American Nuclear Society (ANS). The selected papers contain criticality parameters for fissile materials derived from experiments and calculations, as well as criticality safety analyses for fissile material processing, transport, and storage. The compilation was developed as a component of the Nuclear Criticality Information System (NCIS) now under development at the Lawrence Livermore National Laboratory. The compilation is presented in two volumes: Volume 1 contains a directory to the ANS Transaction volume and page number where each summary was originally published, the author concordance, and the subject concordance derived from the keyphrases in titles. Volume 2 contains - in chronological order - the full-text summaries, reproduced here by permission of the American Nuclear Society from their Transactions, volumes 1-41.

  5. Gamow's calculation of the neutron star's critical mass revised

    Energy Technology Data Exchange (ETDEWEB)

    Ludwig, Hendrik; Ruffini, Remo [Sapienza Universita di Roma, Rome (Italy); ICRANet, University of Nice-Sophia Antipolis, Nice Cedex (France)

    2014-09-15

    It has at times been indicated that Landau introduced neutron stars in his classic paper of 1932. This is clearly impossible because the discovery of the neutron by Chadwick was submitted more than one month after Landau's work. Therefore, and according to his calculations, what Landau really did was to study white dwarfs, and the critical mass he obtained clearly matched the value derived by Stoner and later by Chandrasekhar. The birth of the concept of a neutron star is still unclear today. Clearly, in 1934, the work of Baade and Zwicky pointed to neutron stars as originating from supernovae. Oppenheimer in 1939 is also well known to have introduced general relativity (GR) in the study of neutron stars. The aim of this note is to point out that the crucial idea for treating the neutron star was advanced in Newtonian theory by Gamow. However, this pioneering work was plagued by mistakes. The critical mass he should have obtained was 6.9 M{sub ⊙}, not the one he declared, namely, 1.5 M{sub ⊙}. Probably, he was led to this result by the work of Landau on white dwarfs. We revise Gamow's calculation of the critical mass regarding calculational and conceptual aspects and discuss whether it is justified to consider it the first neutron-star critical mass. We compare Gamow's approach to other early and modern approaches to the problem.
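
    The size of Gamow's error can be anticipated from the Chandrasekhar-type scaling of the Newtonian critical mass with the mean molecular weight per fermion μ (a back-of-the-envelope reminder, not the corrected derivation of the paper):

        M_{\mathrm{crit}} \approx \frac{5.83}{\mu^2} \, M_\odot ,

    so that μ_e = 2 for a white dwarf gives about 1.46 M_⊙, while μ = 1 for a pure neutron gas gives about 5.8 M_⊙, several times larger.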

  6. Benchmarking and Performance Management

    Directory of Open Access Journals (Sweden)

    Adrian TANTAU

    2010-12-01

    The relevance of the chosen topic follows from the meaning of the firm-efficiency concept: firm efficiency is revealed performance, that is, how well the firm performs in its actual market environment given the basic characteristics of the firm and its market that are expected to drive profitability (firm size, market power, etc.). This complex and relative performance can be due to product innovation, management quality, or work organization; other factors can also play a role even if they are not directly observed by the researcher. Managers' critical need to continuously improve their company's efficiency and effectiveness, and their need to know the success factors and competitiveness determinants, determine which performance measures are most critical to their firm's overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking of firm-level performance are critical, interdependent activities. Firm-level variables used to infer performance are often interdependent for operational reasons, so managers need to take these dependencies into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other profitability measures, and it uses econometric models to describe and then propose a method to forecast and benchmark performance.

  7. Neutronics Benchmarks for the Utilization of Mixed-Oxide Fuel: Joint US/Russian Progress Report for Fiscal Year 1997, Volume 4, part 4-ESADA Plutonium Program Critical Experiments: Single-Region Core Configurations

    Energy Technology Data Exchange (ETDEWEB)

    Akkurt, H.; Abdurrahman, N.M.

    1999-05-01

    The purpose of this study is to simulate and assess the findings from selected ESADA experiments. It is presented in the format prescribed by the Nuclear Energy Agency Nuclear Science Committee for material to be included in the International Handbook of Evaluated Criticality Safety Benchmark Experiments.

  8. An Analytical Solution for Lateral Buckling Critical Load Calculation of Leaning-Type Arch Bridge

    Directory of Open Access Journals (Sweden)

    Ai-rong Liu

    2014-01-01

    An analytical solution for the lateral buckling critical load of a leaning-type arch bridge is presented in this paper. New tangential and radial buckling models of the transverse brace between the main and stable arch ribs are established. Based on the Ritz method, the analytical solution for the lateral buckling critical load of the leaning-type arch bridge with different central angles of main and leaning arch ribs under different boundary conditions is derived for the first time. Comparison between the analytical results and the FEM results shows that the analytical solution presented in this paper is sufficiently accurate. The parametric analysis shows that the lateral buckling critical load of the arch bridge with fixed boundary conditions is about 1.14 to 1.16 times that of the arch bridge with hinged boundary conditions. The lateral buckling critical load increases by approximately 31.5% to 41.2% when stable arch ribs are added, and the critical load increases as the inclined angle of the stable arch rib increases. The differences in the central angles of the main arch rib and the stable arch rib have little effect on the lateral buckling critical load.
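
    As a reminder of how a Ritz ansatz yields a critical load, the one-term analogue for a simply supported column takes w(x) = A sin(πx/L) and equates the bending strain energy to the work of the axial load:

        U = \frac{EI}{2}\int_0^L (w'')^2\,dx = \frac{EI\pi^4 A^2}{4L^3}, \qquad
        W = \frac{P}{2}\int_0^L (w')^2\,dx = \frac{P\pi^2 A^2}{4L},

    and setting U = W gives P_cr = π²EI/L², the Euler load. The bridge calculation follows the same pattern with a richer set of trial functions for the coupled main and stable ribs.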

  9. Numerical performance and throughput benchmark for electronic structure calculations in PC-Linux systems with new architectures, updated compilers, and libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Hwang, Jenn-Kang; Tang, Chuan Yi; Yu, Chin-Hui

    2004-01-01

    A number of recently released numerical libraries, including the Automatically Tuned Linear Algebra Subroutines (ATLAS) library, the Intel Math Kernel Library (MKL), the GOTO numerical library, and the AMD Core Math Library (ACML) for AMD Opteron processors, are linked against the executables of the Gaussian 98 electronic structure calculation package, which is compiled by updated versions of Fortran compilers such as the Intel Fortran compiler (ifc/efc) 7.1 and the PGI Fortran compiler (pgf77/pgf90) 5.0. The ifc 7.1 delivers about a 3% improvement on 32-bit machines compared to the former version 6.0. The performance improvement from pgf77 3.3 to 5.0 is also around 3% when utilizing the original unmodified optimization options enclosed in the software. Nevertheless, if extensive compiler tuning options are used, the speed can be further increased by about 25%. The performances of these fully optimized numerical libraries are similar. The double-precision floating-point (FP) instruction set (SSE2) is also functional on AMD Opteron processors operated in 32-bit compilation, and the Intel Fortran compiler performs better optimization there. Hardware-level tuning is able to improve memory bandwidth by adjusting the DRAM timing, and the efficiency in the CL2 mode is a further 2.6% higher than that of the CL2.5 mode. FP throughput is measured by simultaneous execution of two identical copies of each of the test jobs. The resulting performance impact suggests that the IA64 and AMD64 architectures are able to deliver significantly higher throughput than IA32, which is consistent with the SPECfp_rate2000 benchmarks.
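
    The throughput part of such a benchmark can be reproduced in miniature by timing a BLAS-backed matrix multiplication, since numpy dispatches the product to whichever library (ATLAS, MKL, GOTO, ACML) it was linked against. A minimal sketch:

        import time
        import numpy as np

        n = 2000
        a = np.random.rand(n, n)
        b = np.random.rand(n, n)

        t0 = time.perf_counter()
        c = a @ b                         # dispatched to the linked BLAS dgemm
        dt = time.perf_counter() - t0

        gflops = 2.0 * n ** 3 / dt / 1e9  # dgemm costs ~2*n^3 floating-point operations
        print(f"{n}x{n} matmul: {dt:.3f} s, {gflops:.1f} GFLOP/s")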

  10. Critical analysis of fragment-orbital DFT schemes for the calculation of electronic coupling values

    Energy Technology Data Exchange (ETDEWEB)

    Schober, Christoph; Reuter, Karsten; Oberhofer, Harald, E-mail: harald.oberhofer@ch.tum.de [Chair for Theoretical Chemistry, Technische Universität München, Lichtenbergstr. 4, D-85747 Garching (Germany)

    2016-02-07

    We present a critical analysis of the popular fragment-orbital density-functional theory (FO-DFT) scheme for the calculation of electronic coupling values. We discuss the characteristics of different possible formulations or “flavors” of the scheme which differ by the number of electrons in the calculation of the fragments and the construction of the Hamiltonian. In addition to two previously described variants based on neutral fragments, we present a third version taking a different route to the approximate diabatic state by explicitly considering charged fragments. In applying these FO-DFT flavors to the two molecular test sets HAB7 (electron transfer) and HAB11 (hole transfer), we find that our new scheme gives improved electronic couplings for HAB7 (−6.2% decrease in mean relative signed error) and greatly improved electronic couplings for HAB11 (−15.3% decrease in mean relative signed error). A systematic investigation of the influence of exact exchange on the electronic coupling values shows that the use of hybrid functionals in FO-DFT calculations improves the electronic couplings, giving values close to or even better than more sophisticated constrained DFT calculations. Comparing the accuracy and computational cost of each variant, we devise simple rules to choose the best possible flavor depending on the task. For accuracy, our new scheme with charged-fragment calculations performs best, while numerically more efficient at reasonable accuracy is the variant with neutral fragments.

  12. Test Suite for Nuclear Data I: Deterministic Calculations for Critical Assemblies and Replacement Coefficients

    Energy Technology Data Exchange (ETDEWEB)

    Pruet, J; Brown, D A; Descalle, M

    2006-05-22

    The authors describe tools developed by the Computational Nuclear Physics group for testing the quality of internally developed nuclear data and the fidelity of translations from ENDF formatted data to ENDL formatted data used by Livermore. These tests include S{sub n} calculations for the effective k value characterizing critical assemblies and for replacement coefficients of different materials embedded in the Godiva and Jezebel critical assemblies. For those assemblies and replacement materials for which reliable experimental information is available, these calculations provide an integral check on the quality of data. Because members of the ENDF and reactor communities use calculations for these same assemblies in their validation process, a comparison between their results with ENDF formatted data and their results with data translated into the ENDL format provides a strong check on the accuracy of translations. As a first application of the test suite they present a study comparing ENDL 99 and ENDF/B-V. They also consider the quality of the ENDF/B-V translation previously done by the Computational Nuclear Physics group. No significant errors are found.

  13. Benchmarking & European Sustainable Transport Policies

    DEFF Research Database (Denmark)

    Gudmundsson, H.

    2003-01-01

    Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts... to support Sustainable European Transport Policies. The key message is that transport benchmarking has not yet been developed to cope with the challenges of this task. Rather than backing down completely, the paper suggests some critical conditions for applying and adopting benchmarking for this purpose. One... way forward is to ensure a higher level of environmental integration in transport policy benchmarking. To this effect the paper will discuss the possible role of the so-called Transport and Environment Reporting Mechanism developed by the European Environment Agency. The paper provides an independent...

  14. Calculational criticality analyses of 10- and 20-MW UF[sub 6] freezer/sublimer vessels

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, W.C.

    1993-02-01

    Calculational criticality analyses have been performed for 10- and 20-MW UF[sub 6] freezer/sublimer vessels. The freezer/sublimers have been analyzed over a range of conditions that encompass normal operation and abnormal conditions. The effects of HF moderation of the UF[sub 6] in each vessel have been considered for uranium enriched between 2 and 5 wt % [sup 235]U. The results indicate that the nuclearly safe enrichments originally established for the operation of a 10-MW freezer/sublimer, based on a hydrogen-to-uranium moderation ratio of 0.33, are acceptable. If strict moderation control can be demonstrated for hydrogen-to-uranium moderation ratios that are less than 0.33, then the enrichment limits for the 10-MW freezer/sublimer may be increased slightly. The calculations performed also allow safe enrichment limits to be established for a 20-MW freezer/sublimer under moderation control.
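
    The moderation-control argument rests on simple atom-ratio arithmetic: hydrogen enters through HF mixed with the UF6, so the H/U atomic ratio is the molar ratio of HF to UF6. A toy check against the 0.33 limit cited above (inventories illustrative):

        moles_uf6 = 100.0   # UF6 inventory (mol), illustrative
        moles_hf = 28.0     # HF content (mol), illustrative

        h_to_u = moles_hf / moles_uf6   # one H per HF, one U per UF6
        limit = 0.33                    # moderation-control limit for the 10-MW vessel

        print(f"H/U = {h_to_u:.2f}:", "within limit" if h_to_u <= limit else "exceeds limit")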

  15. Radiography benchmark 2014

    Energy Technology Data Exchange (ETDEWEB)

    Jaenisch, G.-R., E-mail: Gerd-Ruediger.Jaenisch@bam.de; Deresch, A., E-mail: Gerd-Ruediger.Jaenisch@bam.de; Bellon, C., E-mail: Gerd-Ruediger.Jaenisch@bam.de [Federal Institute for Materials Research and Testing, Unter den Eichen 87, 12205 Berlin (Germany); Schumm, A.; Lucet-Sanchez, F.; Guerin, P. [EDF R and D, 1 avenue du Général de Gaulle, 92141 Clamart (France)

    2015-03-31

    The purpose of the 2014 WFNDEC RT benchmark study was to compare predictions of various models of radiographic techniques, in particular those that predict the contribution of scattered radiation. All calculations were carried out for homogenous materials and a mono-energetic X-ray point source in the energy range between 100 keV and 10 MeV. The calculations were to include the best physics approach available considering electron binding effects. Secondary effects like X-ray fluorescence and bremsstrahlung production were to be taken into account if possible. The problem to be considered had two parts. Part I examined the spectrum and the spatial distribution of radiation behind a single iron plate. Part II considered two equally sized plates, made of iron and aluminum respectively, only evaluating the spatial distribution. Here we present the results of above benchmark study, comparing them to MCNP as the assumed reference model. The possible origins of the observed deviations are discussed.

  16. Calculation of the critical overdensity in the spherical-collapse approximation

    Science.gov (United States)

    Herrera, D.; Waga, I.; Jorás, S. E.

    2017-03-01

    Critical overdensity δc is a key concept in estimating the number count of halos for different redshift and halo-mass bins, and therefore it is a powerful tool to compare cosmological models to observations. There are currently two different prescriptions in the literature for its calculation, namely, the differential-radius and the constant-infinity methods. In this work we show that the latter yields precise results only if we are careful in the definition of the so-called numerical infinities. Although the subtleties we point out are crucial ingredients for an accurate determination of δc both in general relativity and in any other gravity theory, we focus on f(R)-modified gravity models in the metric approach; in particular, we use the so-called large-field (F = 1/3) and small-field (F = 0) limits. For both of them, we calculate the relative errors (between our method and the others) in the critical density δc, in the comoving number density of halos per logarithmic mass interval, dn/dln M, and in the number of clusters at a given redshift in a given mass bin, N_bin, as functions of the redshift. We have also derived an analytical expression for the density contrast in the linear regime as a function of the collapse redshift z_c and Ω_m0 for any F.
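
    For orientation, in an Einstein-de Sitter background the linearly extrapolated critical overdensity has the closed form that any numerical prescription must recover:

        \delta_c = \frac{3}{20}\,(12\pi)^{2/3} \approx 1.686 ,

    with modified gravity and dark energy introducing redshift- and scale-dependent corrections to this value.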

  17. Critical groups vs. representative person: dose calculations due to predicted releases from USEXA

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, N.L.D., E-mail: nelson.luiz@ctmsp.mar.mil.br [Centro Tecnologico da Marinha (CTM/SP), Sao Paulo, SP (Brazil); Rochedo, E.R.R., E-mail: elainerochedo@gmail.com [Instituto de Radiprotecao e Dosimetria (lRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Mazzilli, B.P., E-mail: mazzilli@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    The critical group of the Centro Experimental Aramar (CEA) site was previously defined based on the effluent releases to the environment resulting from the facilities already operational at CEA. In this work, effective doses are calculated for members of the critical group considering the predicted potential uranium releases from the Uranium Hexafluoride Production Plant (USEXA). Basically, this work studies the behavior of the resulting doses in relation to the type of habit data used in the analysis, and two distinct situations are considered: (a) the utilization of average values obtained from official institutions (IBGE, IEA-SP, CNEN, IAEA) and from the literature; and (b) the utilization of the 95{sup th} percentile of the values derived from distributions fit to the obtained habit data. The first option corresponds to the way data were used for the definition of the critical group of CEA in former assessments, while the second corresponds to the use of data in deterministic assessments, as recommended by the ICRP for estimating doses to the so-called 'representative person'. (author)

  18. Three-Temperature MHD Calculation of the Critical Surface of Laser Absorption in Laser Induced Plasmas

    Science.gov (United States)

    Merkle Peterkin, Laurence D., Jr.

    1997-11-01

    The time-dependent location of the critical surface of laser absorption is studied numerically, using the general-purpose two-dimensional finite-difference MHD software MACH2. This software, which is based on an arbitrary Lagrangian-Eulerian fluid algorithm, includes models for partial laser absorption in underdense plasmas via inverse bremsstrahlung, as well as total laser absorption at a critical surface. The simulations conducted are of a laboratory experiment in which a plasma is generated by a mode-locked laser interacting with a solid copper target (G.K. Chawla and C.W. von Rosenberg, Jr., IEEE Conference Record --- Abstracts, 1997 IEEE International Conference on Plasma Science). The location of the critical surface is a function of the number density of free electrons. Consequently, calculations must carefully consider the energy budget. Because of large opacities in hot regions, a non-equilibrium radiation diffusion model is employed. Adequate energy conservation in such simulations is possible only with careful attention to numerical aspects, such as time steps and flux limits. Simulations are performed for both 90° and 45° incident beams. The former are carried out using both cylindrical and plane-parallel geometries, while the latter require a plane-parallel geometry.

  19. Medicare Program; Medicare Shared Savings Program; Accountable Care Organizations--Revised Benchmark Rebasing Methodology, Facilitating Transition to Performance-Based Risk, and Administrative Finality of Financial Calculations. Final rule.

    Science.gov (United States)

    2016-06-10

    Under the Medicare Shared Savings Program (Shared Savings Program), providers of services and suppliers that participate in an Accountable Care Organization (ACO) continue to receive traditional Medicare fee-for-service (FFS) payments under Parts A and B, but the ACO may be eligible to receive a shared savings payment if it meets specified quality and savings requirements. This final rule addresses changes to the Shared Savings Program, including: Modifications to the program's benchmarking methodology, when resetting (rebasing) the ACO's benchmark for a second or subsequent agreement period, to encourage ACOs' continued investment in care coordination and quality improvement; an alternative participation option to encourage ACOs to enter performance-based risk arrangements earlier in their participation under the program; and policies for reopening of payment determinations to make corrections after financial calculations have been performed and ACO shared savings and shared losses for a performance year have been determined.

  20. Refinement of the critical angle calculation for the contrast reversal of oil slicks under sunglint

    Science.gov (United States)

    Lu, Yingcheng; Sun, Shaojie; Zhang, Minwei; Murch, Brock; Hu, Chuanmin

    2016-01-01

    It has long been observed that oil slicks under sunglint can reverse their optical contrast against nearby oil-free seawater. Such a phenomenon has been described through both empirical statistical analysis of the sunglint strength and modeled theoretically using a critical angle concept. The critical angle, in this model, is the angle at which the image pixels show no or negligible contrast between oiled and nonoiled seawater. Pixels away from this critical angle show either positive or negative contrast from the oil-free pixels. Although this concept has been fully demonstrated in the published literature, its calculation needs to be further refined to take into account: (1) the different refractive indices of oil slicks (from natural seeps) and seawater and (2) atmospheric effects in the sensor-measured radiance. Using measurements from the Moderate Resolution Imaging Spectroradiometer (MODIS) over oil films in the Gulf of Mexico, we show improvement in the modeled and MODIS-derived reflectance over oil slicks originated from natural seeps after incorporating these two factors in the model. Specifically, agreement between modeled and measured sunglint reflectance is found for both negative and positive-contrasting oil slicks. These results indicate that surface roughness and reflectance from oil films can be estimated given any solar/viewing geometry and surface wind. Further, this model might be used to correct the sunglint effect on thick oil under similar illumination conditions. Once proven possible, it may allow existing laboratory-based models, which estimate oil thickness after such corrections, to be applied to remote sensing imagery.
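
    The refractive-index refinement enters through the Fresnel reflectance of the slick surface. At the near-nadir geometries typical of sunglint imagery, the normal-incidence form already shows the size of the oil/seawater contrast; the indices below are typical literature values, not those used in the paper.

        def fresnel_normal(n1: float, n2: float) -> float:
            """Normal-incidence Fresnel reflectance at an interface between n1 and n2."""
            return ((n2 - n1) / (n2 + n1)) ** 2

        n_air, n_water, n_oil = 1.00, 1.34, 1.47   # typical values, placeholders

        r_water = fresnel_normal(n_air, n_water)
        r_oil = fresnel_normal(n_air, n_oil)
        print(f"seawater: {r_water:.4f}, oil: {r_oil:.4f}, ratio: {r_oil / r_water:.2f}")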

  1. Criticality Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Alsaed

    2004-09-14

    The ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003) presents the methodology for evaluating potential criticality situations in the monitored geologic repository. As stated in the referenced Topical Report, the detailed methodology for performing the disposal criticality analyses will be documented in model reports. Many of the models developed in support of the Topical Report differ from the definition of models as given in the Office of Civilian Radioactive Waste Management procedure AP-SIII.10Q, ''Models'', in that they are procedural, rather than mathematical. These model reports document the detailed methodology necessary to implement the approach presented in the Disposal Criticality Analysis Methodology Topical Report and provide calculations utilizing the methodology. Thus, the governing procedure for this type of report is AP-3.12Q, ''Design Calculations and Analyses''. The ''Criticality Model'' is of this latter type, providing a process for evaluating the criticality potential of in-package and external configurations. The purpose of this analysis is to lay out the process for calculating the criticality potential for various in-package and external configurations and to calculate lower-bound tolerance limit (LBTL) values and determine range of applicability (ROA) parameters. The LBTL calculations and the ROA determinations are performed using selected benchmark experiments that are applicable to various waste forms and various in-package and external configurations. The waste forms considered in this calculation are pressurized water reactor (PWR), boiling water reactor (BWR), Fast Flux Test Facility (FFTF), Training Research Isotope General Atomic (TRIGA), Enrico Fermi, Shippingport pressurized water reactor, Shippingport light water breeder reactor (LWBR), N-Reactor, Melt and Dilute, and Fort Saint Vrain Reactor spent nuclear fuel (SNF). The scope of
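    As a rough sketch of what an LBTL computation can look like, the following applies a one-sided normal tolerance limit to a set of hypothetical benchmark keff results. The actual methodology in the model report also involves normality testing and trending against ROA parameters, so this is illustrative only.

```python
import numpy as np
from scipy import stats

def one_sided_tolerance_factor(n, p=0.95, conf=0.95):
    """One-sided normal tolerance factor K such that, with confidence `conf`,
    at least a fraction `p` of the population lies above mean - K*s
    (exact for normal data, via the noncentral t distribution)."""
    zp = stats.norm.ppf(p)
    nct = stats.nct(df=n - 1, nc=zp * np.sqrt(n))
    return nct.ppf(conf) / np.sqrt(n)

# Hypothetical calculated k_eff values for applicable benchmark experiments
keff = np.array([0.9978, 1.0012, 0.9991, 1.0005, 0.9969, 0.9984, 1.0001])
mean, s = keff.mean(), keff.std(ddof=1)
K = one_sided_tolerance_factor(len(keff))
lbtl = mean - K * s
print(f"mean={mean:.4f}  s={s:.4f}  K={K:.3f}  LBTL={lbtl:.4f}")
```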

  2. Reliable Real-time Calculation of Heart-rate Complexity in Critically Ill Patients Using Multiple Noisy Waveform Sources

    Science.gov (United States)

    2014-01-01

    related metrics for detecting sepsis and multiorgan failure, improvement of HRC calculations may help detect significant changes from baseline values...calculations. Equivalence tests between mean HRC values derived from manually verified sequences and those derived from automatically detected peaks...assessment of HRC in critically ill patients. Keywords: Signal detection analysis; Electrocardiography; Heart rate; Clinical decision support
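    The abstract does not specify which complexity metric is used; sample entropy is one common choice for heart-rate complexity, and the following minimal sketch (synthetic R-R intervals, conventional parameter choices m = 2, r = 0.2·SD) illustrates the kind of calculation involved.

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy of a 1-D series: -ln(A/B), where B and A count template
    matches (Chebyshev distance <= r) of lengths m and m+1, self-matches excluded."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    N = len(x)
    def count(templates):
        c = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += np.sum(d <= r)
        return c
    tm = np.array([x[i:i + m] for i in range(N - m)])
    tm1 = np.array([x[i:i + m + 1] for i in range(N - m)])
    B, A = count(tm), count(tm1)
    return -np.log(A / B) if A and B else float("inf")

# Synthetic R-R interval series in seconds (stand-in for real ECG-derived data)
rr = np.random.default_rng(0).normal(0.8, 0.05, 300)
print(f"SampEn(m=2) = {sample_entropy(rr):.3f}")
```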

  3. Quantitative benchmark - Production companies

    DEFF Research Database (Denmark)

    Sørensen, Ole H.; Andersen, Vibeke

    Report presenting the results of the quantitative benchmark of the production companies in the VIPS project.

  4. Benchmarking in Student Affairs.

    Science.gov (United States)

    Mosier, Robert E.; Schwarzmueller, Gary J.

    2002-01-01

    Discusses the use of benchmarking in student affairs, focusing on issues related to student housing. Provides examples of how benchmarking has influenced administrative practice at many institutions. (EV)

  5. Radiation Detection Computational Benchmark Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing the operational performance of radiation detection systems. This can, however, result in large and complex scenarios which are time-consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations was assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for

  6. Modeling of the ORNL PCA Benchmark Using SCALE6.0 Hybrid Deterministic-Stochastic Methodology

    Directory of Open Access Journals (Sweden)

    Mario Matijević

    2013-01-01

    Full Text Available Revised guidelines with the support of computational benchmarks are needed for the regulation of the allowed neutron irradiation to reactor structures during power plant lifetime. Currently, US NRC Regulatory Guide 1.190 is the effective guideline for reactor dosimetry calculations. A well known international shielding database SINBAD contains large selection of models for benchmarking neutron transport methods. In this paper a PCA benchmark has been chosen from SINBAD for qualification of our methodology for pressure vessel neutron fluence calculations, as required by the Regulatory Guide 1.190. The SCALE6.0 code package, developed at Oak Ridge National Laboratory, was used for modeling of the PCA benchmark. The CSAS6 criticality sequence of the SCALE6.0 code package, which includes KENO-VI Monte Carlo code, as well as MAVRIC/Monaco hybrid shielding sequence, was utilized for calculation of equivalent fission fluxes. The shielding analysis was performed using multigroup shielding library v7_200n47g derived from general purpose ENDF/B-VII.0 library. As a source of response functions for reaction rate calculations with MAVRIC we used international reactor dosimetry libraries (IRDF-2002 and IRDF-90.v2 and appropriate cross-sections from transport library v7_200n47g. The comparison of calculational results and benchmark data showed a good agreement of the calculated and measured equivalent fission fluxes.

  7. Validation of KENO V. a. and two cross-section libraries for criticality calculations of low-enriched uranium systems

    Energy Technology Data Exchange (ETDEWEB)

    Easter, M.E.

    1985-07-01

    The SCALE code system, utilizing the Monte Carlo computer code KENO V.a, was employed to calculate 37 critical experiments. The critical assemblies had 235U enrichments of 5% or less and cover a variety of geometries and materials. Values of keff were calculated using two different cross-section libraries, with comparable results obtained using either of the cross-section libraries. The 16-energy-group Hansen-Roach and the 27-energy-group ENDF/B-IV cross-section libraries, available in SCALE, were used in this validation study, and both give good results for the experiments considered. It is concluded that the code and cross sections are adequate for low-enriched uranium systems and that reliable criticality safety calculations can be made for such systems provided the limits of validated applicability are not exceeded.

  8. FRIB driver linac vacuum model and benchmarks

    CERN Document Server

    Durickovic, Bojan; Kersevan, Roberto; Machicoane, Guillaume

    2014-01-01

    The Facility for Rare Isotope Beams (FRIB) is a superconducting heavy-ion linear accelerator that is to produce rare isotopes far from stability for low energy nuclear science. In order to achieve this, its driver linac needs to achieve a very high beam current (up to 400 kW beam power), and this requirement makes vacuum levels of critical importance. Vacuum calculations have been carried out to verify that the vacuum system design meets the requirements. The modeling procedure was benchmarked by comparing models of an existing facility against measurements. In this paper, we present an overview of the methods used for FRIB vacuum calculations and simulation results for some interesting sections of the accelerator.

  9. Accelerator-driven sub-critical research facility with low-enriched fuel in lead matrix: Neutron flux calculation

    Directory of Open Access Journals (Sweden)

    Avramović Ivana

    2007-01-01

    Full Text Available The H5B is a concept of an accelerator-driven sub-critical research facility (ADSRF) being developed over the last couple of years at the Vinča Institute of Nuclear Sciences, Belgrade, Serbia. Using well-known computer codes, the MCNPX and MCNP, this paper deals with the results of a target study and neutron flux calculations in the sub-critical core. The neutron source is generated by an interaction of a proton or deuteron beam with the target placed inside the sub-critical core. The results of the total neutron flux density escaping the target and calculations of neutron yields for different target materials are also given here. Neutrons escaping the target volume with the group spectra (first step) are used to specify a neutron source for further numerical simulations of the neutron flux density in the sub-critical core (second step). The results of the calculations of the neutron effective multiplication factor keff and neutron generation time L for the ADSRF model have also been presented. Neutron spectra calculations for an ADSRF with an uranium target (highest values of the neutron yield) for the selected sub-critical core cells for both beams have also been presented in this paper.

  10. On the isotropic Raman spectrum of Ar{sub 2} and how to benchmark ab initio calculations of small atomic clusters: Paradox lost

    Energy Technology Data Exchange (ETDEWEB)

    Chrysos, Michael, E-mail: michel.chrysos@univ-angers.fr; Rachet, Florent [LUNAM Université, Université d’Angers, CNRS UMR 6200, Laboratoire MOLTECH-Anjou, 2 Bd Lavoisier, 49045 Angers (France); Dixneuf, Sophie [Centre du Commissariat à l’Énergie Atomique de Grenoble, Laboratoire CEA-bioMérieux, Bât 40.20, 17 rue des Martyrs, 38054 Grenoble (France)

    2015-07-14

    This is the long-overdue answer to the discrepancies observed between theory and experiment in Ar{sub 2} regarding both the isotropic Raman spectrum and the second refractivity virial coefficient, B{sub R} [Gaye et al., Phys. Rev. A 55, 3484 (1997)]. At the origin of this progress is the advent (posterior to 1997) of advanced computational methods for weakly interconnected neutral species at close separations. Here, we report agreement between the previously taken Raman measurements and quantum lineshapes now computed with the use of large-scale CCSD or smartly constructed MP2 induced-polarizability data. By using these measurements as a benchmark tool, we assess the degree of performance of various other ab initio computed data for the mean polarizability α, and we show that an excellent agreement with the most recently measured value of B{sub R} is reached. We propose an even more refined model for α, which is solution of the inverse-scattering problem and whose lineshape matches exactly the measured spectrum over the entire frequency-shift range probed.

  11. Benchmarking in ICT

    OpenAIRE

    Blecher, Jan

    2009-01-01

    The aim of this paper is to describe the benefits of benchmarking IT in a wider context and the scope of benchmarking in general. I specify benchmarking as a process and mention basic rules and guidelines. Further, I define IT benchmarking domains and describe possibilities for their use. The best-known type of IT benchmark is the cost benchmark, which represents only a subset of benchmarking opportunities. In this paper, the cost benchmark is rather a first step toward benchmarking's contribution to the company. IT benchmark...

  12. ICSBEP Benchmarks For Nuclear Data Applications

    Science.gov (United States)

    Briggs, J. Blair

    2005-05-01

    The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was initiated in 1992 by the United States Department of Energy. The ICSBEP became an official activity of the Organization for Economic Cooperation and Development (OECD) — Nuclear Energy Agency (NEA) in 1995. Representatives from the United States, United Kingdom, France, Japan, the Russian Federation, Hungary, Republic of Korea, Slovenia, Serbia and Montenegro (formerly Yugoslavia), Kazakhstan, Spain, Israel, Brazil, Poland, and the Czech Republic are now participating. South Africa, India, China, and Germany are considering participation. The purpose of the ICSBEP is to identify, evaluate, verify, and formally document a comprehensive and internationally peer-reviewed set of criticality safety benchmark data. The work of the ICSBEP is published as an OECD handbook entitled "International Handbook of Evaluated Criticality Safety Benchmark Experiments." The 2004 Edition of the Handbook contains benchmark specifications for 3331 critical or subcritical configurations that are intended for use in validation efforts and for testing basic nuclear data. New to the 2004 Edition of the Handbook is a draft criticality alarm / shielding type benchmark that should be finalized in 2005 along with two other similar benchmarks. The Handbook is being used extensively for nuclear data testing and is expected to be a valuable resource for code and data validation and improvement efforts for decades to come. Specific benchmarks that are useful for testing structural materials such as iron, chromium, nickel, and manganese; beryllium; lead; thorium; and 238U are highlighted.

  13. Benchmarking concentrating photovoltaic systems

    Science.gov (United States)

    Duerr, Fabian; Muthirayan, Buvaneshwari; Meuret, Youri; Thienpont, Hugo

    2010-08-01

    Integral to photovoltaics is the need to provide improved economic viability. To achieve this goal, photovoltaic technology has to be able to harness more light at less cost. A large variety of concentrating photovoltaic concepts has been proposed and provides cause for pursuit. To obtain a detailed profitability analysis, a flexible evaluation is crucial for benchmarking the cost-performance of this variety of concentrating photovoltaic concepts. To save time and capital, a way to estimate the cost-performance of a complete solar energy system is to use computer-aided modeling. In this work a benchmark tool is introduced based on a modular programming concept. The overall implementation is done in MATLAB, whereas the Advanced Systems Analysis Program (ASAP) is used for ray tracing calculations. This allows for a flexible and extendable structuring of all important modules, namely an advanced source modeling including time and local dependence, and an advanced optical system analysis of various optical designs to obtain an evaluation of the figure of merit. An important figure of merit, the energy yield for a given photovoltaic system at a geographical position over a specific period, can be calculated.
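    A minimal sketch of the energy-yield figure of merit described above, using a toy irradiance profile and assumed aperture and efficiency values; the real tool uses ASAP ray tracing and location-dependent source models.

```python
import numpy as np

hours = np.arange(24)
# Toy direct-normal-irradiance profile in W/m^2 (assumed, not measured data)
dni = np.clip(900 * np.sin(np.pi * (hours - 6) / 12), 0, None)
aperture_m2 = 1.0   # concentrator aperture area (assumed)
eta_optics = 0.85   # optical efficiency of the concentrator (assumed)
eta_cell = 0.38     # multi-junction cell efficiency (assumed)

power_w = dni * aperture_m2 * eta_optics * eta_cell
energy_wh = np.trapz(power_w, hours)  # integrate W over hours -> watt-hours
print(f"Daily energy yield: {energy_wh / 1000:.2f} kWh per m^2 of aperture")
```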

  14. DSP Platform Benchmarking

    OpenAIRE

    Xinyuan, Luo

    2009-01-01

    Benchmarking of DSP kernel algorithms was conducted in the thesis on a DSP processor for teaching in the course TESA26 in the department of Electrical Engineering. It includes benchmarking on cycle count and memory usage. The goal of the thesis is to evaluate the quality of a single MAC DSP instruction set and provide suggestions for further improvement in instruction set architecture accordingly. The scope of the thesis is limited to benchmark the processor only based on assembly coding. The...

  15. Verification of ARES transport code system with TAKEDA benchmarks

    Science.gov (United States)

    Zhang, Liang; Zhang, Bin; Zhang, Penghe; Chen, Mengteng; Zhao, Jingchang; Zhang, Shun; Chen, Yixue

    2015-10-01

    Neutron transport modeling and simulation are central to many areas of nuclear technology, including reactor core analysis, radiation shielding and radiation detection. In this paper the series of TAKEDA benchmarks are modeled to verify the criticality calculation capability of ARES, a discrete ordinates neutral particle transport code system. The SALOME platform is coupled with ARES to provide geometry modeling and mesh generation functions. The Koch-Baker-Alcouffe parallel sweep algorithm is applied to accelerate the traditional transport calculation process. The results show that the eigenvalues calculated by ARES are in excellent agreement with the reference values presented in NEACRP-L-330, with a difference of less than 30 pcm except for the first case of model 3. Additionally, ARES provides accurate flux distributions compared to reference values, with a deviation of less than 2% for region-averaged fluxes in all cases. All of this confirms the feasibility of the ARES-SALOME coupling and demonstrates that ARES performs well in criticality calculations.
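    For reference, the pcm comparison quoted above amounts to a simple scaled eigenvalue difference; a tiny sketch with hypothetical eigenvalues (not values from the paper):

```python
def diff_pcm(k_calc, k_ref):
    """Eigenvalue difference in pcm (1 pcm = 1e-5 in k). Some authors instead
    quote the reactivity difference (k_calc - k_ref) / (k_calc * k_ref)."""
    return (k_calc - k_ref) * 1e5

# Hypothetical eigenvalues for one TAKEDA case
print(f"difference: {diff_pcm(0.97785, 0.97770):+.1f} pcm")
```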

  16. Benchmarking of human resources management

    Directory of Open Access Journals (Sweden)

    David M. Akinnusi

    2008-12-01

    Full Text Available This paper reviews the role of human resource management (HRM which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.

  17. Super-Phenix benchmark used for comparison of PNC and CEA calculation methods, and of JENDL-3.2 and CARNAVAL IV nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, S.N. [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1998-02-01

    The study was carried out within the framework of the PNC-CEA collaboration agreement. Data were provided, by CEA, for an experimental loading of a start-up core in Super-Phenix. This data was used at PNC to produce core flux snapshot calculations. CEA undertook a comparison of the PNC results with the equivalent calculations carried out by CEA, and also with experimental measurements from SPX. The results revealed a systematic radial flux tilt between the calculations and the reactor measurements, with the PNC tilts only {approx}30-40% of those from CEA. CEA carried out an analysis of the component causes of the radial tilt. It was concluded that a major cause of radial tilt differences between the PNC and CEA calculations lay in the nuclear datasets used: JENDL-3.2 and CARNAVAL IV. For the final stage of the study, PNC undertook a sensitivity analysis, to examine the detailed differences between the two sets of nuclear data. The sensitivity analysis showed that a relatively small number of nuclear data items contributed the bulk of the radial tilt difference between calculations with JENDL-3.2 and with CARNAVAL IV. A direct comparison between JENDL-3.2 and CARNAVAL IV data revealed the following. The Nu values showed little difference. The only large fission cross-section differences were at low energy. Although down-scattering reactions showed some large fractional differences, absolute differences were negligible compared with in-group scattering; for in-group scattering fractional differences were up to {approx}75%, but generally <20%. There were many large differences in capture cross-sections, generally {approx}30-200%. (J.P.N.)

  18. Aquatic Life Benchmarks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...

  19. Benchmarking a DSP processor

    OpenAIRE

    Lennartsson, Per; Nordlander, Lars

    2002-01-01

    This Master thesis describes the benchmarking of a DSP processor. Benchmarking means measuring the performance in some way. In this report, we have focused on the number of instruction cycles needed to execute certain algorithms. The algorithms we have used in the benchmark are all very common in signal processing today. The results we have reached in this thesis have been compared to benchmarks for other processors, performed by Berkeley Design Technology, Inc. The algorithms were programm...

  20. 47 CFR 69.108 - Transport rate benchmark.

    Science.gov (United States)

    2010-10-01

    ... with this subpart, the DS3-to-DS1 benchmark ratio shall be calculated as follows: the telephone company... benchmark ratio of 9.6 to 1 or higher. (c) If a telephone company's initial transport rates are based on...

  1. Revaluering benchmarking - A topical theme for the construction industry

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2011-01-01

    Over the past decade, benchmarking has increasingly gained foothold in the construction industry. The predominant research, perceptions and uses of benchmarking are valued so strongly and uniformly that what may seem valuable is actually deterring researchers and practitioners from studying an...... organizational relations, behaviors and actions. In closing, it is briefly considered how to study the calculative practices of benchmarking.

  2. Benchmark calculations of excess electrons in water cluster cavities: balancing the addition of atom-centered diffuse functions versus floating diffuse functions.

    Science.gov (United States)

    Zhang, Changzhe; Bu, Yuxiang

    2016-09-14

    Diffuse functions have been proved to be especially crucial for the accurate characterization of excess electrons which are usually bound weakly in intermolecular zones far away from the nuclei. To examine the effects of diffuse functions on the nature of the cavity-shaped excess electrons in water cluster surroundings, both the HOMO and LUMO distributions, vertical detachment energies (VDEs) and visible absorption spectra of two selected (H2O)24(-) isomers are investigated in the present work. Two main types of diffuse functions are considered in calculations including the Pople-style atom-centered diffuse functions and the ghost-atom-based floating diffuse functions. It is found that augmentation of atom-centered diffuse functions contributes to a better description of the HOMO (corresponding to the VDE convergence), in agreement with previous studies, but also leads to unreasonable diffuse characters of the LUMO with significant red-shifts in the visible spectra, which is against the conventional point of view that the more the diffuse functions, the better the results. The issue of designing extra floating functions for excess electrons has also been systematically discussed, which indicates that the floating diffuse functions are necessary not only for reducing the computational cost but also for improving both the HOMO and LUMO accuracy. Thus, the basis sets with a combination of partial atom-centered diffuse functions and floating diffuse functions are recommended for a reliable description of the weakly bound electrons. This work presents an efficient way for characterizing the electronic properties of weakly bound electrons accurately by balancing the addition of atom-centered diffuse functions and floating diffuse functions and also by balancing the computational cost and accuracy of the calculated results, and thus is very useful in the relevant calculations of various solvated electron systems and weakly bound anionic systems.

  3. Calculation of critical fault recovery time for nonlinear systems based on region of attraction analysis

    DEFF Research Database (Denmark)

    Tabatabaeipour, Mojtaba; Blanke, Mogens

    2014-01-01

    of a system. It must be guaranteed that the trajectory of a system subject to fault remains in the region of attraction (ROA) of the post-fault system during this time. This paper proposes a new algorithm to compute the critical fault recovery time for nonlinear systems with polynomial vector fields using sum...
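    A toy illustration of the idea, assuming the ROA of the post-fault system is already certified by a Lyapunov level set. The paper computes such certificates for polynomial systems with sum-of-squares programming; here a one-state system with a known analytic ROA stands in, and the fault effect is an assumed constant drift.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Post-fault system xdot = -x + x^3 has region of attraction |x| < 1 around
# the origin, certified by V(x) = x^2 with any sublevel set V <= c, c < 1.
c = 0.9

def faulty(t, x):
    # Dynamics while the fault is active: constant drift away from the
    # equilibrium (an assumed fault effect, for illustration only).
    return [0.5]

sol = solve_ivp(faulty, (0.0, 5.0), [0.0], dense_output=True, max_step=0.01)
ts = np.linspace(0.0, 5.0, 2001)
V = sol.sol(ts)[0] ** 2
inside = V <= c
# Critical fault recovery time: last instant at which the faulty trajectory
# is still inside the certified ROA of the post-fault system.
t_crit = ts[inside][-1] if inside.any() else 0.0
print(f"critical fault recovery time ~ {t_crit:.3f} s")
```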

  4. Calculations of critical micelle concentration by dissipative particle dynamics simulations: the role of chain rigidity.

    Science.gov (United States)

    Lee, Ming-Tsung; Vishnyakov, Aleksey; Neimark, Alexander V

    2013-09-05

    Micelle formation in surfactant solutions is a self-assembly process governed by complex interplay of solvent-mediated interactions between hydrophilic and hydrophobic groups, which are commonly called heads and tails. However, the head-tail repulsion is not the only factor affecting the micelle formation. For the first time, we present a systematic study of the effect of chain rigidity on critical micelle concentration and micelle size, which is performed with the dissipative particle dynamics simulation method. Rigidity of the coarse-grained surfactant molecule was controlled by the harmonic bonds set between the second-neighbor beads. Compared to flexible molecules with the nearest-neighbor bonds being the only type of bonded interactions, rigid molecules exhibited a lower critical micelle concentration and formed larger and better-defined micelles. By varying the strength of head-tail repulsion and the chain rigidity, we constructed two-dimensional diagrams presenting how the critical micelle concentration and aggregation number depend on these parameters. We found that the solutions of flexible and rigid molecules that exhibited approximately the same critical micelle concentration could differ substantially in the micelle size and shape depending on the chain rigidity. With the increase of surfactant concentration, primary micelles of more rigid molecules were found less keen to agglomeration and formation of nonspherical aggregates characteristic of flexible molecules.

  5. A critical assessment of the calculation and analysis of thermodynamic parameters from adsorption data

    OpenAIRE

    2015-01-01

    Proper analysis of thermodynamic parameters obtained from adsorption data is a basic requirement for the characterization and optimization of an adsorption-dependent process like the action of organic corrosion inhibitors. Thus, this work aims at presenting a critical assessment of typical flawed examples from the literature together with alternative good practice to be considered, for preference.
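    One recurring issue in this literature is the use of a dimensioned equilibrium constant inside a logarithm. A minimal sketch of a commonly recommended correction (making a Langmuir constant dimensionless with the molar concentration of water, roughly 55.5 mol/L) with hypothetical numbers:

```python
import math

R = 8.314     # gas constant, J/(mol K)
T = 298.15    # temperature, K
K_L = 2.0e4   # Langmuir constant in L/mol (hypothetical fit value)

# Pitfall: ln() of a quantity with units. One accepted fix is to make K
# dimensionless using the molar concentration of water (~55.5 mol/L).
K_dimensionless = K_L * 55.5
dG = -R * T * math.log(K_dimensionless)
print(f"Standard adsorption free energy ~ {dG / 1000:.1f} kJ/mol")
```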

  6. Calculation and mapping of Critical Thresholds in Europe: Status Report 1995

    NARCIS (Netherlands)

    Posch M; Smet PAM de; Hettelingh JP; Downing RJ; MNV

    1995-01-01

    The present aim of the Coordination Center for Effects (CCE) is to give scientific and technical support to the development of the critical loads and level methodology, in collaboration with the Programme Centres under the Convention on Long-Range Transboundary Air Pollution. Earlier CCE reports con

  7. Electronically Excited States of Vitamin B12: Benchmark Calculations Including Time-Dependent Density Functional Theory and Correlated Ab Initio Methods

    CERN Document Server

    Kornobis, Karina; Wong, Bryan M; Lodowski, Piotr; Jaworska, Maria; Andruniów, Tadeusz; Rudd, Kenneth; Kozlowski, Pawel M; 10.1021/jp110914y

    2011-01-01

    Time-dependent density functional theory (TD-DFT) and correlated ab initio methods have been applied to the electronically excited states of vitamin B12 (cyanocobalamin or CNCbl). Different experimental techniques have been used to probe the excited states of CNCbl, revealing many issues that remain poorly understood from an electronic structure point of view. Due to its efficient scaling with size, TD-DFT emerges as one of the most practical tools that can be used to predict the electronic properties of these fairly complex molecules. However, the description of excited states is strongly dependent on the type of functional used in the calculations. In the present contribution, the choice of a proper functional for vitamin B12 was evaluated in terms of its agreement with both experimental results and correlated ab initio calculations. Three different functionals, i.e. B3LYP, BP86, and LC-BLYP, were tested. In addition, the effect of relative contributions of DFT and HF to the exchange-correlation functional ...

  8. Geothermal Heat Pump Benchmarking Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1997-01-17

    A benchmarking study was conducted on behalf of the Department of Energy to determine the critical factors in successful utility geothermal heat pump programs. A Successful program is one that has achieved significant market penetration. Successfully marketing geothermal heat pumps has presented some major challenges to the utility industry. However, select utilities have developed programs that generate significant GHP sales. This benchmarking study concludes that there are three factors critical to the success of utility GHP marking programs: (1) Top management marketing commitment; (2) An understanding of the fundamentals of marketing and business development; and (3) An aggressive competitive posture. To generate significant GHP sales, competitive market forces must by used. However, because utilities have functioned only in a regulated arena, these companies and their leaders are unschooled in competitive business practices. Therefore, a lack of experience coupled with an intrinsically non-competitive culture yields an industry environment that impedes the generation of significant GHP sales in many, but not all, utilities.

  9. Comparison of statistical evaluation of criticality calculations for reactors VENUS-F and ALFRED

    Directory of Open Access Journals (Sweden)

    Janczyszyn Jerzy

    2017-01-01

    Full Text Available Limitations of the correct evaluation of keff in Monte Carlo calculations, claimed in the literature, need to be addressed more thoroughly, apart from the nuclear data uncertainty. Respective doubts concern: the proper number of discarded initial cycles, the sufficient number of neutrons in a cycle, and the recognition and handling of the keff bias. Calculations were performed to provide more information on these points with the use of the MCB code, solely for fast cores. We present the applied methods and results, such as: calculation results for stability of variance, the relation between the standard deviation reported by MCNP and that obtained from the dispersion of multiple independent keff values, and second-order standard deviations obtained from different numbers of grouped results. All results obtained for numbers of discarded initial cycles from 0 to 3000 were analysed, leading to interesting conclusions.
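    A small sketch of the two comparisons mentioned, using synthetic keff values in place of real MCB/MCNP runs; the reported single-run sigma is an assumed number for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical k_eff estimates from 100 independent runs of the same problem
# with different seeds; replace with real per-run values.
keff = rng.normal(1.00000, 0.00020, size=100)

sigma_single = 0.00015                 # sigma reported by the code for one run (assumed)
sigma_disp = keff.std(ddof=1)          # sigma from the dispersion of independent runs
sigma_mean = sigma_disp / np.sqrt(len(keff))
print(f"reported sigma {sigma_single:.5f} vs dispersion sigma {sigma_disp:.5f}")
print(f"sigma of the grand mean: {sigma_mean:.6f}")

# Second-order statistics: group the runs and look at the spread of group means.
for g in (5, 10, 20):
    means = keff.reshape(-1, g).mean(axis=1)
    print(f"group size {g:2d}: std of group means = {means.std(ddof=1):.5f}")
```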

  10. The graphics calculator in mathematics education: A critical review of recent research

    Science.gov (United States)

    Penglase, Marina; Arnold, Stephen

    1996-04-01

    The graphics calculator, sometimes referred to as the "super calculator," has sparked great interest among mathematics educators. Considered by many to be a tool which has the potential to revolutionise mathematics education, a significant amount of research has been conducted into its effectiveness as a tool for instruction and learning within precalculus and calculus courses, specifically in the study of functions, graphing and modelling. Some results suggest that these devices (a) can facilitate the learning of functions and graphing concepts and the development of spatial visualisation skills; (b) promote mathematical investigation and exploration; and (c) encourage a shift in emphasis from algebraic manipulation and proof to graphical investigation and examination of the relationship between graphical, algebraic and geometric representations. Other studies, however, indicate that there is still a need for manipulative techniques in the learning of function and graphing concepts, that the use of graphics calculators may not facilitate the learning of particular precalculus topics, and that some "de-skilling" may occur, especially among males. It is the contention of this paper, however, that much of the research in this new and important field fails to provide clear guidance or even to inform debate in adequate ways regarding the role of graphics calculators in mathematics teaching and learning. By failing to distinguish the role of the tool from that of the instructional process, many studies reviewed could be more appropriately classified as "program evaluations" rather than as research on the graphics calculator per se. Further, claims regarding the effectiveness of the graphics calculator as a tool for learning frequently fail to recognise that judgments of effectiveness result directly from existing assumptions regarding both assessment practice and student "achievement."

  11. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  12. Combining active-space coupled-cluster methods with moment energy corrections via the CC(P;Q) methodology, with benchmark calculations for biradical transition states

    Science.gov (United States)

    Shen, Jun; Piecuch, Piotr

    2012-04-01

    We have recently suggested the CC(P;Q) methodology that can correct energies obtained in the active-space coupled-cluster (CC) or equation-of-motion (EOM) CC calculations, which recover much of the nondynamical and some dynamical electron correlation effects, for the higher-order, mostly dynamical, correlations missing in the active-space CC/EOMCC considerations. It is shown that one can greatly improve the description of biradical transition states, both in terms of the resulting energy barriers and total energies, by combining the CC approach with singles, doubles, and active-space triples, termed CCSDt, with the CC(P;Q)-style correction due to missing triple excitations defining the CC(t;3) approximation.

  13. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    Full Text Available In the face of global competition and rising challenges that higher education institutions (HEIs meet, it is imperative to increase innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess institution’s competitive position and learn from the best in order to improve. The primary purpose of the paper is to present in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating premises of using benchmarking in HEIs. It also contains detailed examination of types, approaches and scope of benchmarking initiatives. The thorough insight of benchmarking applications enabled developing classification of benchmarking undertakings in HEIs. The paper includes review of the most recent benchmarking projects and relating them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity. The presented examples were chosen in order to exemplify different approaches to benchmarking in higher education setting. The study was performed on the basis of the published reports from benchmarking projects, scientific literature and the experience of the author from the active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.

  14. Implementation of CTRLPOS, a VENTURE module for control rod position criticality searches, control rod worth curve calculations, and general criticality searches

    Energy Technology Data Exchange (ETDEWEB)

    Smith, L.A.; Renier, J.P.

    1994-06-01

    A module in the VENTURE reactor analysis code system, CTRLPOS, is developed to position control rods and perform control rod position criticality searches. The module is variably dimensioned so that calculations can be performed with any number of control rod banks each having any number of control rods. CTRLPOS can also calculate control rod worth curves for a single control rod or a bank of control rods. Control rod depletion can be calculated to provide radiation source terms. These radiation source terms can be used to predict radiation doses to personnel and estimate the shielding and long-term storage requirements for spent control rods. All of these operations are completely automated. The numerous features of the module are discussed in detail. The necessary input data for the CTRLPOS module is explained. Several sample problems are presented to show the flexibility of the module. The results presented with the sample problems show that the CTRLPOS module is a powerful tool which allows a wide variety of calculations to be easily performed.
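    The core of a control rod position criticality search is a one-dimensional root search on keff versus rod position. A minimal sketch with a hypothetical, monotone keff(z) standing in for the full VENTURE flux solution; the function and its constants are illustrative, not the module's actual search logic.

```python
def keff_of_rod_position(z):
    """Hypothetical monotone k_eff versus rod insertion fraction z in [0, 1].
    In CTRLPOS each evaluation would be a full VENTURE flux solution."""
    return 1.05 - 0.12 * z

def search_critical_position(k_target=1.0, tol=1e-5, lo=0.0, hi=1.0):
    """Bisection on rod position until |k_eff - k_target| < tol."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        k = keff_of_rod_position(mid)
        if abs(k - k_target) < tol:
            break
        if k > k_target:
            lo = mid   # still supercritical: insert the bank further
        else:
            hi = mid   # subcritical: withdraw the bank
    return mid, k

z, k = search_critical_position()
print(f"critical insertion fraction ~ {z:.4f} (k_eff = {k:.5f})")
```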

  15. Benchmarking of Decay Heat Measured Values of ITER Materials Induced by 14 MeV Neutron Activation with Calculated Results by ACAB Activation Code

    Energy Technology Data Exchange (ETDEWEB)

    Tore, C.; Ortego, P.; Rodriguez Rivada, A.

    2014-07-01

    The aim of this paper is the comparison between the calculated and measured decay heat of material samples which were irradiated at the Fusion Neutron Source of JAERI in Japan with D-T production of 14 MeV neutrons. In the International Thermonuclear Experimental Reactor (ITER), neutron activation of the structural material will result in a source of heat after shutdown of the reactor. The estimation of decay heat with qualified codes and nuclear data is an important input for the safety analyses of fusion reactors against loss-of-coolant accidents. When a loss-of-coolant and/or loss-of-flow accident happens, plasma-facing components are heated by decay heat. If the temperature of the components exceeds the allowable temperature, the accident could escalate and compromise the integrity of ITER. Decay heat prediction uncertainties of less than 15% are strongly requested by the ITER designers. Additionally, accurate decay heat prediction is required for making reasonable shutdown scenarios of ITER. (Author)
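    Decay heat after shutdown is, at its core, an activation-inventory summation. The sketch below evaluates P(t) = Σ N_i λ_i E_i exp(-λ_i t) for two hypothetical activation products; the inventories and decay energies are illustrative placeholders, not ITER or ACAB data.

```python
import numpy as np

nuclides = {
    # name: (N_atoms at shutdown, half-life in s, decay energy in J per decay)
    "Co-58": (1.0e18, 70.86 * 86400, 1.6e-13),  # inventory and energy assumed
    "Mn-56": (5.0e17, 2.579 * 3600, 4.0e-13),   # inventory and energy assumed
}

def decay_heat(t_s):
    """Decay heat in watts at time t_s after shutdown."""
    p = 0.0
    for n0, t_half, e in nuclides.values():
        lam = np.log(2) / t_half
        p += n0 * np.exp(-lam * t_s) * lam * e
    return p

for t in (0.0, 3600.0, 86400.0):
    print(f"t = {t:8.0f} s: decay heat = {decay_heat(t):.3e} W")
```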

  16. CRITICALITY CALCULATION FOR THE MOST REACTIVE DEGRADED CONFIGURATIONS OF THE FFTF SNF CODISPOSAL WP CONTAINING AN INTACT IDENT-69 CONTAINER

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Moscalu

    2002-08-28

    The objective of this calculation is to perform additional degraded mode criticality evaluations of the Department of Energy's (DOE) Fast Flux Test Facility (FFTF) Spent Nuclear Fuel (SNF) codisposed in a 5-Defense High-Level Waste (5-DHLW) Waste Package (WP). The scope of this calculation is limited to the most reactive degraded configurations of the codisposal WP with an almost intact Ident-69 container (breached and flooded but otherwise non-degraded) containing intact FFTF SNF pins. The configurations have been identified in a previous analysis (CRWMS M&O 1999a) and the present evaluations include additional relevant information that was left out of the original calculations. The additional information describes the exact distribution of fissile material in each container (DOE 2002a). The effects of the changes that have been included in the baseline design of the codisposal WP (CRWMS M&O 2000) are also investigated. The calculation determines the effective neutron multiplication factor (k{sub eff}) for selected degraded mode internal configurations of the codisposal waste package. These calculations will support the demonstration of the technical viability of the design solution adopted for disposing of MOX (FFTF) spent nuclear fuel in the potential repository. This calculation is subject to the Quality Assurance Requirements and Description (QARD) (DOE 2002b) per the activity evaluation under work package number P6212310M2 in the technical work plan TWP-MGR-MD-000010 REV 01 (BSC 2002).

  17. Notes on Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities

    OpenAIRE

    Hyun-Kyung Chung; Per Jönsson; Alexander Kramida

    2013-01-01

    Atomic structure and transition probabilities are fundamental physical data required in many fields of science and technology. Atomic physics codes are freely available to other community users to generate atomic data for their interest, but the quality of these data is rarely verified. This special issue addresses estimation of uncertainties in atomic structure and transition probability calculations, and discusses methods and strategies to assess and ensure the quality of theoretical atomic...

  18. A nonparametric approach to calculate critical micelle concentrations: the local polynomial regression method.

    Science.gov (United States)

    López Fontán, J L; Costa, J; Ruso, J M; Prieto, G; Sarmiento, F

    2004-02-01

    The application of a statistical method, the local polynomial regression method, (LPRM), based on a nonparametric estimation of the regression function to determine the critical micelle concentration (cmc) is presented. The method is extremely flexible because it does not impose any parametric model on the subjacent structure of the data but rather allows the data to speak for themselves. Good concordance of cmc values with those obtained by other methods was found for systems in which the variation of a measured physical property with concentration showed an abrupt change. When this variation was slow, discrepancies between the values obtained by LPRM and other methods were found.
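    A minimal sketch in the spirit of the method: smooth property-versus-concentration data with a local polynomial estimator and take the cmc at the point of maximum curvature. The Gaussian kernel, bandwidth, and breakpoint rule here are assumptions; the paper's estimator differs in detail.

```python
import numpy as np

def local_poly_smooth(x, y, x_eval, h, deg=2):
    """Local polynomial regression with a Gaussian kernel of bandwidth h."""
    y_hat = np.empty_like(x_eval, dtype=float)
    for j, x0 in enumerate(x_eval):
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)
        # polyfit applies w to unsquared residuals, so pass sqrt(kernel weights)
        coeff = np.polyfit(x - x0, y, deg, w=np.sqrt(w))
        y_hat[j] = coeff[-1]  # fitted value at x0 (constant term, data centered)
    return y_hat

# Synthetic conductivity-vs-concentration data with a slope break at c = 8 mM
rng = np.random.default_rng(2)
c = np.linspace(1, 15, 60)
kappa = np.where(c < 8, 6.0 * c, 48 + 2.5 * (c - 8)) + rng.normal(0, 0.4, c.size)

grid = np.linspace(2, 14, 400)
fit = local_poly_smooth(c, kappa, grid, h=1.0)
curvature = np.gradient(np.gradient(fit, grid), grid)
cmc = grid[np.argmax(np.abs(curvature))]
print(f"estimated cmc ~ {cmc:.2f} mM")
```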

  19. A nonparametric approach to calculate critical micelle concentrations: the local polynomial regression method

    Energy Technology Data Exchange (ETDEWEB)

    Lopez Fontan, J.L.; Costa, J.; Ruso, J.M.; Prieto, G. [Dept. of Applied Physics, Univ. of Santiago de Compostela, Santiago de Compostela (Spain); Sarmiento, F. [Dept. of Mathematics, Faculty of Informatics, Univ. of A Coruna, A Coruna (Spain)

    2004-02-01

    The application of a statistical method, the local polynomial regression method, (LPRM), based on a nonparametric estimation of the regression function to determine the critical micelle concentration (cmc) is presented. The method is extremely flexible because it does not impose any parametric model on the subjacent structure of the data but rather allows the data to speak for themselves. Good concordance of cmc values with those obtained by other methods was found for systems in which the variation of a measured physical property with concentration showed an abrupt change. When this variation was slow, discrepancies between the values obtained by LPRM and others methods were found. (orig.)

  20. Validation of multigroup neutron cross sections and calculational methods for the advanced neutron source against the FOEHN critical experiments measurements

    Energy Technology Data Exchange (ETDEWEB)

    Smith, L.A.; Gallmeier, F.X. [Oak Ridge Institute for Science and Energy, TN (United States); Gehin, J.C. [Oak Ridge National Lab., TN (United States)] [and others]

    1995-05-01

    The FOEHN critical experiment was analyzed to validate the use of multigroup cross sections and Oak Ridge National Laboratory neutronics computer codes in the design of the Advanced Neutron Source. The ANSL-V 99-group master cross section library was used for all the calculations. Three different critical configurations were evaluated using the multigroup KENO Monte Carlo transport code, the multigroup DORT discrete ordinates transport code, and the multigroup diffusion theory code VENTURE. The simple configuration consists of only the fuel and control elements with the heavy water reflector. The intermediate configuration includes boron endplates at the upper and lower edges of the fuel element. The complex configuration includes both the boron endplates and components in the reflector. Cross sections were processed using modules from the AMPX system. Both 99-group and 20-group cross sections were created and used in two-dimensional models of the FOEHN experiment. KENO calculations were performed using both 99-group and 20-group cross sections. The DORT and VENTURE calculations were performed using 20-group cross sections. Because the simple and intermediate configurations are azimuthally symmetric, these configurations can be explicitly modeled in R-Z geometry. Since the reflector components cannot be modeled explicitly using the current versions of these codes, three reflector component homogenization schemes were developed and evaluated for the complex configuration. Power density distributions were calculated with KENO using 99-group cross sections and with DORT and VENTURE using 20-group cross sections. The average differences between the measured values and the values calculated with the different computer codes range from 2.45 to 5.74%. The maximum differences between the measured and calculated thermal flux values for the simple and intermediate configurations are {approx} 13%, while the average differences are < 8%.

  1. Calculation of the Critical Speed and Stability Analysis of Cryogenic Turboexpanders with Different Structures

    Institute of Scientific and Technical Information of China (English)

    陈双涛; 赵红利; 马斌; 侯予

    2012-01-01

    A modularized code based on the Finite Element QZ (FEQZ) method is developed for a better estimate of the critical speed and more convenient rotor-dynamic stability analysis of a gas-bearing, high-speed turboexpander rotor system, with the actual structure and application conditions of a cryogenic turboexpander. This code is then validated against the experimental data of a gas-bearing turboexpander with a rotor diameter of 25 mm and a rated speed of 106,400 rpm. With this code, four rotors with different structures, available to the turboexpander, are parametrically analyzed in terms of the available speed range, vibration modes and logarithmic attenuation rate. The results suggest that the rotor with two thrust collars exhibits better performance under the designed conditions.

  2. Critical comparison of electrode models in density functional theory based quantum transport calculations.

    Science.gov (United States)

    Jacob, D; Palacios, J J

    2011-01-28

    We study the performance of two different electrode models in quantum transport calculations based on density functional theory: parametrized Bethe lattices and quasi-one-dimensional wires or nanowires. A detailed account of implementation details in both the cases is given. From the systematic study of nanocontacts made of representative metallic elements, we can conclude that the parametrized electrode models represent an excellent compromise between computational cost and electronic structure definition as long as the aim is to compare with experiments where the precise atomic structure of the electrodes is not relevant or defined with precision. The results obtained using parametrized Bethe lattices are essentially similar to the ones obtained with quasi-one-dimensional electrodes for large enough cross-sections of these, adding a natural smearing to the transmission curves that mimics the true nature of polycrystalline electrodes. The latter are more demanding from the computational point of view, but present the advantage of expanding the range of applicability of transport calculations to situations where the electrodes have a well-defined atomic structure, as is the case for carbon nanotubes, graphene nanoribbons, or semiconducting nanowires. All the analysis is done with the help of codes developed by the authors which can be found in the quantum transport toolbox ALACANT and are publicly available.

  3. Calculation of Theoretical and Empirical Nutrient N Critical Loads in the Mixed Conifer Ecosystems of Southern California

    Directory of Open Access Journals (Sweden)

    Joan Breiner

    2007-01-01

    Full Text Available Edaphic, foliar, and hydrologic forest nutrient status indicators from 15 mixed conifer forest stands in the Sierra Nevada, San Gabriel Mountains, and San Bernardino National Forest were used to estimate empirical or theoretical critical loads (CL) for nitrogen (N) as a nutrient. Soil acidification response to N deposition was also evaluated. Robust empirical relationships were found relating N deposition to plant N uptake (N in foliage), N fertility (litter C/N ratio), and soil acidification. However, no consistent empirical CL were obtained when the thresholds for parameters indicative of N excess from other types of ecosystems were used. Similarly, the highest theoretical CL for nutrient N calculated using the simple mass balance steady state model (estimates ranging from 1.4-8.8 kg N/ha/year) was approximately two times lower than the empirical observations. Further research is needed to derive the thresholds for indicators associated with the impairment of these mixed conifer forests exposed to chronic N deposition within a Mediterranean climate. Further development or parameterization of models for the calculation of theoretical critical loads suitable for these ecosystems will also be an important aspect of future critical loads research.
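    For orientation, the simple mass balance (SMB) critical load for nutrient N is commonly written as CL_nut(N) = N_i + N_u + N_le,acc / (1 - f_de), following the standard mapping-manual form; the paper's parameterization may differ. A one-liner with illustrative values:

```python
def cl_nut_n(n_immobilisation, n_uptake, n_leaching_acc, f_denitrification):
    """Simple mass balance critical load of nutrient N. Any unit works
    (e.g. kg N/ha/yr) as long as all terms share it."""
    return n_immobilisation + n_uptake + n_leaching_acc / (1.0 - f_denitrification)

# Illustrative inputs in kg N/ha/yr (not the paper's fitted parameters)
print(f"CL_nut(N) ~ {cl_nut_n(1.0, 4.0, 2.0, 0.3):.1f} kg N/ha/yr")
```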

  4. Development of Benchmark Examples for Delamination Onset and Fatigue Growth Prediction

    Science.gov (United States)

    Krueger, Ronald

    2011-01-01

    An approach for assessing the delamination propagation and growth capabilities in commercial finite element codes was developed and demonstrated for the Virtual Crack Closure Technique (VCCT) implementations in ABAQUS. The Double Cantilever Beam (DCB) specimen was chosen as an example. First, benchmark results to assess delamination propagation capabilities under static loading were created using models simulating specimens with different delamination lengths. For each delamination length modeled, the load and displacement at the load point were monitored. The mixed-mode strain energy release rate components were calculated along the delamination front across the width of the specimen. A failure index was calculated by correlating the results with the mixed-mode failure criterion of the graphite/epoxy material. The calculated critical loads and critical displacements for delamination onset for each delamination length modeled were used as a benchmark. The load/displacement relationship computed during automatic propagation should closely match the benchmark case. Second, starting from an initially straight front, the delamination was allowed to propagate based on the algorithms implemented in the commercial finite element software. The load-displacement relationship obtained from the propagation analysis results and the benchmark results were compared. Good agreements could be achieved by selecting the appropriate input parameters, which were determined in an iterative procedure.
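    A sketch of the failure-index step described above, using the Benzeggagh-Kenane (B-K) mixed-mode criterion as a stand-in for the report's graphite/epoxy criterion; the property values and energy release rates below are illustrative, not the benchmark's data.

```python
def bk_fracture_toughness(GI, GII, GIc, GIIc, eta):
    """Benzeggagh-Kenane mixed-mode fracture toughness (a common criterion;
    the report's exact criterion may differ)."""
    GT = GI + GII
    return GIc + (GIIc - GIc) * (GII / GT) ** eta if GT > 0 else GIc

# Hypothetical material properties and VCCT energy release rates at a front node
GIc, GIIc, eta = 170.0, 494.0, 1.62   # J/m^2, J/m^2, B-K exponent (assumed)
GI, GII = 120.0, 35.0                 # J/m^2, mode I and mode II components

GT = GI + GII
Gc = bk_fracture_toughness(GI, GII, GIc, GIIc, eta)
failure_index = GT / Gc               # >= 1 implies delamination onset/growth
print(f"G_T = {GT:.1f} J/m^2, G_c = {Gc:.1f} J/m^2, failure index = {failure_index:.2f}")
```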

  5. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  6. Mobile phone camera benchmarking in low light environment

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2015-01-01

    High noise values and poor signal-to-noise ratio are traditionally associated with low light imaging. Still, there are several other camera quality features which may suffer in a low light environment. For example, what happens to the color accuracy and resolution, or how does the camera speed behave in low light? Furthermore, how do low light environments affect camera benchmarking, and which metrics are the critical ones? The work contains standard-based image quality measurements including noise, color, and resolution measurements in three different light environments: 1000, 100, and 30 lux. Moreover, camera speed measurements are done. Detailed measurement results of each quality and speed category are revealed and compared. Also, a suitable benchmark algorithm is evaluated and a corresponding score is calculated to find an appropriate metric which characterizes the camera performance in different environments. The result of this work is a set of detailed image quality and camera speed measurements of mobile phone camera systems in three different light environments. The paper concludes how different light environments influence the metrics and which metrics should be measured in a low light environment. Finally, a benchmarking score is calculated using measurement data from each environment and mobile phone cameras are compared correspondingly.

  7. The Conic Benchmark Format

    DEFF Research Database (Denmark)

    Friberg, Henrik A.

    This document constitutes the technical reference manual of the Conic Benchmark Format with file extension: .cbf or .CBF. It unifies linear, second-order cone (also known as conic quadratic) and semidefinite optimization with mixed-integer variables. The format has been designed with benchmark libraries in mind, and therefore focuses on compact and easily parsable representations. The problem structure is separated from the problem data, and the format moreover facilitates benchmarking of hotstart capability through sequences of changes.

  8. Toxicological benchmarks for screening potential contaminants of concern for effects on aquatic biota: 1996 revision

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W. II [Oak Ridge National Lab., TN (United States); Tsao, C.L. [Duke Univ., Durham, NC (United States). School of the Environment

    1996-06-01

    This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. This report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate the benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility. This revision also updates benchmark values where appropriate, adds new benchmark values, replaces secondary sources with primary sources, and documents the sources and derivation of all values more completely.

  9. Verification of new model for calculation of critical strain for the initialization of dynamic recrystallization using laboratory rolling

    Directory of Open Access Journals (Sweden)

    R. Fabík

    2009-10-01

    Full Text Available This paper presents a new model for calculating the critical strain for the initiation of dynamic recrystallization. The new model reflects the history of forming in the deformation zone during rolling. In this region of restricted deformation, the strain-rate curve for the surface of the strip exhibits two peaks. These peaks are the reason why, during strip rolling, the onset of dynamic recrystallization (DRX) near the surface of the rolled part occurs later than theory predicts. The model was used in a program for simulating forming processes with the aid of FEM, and a comparison between the physical experiment and the mathematical model was drawn.

  10. A critical analysis of dipole-moment calculations as obtained from experimental and theoretical structure factors.

    Science.gov (United States)

    Poulain-Paul, Agnieszka; Nassour, Ayoub; Jelsch, Christian; Guillot, Benoit; Kubicki, Maciej; Lecomte, Claude

    2012-11-01

    Three models of charge-density distribution - Hansen-Coppens multipolar, virtual atom and kappa - of different complexities, different numbers of refined parameters, and with variable levels of restraints, were tested against theoretical and high-resolution X-ray diffraction structure factors for 2-methyl-4-nitro-1-phenyl-1H-imidazole-5-carbonitrile. The influence of the model, refinement strategy, multipole level and treatment of the H atoms on the dipole moment was investigated. The dipole moment turned out to be very sensitive to the refinement strategy. Also, small changes in H-atom treatment can greatly influence the calculated magnitude and orientation of the dipole moment. The best results were obtained when H atoms were kept in positions determined by neutron diffraction and anisotropic displacement parameters (obtained by SHADE, in this case) were used. Also, constraints on kappa values of H atoms were found to be superior to the free refinement of these parameters. It is also shown that the over-parametrization of the multipolar model, although possibly leading to better residuals, in general gives worse dipole moments.

  11. Transport calculation of neutrons leaked to the surroundings of the facilities by the JCO criticality accident in Tokai-mura.

    Science.gov (United States)

    Imanaka, T

    2001-09-01

    A transport calculation of the neutrons leaked to the environment by the JCO criticality accident was carried out based on three-dimensional geometrical models of the buildings within the JCO territory. Our work started from an initial step to simulate the leakage process of neutrons from the precipitation tank, and proceeded to a step to calculate the neutron propagation throughout the JCO facilities. The total fission number during the accident in the precipitation tank was evaluated to be 2.5 x 10^18 by comparing the calculated neutron-induced activities per 235U fission with the values measured in a stainless-steel net sample taken 2 m from the precipitation tank. Shielding effects of various structures within the JCO facilities were evaluated by comparing the present results with a previous calculation using two-dimensional models, which assumed a point source of the fission spectrum in the air above the ground without any shielding structures. The shielding effect of the precipitation tank itself was found to be a factor of 3. The shielding factor of the conversion building varied between 1.1 and 2, depending on the direction from the building. The shielding effect of the surrounding buildings within the JCO territory was between 1 and 5, also depending on the direction.
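
    The fission-number estimate rests on a one-line scaling relation; in illustrative notation (not the paper's own symbols):

        \[
          N_{\mathrm{fission}} \;=\; \frac{A_{\mathrm{measured}}}{a_{\mathrm{calc}}},
          \qquad
          a_{\mathrm{calc}} = \text{calculated induced activity per } {}^{235}\mathrm{U} \text{ fission},
        \]

    so that matching the calculated per-fission activity to the activity measured in the stainless-steel net sample fixes the total number of fissions.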

  12. Handleiding benchmark VO

    NARCIS (Netherlands)

    Blank, j.l.t.

    2008-01-01

    Research report by IPSE Studies. "Handleiding benchmark VO" (Manual for the secondary education benchmark), 25 November 2008, by J.L.T. Blank. A guide to reading the i…

  13. Benchmark af erhvervsuddannelserne

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Wittrup, Jesper

    In this working paper we discuss how the Danish vocational schools can be benchmarked, and we present the results of a number of calculation models. Benchmarking the vocational schools is conceptually complicated: the schools offer a wide range of different programmes, which makes it difficult…

  14. Benchmarking af kommunernes sagsbehandling

    DEFF Research Database (Denmark)

    Amilon, Anna

    From 2007, the National Social Appeals Board (Ankestyrelsen) is to carry out benchmarking of the quality of the municipalities' casework. The purpose of the benchmarking is to develop the design of the practice reviews with a view to better follow-up and to improve the municipalities' casework. This working paper discusses methods for benchmarking…

  15. Thermal Performance Benchmarking (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Moreno, G.

    2014-11-01

    This project will benchmark the thermal characteristics of automotive power electronics and electric motor thermal management systems. Recent vehicle systems will be benchmarked to establish baseline metrics, evaluate advantages and disadvantages of different thermal management systems, and identify areas of improvement to advance the state-of-the-art.

  16. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore alternative improvement strategies. Implementations of both a parametric and a non-parametric model are presented.
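
    A nonparametric (DEA) benchmark of this kind reduces, for each production unit, to a small linear program. The following is a minimal input-oriented CCR sketch under an assumed data layout; it is not the authors' implementation.

        import numpy as np
        from scipy.optimize import linprog

        def dea_ccr_efficiency(X, Y, o):
            """Input-oriented CCR efficiency of unit o.
            X: (n_units, n_inputs); Y: (n_units, n_outputs)."""
            n, m = X.shape
            s = Y.shape[1]
            c = np.zeros(n + 1)                # decision vector [theta, lambda_1..n]
            c[0] = 1.0                         # minimize theta
            # inputs: sum_j lambda_j * x_ij <= theta * x_io
            A_in = np.hstack([-X[o].reshape(m, 1), X.T])
            # outputs: sum_j lambda_j * y_rj >= y_ro, written as "<=" rows
            A_out = np.hstack([np.zeros((s, 1)), -Y.T])
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                          bounds=[(0, None)] * (n + 1), method="highs")
            return res.x[0]                    # efficiency score in (0, 1]

        # Three units, one input (cost) and one output (service level):
        X = np.array([[10.0], [8.0], [12.0]])
        Y = np.array([[5.0], [5.0], [4.0]])
        print([round(dea_ccr_efficiency(X, Y, o), 3) for o in range(3)])

    An efficient unit scores 1.0; the score of an inefficient unit is the factor by which its inputs could be scaled down while staying on the efficiency frontier.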

  17. Impact of modeling Choices on Inventory and In-Cask Criticality Calculations for Forsmark 3 BWR Spent Fuel

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Gonzalez, Jesus S. [Univ. Politecnica de Madrid (Spain); Ade, Brian J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gauld, Ian C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ilas, Germina [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Marshall, William BJ J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    Simulation of boiling water reactor (BWR) fuel depletion poses a challenge for nuclide inventory validation and nuclear criticality safety analyses. This challenge is due to the complex operating conditions and assembly design heterogeneities that characterize these nuclear systems. Fuel depletion simulations and in-cask criticality calculations are affected by (1) completeness of design information, (2) variability of operating conditions needed for modeling purposes, and (3) possible modeling choices. These effects must be identified, quantified, and ranked according to their significance. This paper presents an investigation of BWR fuel depletion using a complete set of actual design specifications and detailed operational data available for five operating cycles of the Swedish BWR Forsmark 3 reactor. The data includes detailed axial profiles of power, burnup, and void fraction in a very fine temporal mesh for a GE14 (10×10) fuel assembly. The specifications of this case can be used to assess the impacts of different modeling choices on inventory prediction and in-cask criticality, specifically regarding the key parameters that drive inventory and reactivity throughout fuel burnup. This study focused on the effects of the fidelity with which power history and void fraction distributions are modeled. The corresponding sensitivity of the reactivity in storage configurations is assessed, and the impacts of modeling choices on decay heat and inventory are addressed.

  18. Toward Establishing a Realistic Benchmark for Airframe Noise Research: Issues and Challenges

    Science.gov (United States)

    Khorrami, Mehdi R.

    2010-01-01

    The availability of realistic benchmark configurations is essential to enable the validation of current Computational Aeroacoustic (CAA) methodologies and to further the development of new ideas and concepts that will foster the technologies of the next generation of CAA tools. The selection of a real-world configuration, the subsequent design and fabrication of an appropriate model for testing, and the acquisition of the necessarily comprehensive aeroacoustic data base are critical steps that demand great care and attention. In this paper, a brief account of the nose landing-gear configuration, being proposed jointly by NASA and the Gulfstream Aerospace Company as an airframe noise benchmark, is provided. The underlying thought processes and the resulting building block steps that were taken during the development of this benchmark case are given. Resolution of critical, yet conflicting issues is discussed - the desire to maintain geometric fidelity versus model modifications required to accommodate instrumentation; balancing model scale size versus Reynolds number effects; and time, cost, and facility availability versus important parameters like surface finish and installation effects. The decisions taken during the experimental phase of a study can significantly affect the ability of a CAA calculation to reproduce the prevalent flow conditions and associated measurements. For the nose landing gear, the most critical of such issues are highlighted and the compromises made to resolve them are discussed. The results of these compromises will be summarized by examining the positive attributes and shortcomings of this particular benchmark case.

  19. Benchmarking expert system tools

    Science.gov (United States)

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.

  20. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W.

    1993-01-01

    Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed in which concentrations of contaminants in the environment are compared to no-observed-adverse-effects-level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest-observed-adverse-effects-level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red…

  1. Critical study of the method of calculating virgin rock stresses from measurement results of the CSIR triaxial strain cell

    Science.gov (United States)

    Vreede, F. A.

    1981-05-01

    The manual of instructions for the user of the CSIR triaxial rock stress measuring equipment is critically examined. It is shown that the values of the rock stresses can be obtained from the strain gauge records by means of explicit formulae, which makes the manual's computer program obsolete. Furthermore, statistical methods are proposed to check for faulty data and inhomogeneity in rock properties and virgin stress. The possibility of non-elastic behavior of the rock during the test is also checked. A new computer program based on the explicit functions and including the check calculations is presented. It is much more efficient than the one in the manual since it does not require computer sub-routines, allowing it to be used directly on any modern computer. The output of the new program is in a format suitable for direct inclusion in the report of an investigation using strain cell results.
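
    The explicit formulae in question invert linear elasticity. For an isotropic rock, the generic stress-strain relation (a textbook relation, not the CSIR-specific expressions) is

        \[
          \sigma_{ij} \;=\; \lambda\,\delta_{ij}\,\varepsilon_{kk} \;+\; 2\mu\,\varepsilon_{ij},
          \qquad
          \lambda = \frac{E\nu}{(1+\nu)(1-2\nu)},
          \quad
          \mu = \frac{E}{2(1+\nu)},
        \]

    and each strain gauge reading fixes one linear combination of the six independent strain components, so the virgin stresses follow from a direct linear solve rather than an iterative computer program.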

  2. GeodeticBenchmark_GEOMON

    Data.gov (United States)

    Vermont Center for Geographic Information — The GeodeticBenchmark_GEOMON data layer consists of geodetic control monuments (points) that have a known position or spatial reference. The locations of these...

  3. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  4. Benchmarking and Regulation

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques…

  5. Financial Integrity Benchmarks

    Data.gov (United States)

    City of Jackson, Mississippi — This data compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measures the City's debt ratio and bond ratings…

  6. The impact and applicability of critical experiment evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Brewer, R. [Los Alamos National Lab., NM (United States)

    1997-06-01

    This paper very briefly describes a project to evaluate previously performed critical experiments. The evaluation is intended for use by criticality safety engineers to verify calculations, and may also be used to identify data which need further investigation. The evaluation process is briefly outlined; the accepted benchmark critical experiments will be used as a standard for verification and validation. The end result of the project will be a comprehensive reference document.

  7. Mean-field calculation of critical parameters and log-periodic characterization of an aperiodic-modulated model.

    Science.gov (United States)

    Oliveira, T P; Branco, N S

    2012-01-01

    We employ a mean-field approximation to study the Ising model with aperiodic modulation of its interactions in one spatial direction. Two different values for the exchange constant, J_A and J_B, are present, according to the Fibonacci sequence. We calculate the pseudocritical temperatures for finite systems and extrapolate them to the thermodynamic limit. We explicitly obtain the exponents β, δ, and γ and, from the usual scaling relations for anisotropic models at the upper critical dimension (assumed to be 4 for the model we treat), we calculate α, ν, ν_∥, η, and η_∥. Within the framework of a renormalization-group approach, the Fibonacci sequence is a marginal one and we obtain exponents that depend on the ratio r ≡ J_B/J_A, as expected; however, the scaling relation γ = β(δ − 1) is obeyed for all values of r we studied. We characterize some thermodynamic functions as log-periodic functions of their arguments, as expected for aperiodic-modulated models, and obtain precise values for the exponents from this characterization.
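
    As a quick consistency check (standard scaling algebra, not a result of the paper), the classical mean-field exponents do satisfy the quoted relation:

        \[
          \gamma = \beta(\delta - 1):
          \qquad
          \beta = \tfrac{1}{2},\ \delta = 3
          \;\Longrightarrow\;
          \gamma = \tfrac{1}{2}(3 - 1) = 1.
        \]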

  8. Benchmarking in Foodservice Operations.

    Science.gov (United States)

    2007-11-02

    Benchmarking studies lasted from nine to twelve months, and could extend beyond that time for numerous reasons. Benchmarking was not simply data comparison, a fad, a means for reducing resources, a quick-fix program, or industrial tourism; it was a complete process.

  9. How Activists Use Benchmarks

    DEFF Research Database (Denmark)

    Seabrooke, Leonard; Wigan, Duncan

    2015-01-01

    Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views it as important; … interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained.

  10. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  11. REPROVIS-DB: a benchmark system for ligand-based virtual screening derived from reproducible prospective applications.

    Science.gov (United States)

    Ripphausen, Peter; Wassermann, Anne Mai; Bajorath, Jürgen

    2011-10-24

    Benchmark calculations are essential for the evaluation of virtual screening (VS) methods. Typically, classes of known active compounds taken from the medicinal chemistry literature are divided into reference molecules (search templates) and potential hits that are added to background databases assumed to consist of compounds not sharing this activity. Then VS calculations are carried out, and the recall of known active compounds is determined. However, conventional benchmarking is affected by a number of problems that reduce its value for method evaluation. In addition to often insufficient statistical validation and the lack of generally accepted evaluation standards, the artificial nature of typical benchmark settings is often criticized. Retrospective benchmark calculations generally overestimate the potential of VS methods and do not scale with their performance in prospective applications. In order to provide additional opportunities for benchmarking that more closely resemble practical VS conditions, we have designed a publicly available compound database (DB) of reproducible virtual screens (REPROVIS-DB) that organizes information from successful ligand-based VS applications including reference compounds, screening databases, compound selection criteria, and experimentally confirmed hits. Using the currently available 25 hand-selected compound data sets, one can attempt to reproduce successful virtual screens with other than the originally applied methods and assess their potential for practical applications.
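
    The recall statistic at the heart of such benchmark calculations is straightforward once the database has been ranked by the VS method. A minimal sketch, with hypothetical compound identifiers rather than REPROVIS-DB's actual data layout:

        # Recall of known actives among the top-k ranked database compounds.
        # Compound identifiers and the ranking are hypothetical.
        def recall_at_k(ranked_ids, active_ids, k):
            """Fraction of known actives recovered in the top k of a ranking."""
            hits = sum(1 for cid in ranked_ids[:k] if cid in active_ids)
            return hits / len(active_ids)

        ranked = ["c7", "c2", "c9", "c1", "c5", "c3"]   # best-scored first
        actives = {"c2", "c5", "c8"}
        print(recall_at_k(ranked, actives, k=5))        # 2 of 3 recovered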

  12. Benchmarking 232Th Evaluations With KBR and THOR Experiments

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The n+232Th evaluations from CENDL-3.1, ENDF/B-VII.0, JENDL-3.3 and JENDL-4.0 were tested against the KBR series and the THOR benchmark from the ICSBEP Handbook. THOR is a Plutonium-Metal-Fast (PMF) criticality benchmark reflected with thorium metal.

  13. A framework of benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-02-01

    Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.
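
    One common way to realize such a scoring system is to normalize the data-model mismatch for each process and combine the results with weights. The sketch below is a generic illustration under assumed variable names, weights, and normalization; it is not the metric prescribed by the framework.

        import numpy as np

        def nrmse(model, obs):
            """RMSE normalized by the standard deviation of the observations."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            return np.sqrt(np.mean((model - obs) ** 2)) / np.std(obs)

        def combined_score(errors, weights):
            """Map each normalized error to a [0, 1] skill, then weighted mean."""
            skills = {v: 1.0 / (1.0 + e) for v, e in errors.items()}
            total = sum(weights.values())
            return sum(weights[v] * skills[v] for v in skills) / total

        errors = {"gpp": nrmse([2.1, 2.4, 2.0], [2.0, 2.6, 1.9]),
                  "et":  nrmse([1.1, 0.9, 1.3], [1.0, 1.0, 1.2])}
        print(combined_score(errors, {"gpp": 0.6, "et": 0.4}))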

  14. A framework of benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-02-01

    Full Text Available Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.

  15. A framework for benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J. T.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J. B.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-10-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models

  16. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Full Text Available Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties…

  17. Benchmarking File System Benchmarking: It *IS* Rocket Science

    OpenAIRE

    Seltzer, Margo I.; Tarasov, Vasily; Bhanage, Saumitra; Zadok, Erez

    2011-01-01

    The quality of file system benchmarking has not improved in over a decade of intense research spanning hundreds of publications. Researchers repeatedly use a wide range of poorly designed benchmarks, and in most cases, develop their own ad-hoc benchmarks. Our community lacks a definition of what we want to benchmark in a file system. We propose several dimensions of file system benchmarking and review the wide range of tools and techniques in widespread use. We experimentally show that even t...

  18. The KMAT: Benchmarking Knowledge Management.

    Science.gov (United States)

    de Jager, Martha

    Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…

  19. Benchmarking in Mobarakeh Steel Company

    OpenAIRE

    Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati

    2008-01-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...

  20. Benchmarking in Mobarakeh Steel Company

    Directory of Open Access Journals (Sweden)

    Sasan Ghasemi

    2008-05-01

    Full Text Available Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.

  1. Effect of subchronic 2,3,7,8-tetrachlorodibenzo-p-dioxin exposure on immune system and target gene responses in mice: calculation of benchmark doses for CYP1A1 and CYP1A2 related enzyme activities

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, C. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Toxicology, Auf'm Hennekamp 50, D-40225 Duesseldorf (Germany)]; Donat, S. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Toxicology, Auf'm Hennekamp 50, D-40225 Duesseldorf (Germany)]; Doehr, O. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Toxicology, Auf'm Hennekamp 50, D-40225 Duesseldorf (Germany)]; Kremer, J. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Immunology, Auf'm Hennekamp 50, D-40225 Duesseldorf (Germany)]; Esser, C. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Immunology, Auf'm Hennekamp 50, D-40225 Duesseldorf (Germany)]; Roller, M. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Experimental Hygiene, Auf'm Hennekamp 50, D-40225 Duesseldorf (Germany)]; Abel, J. [Medical Institute of Environmental Hygiene at the Heinrich Heine University of Duesseldorf, Division of Toxicology, Auf'm Hennekamp 50, D-40225 Duesseldorf (Germany)]

    1997-04-01

    The dose-effect relationships were analysed for several noncarcinogenic endpoints, such as immunological and biochemical responses, at subchronic, low-dose exposure of female C57BL/6 mice to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD). The animals were treated i.p. with TCDD according to the initial- and maintenance-dose principle for a period of 135 days. The initial doses were 1, 10 and 100 ng TCDD/kg; the weekly maintenance doses were 0.2, 2 and 20 ng TCDD/kg, respectively. On days 23, 79 and 135 of TCDD treatment, 10 animals of each dose group were killed. As immunological parameters, the number of thymocytes and the pattern of thymocyte subpopulations were determined. In liver, lung and thymus, mRNA expression of TGF-α, TGF-β1, TGF-β2, TGF-β3, TNF-α, IL-1β and different CYP1 isoforms (CYP1A1, CYP1A2, CYP1B1) was analysed. In the livers, activities of 7-ethoxyresorufin-O-deethylase (EROD) and 7-methoxyresorufin-O-demethylase (MROD) were measured. The TCDD content of the liver was determined. The main results are summarized as follows: (1) The TCDD doses were not sufficient to elicit dose-dependent changes in the pattern of thymocyte subpopulations. (2) TCDD failed to change the mRNA expression of TGF-α, TGF-β and TNF-α, but led to an increase of IL-1β mRNA expression in liver, lung and thymus. The results show that the TCDD-induced IL-1β mRNA increase is at least as sensitive a marker as the induction of CYP1A isoforms. (3) The expression of CYP1B1 mRNA remained unchanged at the doses tested, while CYP1A1 and CYP1A2 mRNA expression was dose-dependently enhanced. EROD and MROD activities in the liver paralleled the increases in CYP1A1 and CYP1A2 mRNA expression. (4) Regression analysis of the data showed that most of the parameters tested fit a linear model. (5) From the data, a benchmark dose for EROD/MROD activities in the livers of female C57BL/6 mice of about 0.03 ng TCDD/kg per day was calculated.
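
    For a response that fits a linear model, the benchmark dose calculation reduces to inverting the fitted line at a chosen benchmark response (BMR). The sketch below illustrates the arithmetic with made-up example numbers, not the study's data.

        import numpy as np

        # Benchmark dose from a linear dose-response fit: the dose at which the
        # fitted response exceeds the control level by the benchmark response.
        doses = np.array([0.0, 0.2, 2.0, 20.0])    # ng TCDD/kg per day
        erod = np.array([1.0, 1.6, 7.5, 64.0])     # relative EROD activity

        slope, intercept = np.polyfit(doses, erod, 1)  # y = slope*x + intercept
        bmr = 0.1 * intercept                          # e.g., 10% over control
        bmd = bmr / slope
        print(f"BMD = {bmd:.3g} ng/kg per day")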

  2. PNNL Information Technology Benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    DD Hostetler

    1999-09-08

    Benchmarking is a methodology for searching out industry best practices that lead to superior performance. It involves exchanging information, not just with any organization, but with organizations known to be the best, whether within PNNL, in industry, or in dissimilar industries with equivalent functions. It is used as a continuous improvement tool for business and technical processes, products, and services. Information technology--comprising all computer and electronic communication products and services--underpins the development and/or delivery of many PNNL products and services. This document describes the Pacific Northwest National Laboratory's (PNNL's) approach to information technology (IT) benchmarking. The purpose is to engage other organizations in the collaborative process of benchmarking in order to improve the value of IT services provided to customers. The document's intended audience consists of other US Department of Energy (DOE) national laboratories and their IT staff. Although the individual participants must define the scope of collaborative benchmarking, an outline of IT service areas for possible benchmarking is described.

  3. Evaluation of PWR and BWR pin cell benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Pijlgroms, B.J.; Gruppelaar, H.; Janssen, A.J. (Netherlands Energy Research Foundation (ECN), Petten (Netherlands)); Hoogenboom, J.E.; Leege, P.F.A. de (Interuniversitair Reactor Inst., Delft (Netherlands)); Voet, J. van der (Gemeenschappelijke Kernenergiecentrale Nederland NV, Dodewaard (Netherlands)); Verhagen, F.C.M. (Keuring van Electrotechnische Materialen NV, Arnhem (Netherlands))

    1991-12-01

    Benchmark results of the Dutch PINK working group on the PWR and BWR pin cell calculational benchmarks as defined by EPRI are presented and evaluated. The observed discrepancies are problem dependent: some of the results are satisfactory, while others require further analysis. A brief overview is given of the different code packages used in this analysis. (author). 14 refs., 9 figs., 30 tabs.

  4. Calculation of critical loads for cadmium, lead and mercury; background document to a mapping manual on critical loads of cadmium, lead and mercury

    NARCIS (Netherlands)

    Vries, de W.; Schütze, G.; Lofts, S.; Tipping, E.; Meili, M.; Römkens, P.F.A.M.; Groenenberg, J.E.

    2005-01-01

    This report on heavy metals provides up-to-date methodologies to derive critical loads for the heavy metals cadmium (Cd), lead (Pb) and mercury (Hg) for both terrestrial and aquatic ecosystems. It presents background information to a Manual on Critical Loads for those metals. Focus is given to the m…

  5. Benchmarking for Best Practice

    CERN Document Server

    Zairi, Mohamed

    1998-01-01

    Benchmarking for Best Practice uses up-to-the-minute case studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. It is also an ideal textbook on the applications of TQM, since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area l…

  6. Benchmarking Danish Industries

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette

    2003-01-01

    … compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless not public. The survey is a cooperative project, "Benchmarking Danish Industries", with CIP/Aalborg University, the Danish Technological University, the Danish Technological Institute and Copenhagen Business School as consortium partners. The project has been funded by the Danish Agency for Trade and Industry…

  7. Deviating From the Benchmarks

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela

    … survival? The analysis is based on a matched employer-employee dataset and covers about 17,500 startups in manufacturing and services. We adopt a new procedure to estimate individual benchmarks for the quantity and quality of initial human resources, acknowledging correlations between hiring decisions … the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be close to irreversible…

  8. Literature research concerning alternative methods for validation of criticality calculation systems; Literaturrecherche zu alternativen Daten und Methoden zur Validierung von Kritikalitaetsrechensystemen

    Energy Technology Data Exchange (ETDEWEB)

    Behler, Matthias

    2016-05-15

    Besides radiochemical analyses of irradiated fuel and critical experiments, which have become a well-established basis for the validation of depletion codes and criticality codes respectively, the results of oscillation experiments and the operating conditions of power reactors and research reactors can also provide useful information for the validation of the above-mentioned codes. Based on a literature review, the potential of utilizing oscillation-experiment measurements for the validation of criticality codes is estimated. It is found that the reactivity measurements for actinides and fission products within the CERES program on the reactors DIMPLE (Winfrith, UK) and MINERVE (Cadarache, France) can provide a valuable addition to the commonly used critical experiments for criticality code validation. However, while approaches exist, there is as yet no generally satisfactory solution for integrating the reactivity measurements into a quantitative bias determination for the neutron multiplication factor of typical application cases, including irradiated spent fuel outside reactor cores, calculated using common criticality codes.

  9. DWEB: A Data Warehouse Engineering Benchmark

    CERN Document Server

    Darmont, Jérôme; Boussaïd, Omar

    2005-01-01

    Data warehouse architectural choices and optimization techniques are critical to decision support query performance. To facilitate these choices, the performance of the designed data warehouse must be assessed. This is usually done with the help of benchmarks, which can either help system users comparing the performances of different systems, or help system engineers testing the effect of various design choices. While the TPC standard decision support benchmarks address the first point, they are not tuneable enough to address the second one and fail to model different data warehouse schemas. By contrast, our Data Warehouse Engineering Benchmark (DWEB) allows to generate various ad-hoc synthetic data warehouses and workloads. DWEB is fully parameterized to fulfill data warehouse design needs. However, two levels of parameterization keep it relatively easy to tune. Finally, DWEB is implemented as a Java free software that can be interfaced with most existing relational database management systems. A sample usag...

  10. OECD/NEA International Benchmark exercises: Validation of CFD codes applied to the nuclear industry; OECD/NEA International Benchmark exercises: La validacion de los codigos CFD aplicados a la industria nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Pena-Monferrer, C.; Miquel veyrat, A.; Munoz-Cobo, J. L.; Chiva Vicent, S.

    2016-08-01

    In recent years, owing among other factors to the slowdown of the nuclear industry, investment in the development and validation of CFD codes applied specifically to the problems of the nuclear industry has been seriously hampered. Thus, the International Benchmark Exercises (IBE) sponsored by the OECD/NEA have been fundamental for analyzing the use of CFD codes in the nuclear industry, because although these codes are mature in many fields, doubts still exist about them in critical aspects of thermal-hydraulic calculations, even in single-phase scenarios. The Polytechnic University of Valencia (UPV) and the Universitat Jaume I (UJI), sponsored by the Nuclear Safety Council (CSN), have actively participated in all the benchmarks proposed by the NEA, as well as in the expert meetings. In this paper, a summary of the participation in the various IBEs is given, describing each benchmark, the CFD model created for it, and the main conclusions. (Author)

  11. Benchmark Data Through The International Reactor Physics Experiment Evaluation Project (IRPHEP)

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Dr. Enrico Sartori

    2005-09-01

    The International Reactor Physics Experiments Evaluation Project (IRPhEP) was initiated by the Organization for Economic Cooperation and Development (OECD) Nuclear Energy Agency’s (NEA) Nuclear Science Committee (NSC) in June of 2002. The IRPhEP focus is on the derivation of internationally peer reviewed benchmark models for several types of integral measurements, in addition to the critical configuration. While the benchmarks produced by the IRPhEP are of primary interest to the Reactor Physics Community, many of the benchmarks can be of significant value to the Criticality Safety and Nuclear Data Communities. Benchmarks that support the Next Generation Nuclear Plant (NGNP), for example, also support fuel manufacture, handling, transportation, and storage activities and could challenge current analytical methods. The IRPhEP is patterned after the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and is closely coordinated with the ICSBEP. This paper highlights the benchmarks that are currently being prepared by the IRPhEP that are also of interest to the Criticality Safety Community. The different types of measurements and associated benchmarks that can be expected in the first publication and beyond are described. The protocol for inclusion of IRPhEP benchmarks as ICSBEP benchmarks and for inclusion of ICSBEP benchmarks as IRPhEP benchmarks is detailed. The format for IRPhEP benchmark evaluations is described as an extension of the ICSBEP format. Benchmarks produced by the IRPhEP add new dimension to criticality safety benchmarking efforts and expand the collection of available integral benchmarks for nuclear data testing. The first publication of the "International Handbook of Evaluated Reactor Physics Benchmark Experiments" is scheduled for January of 2006.

  12. Benchmarking of Heavy Ion Transport Codes

    Energy Technology Data Exchange (ETDEWEB)

    Remec, Igor [ORNL; Ronningen, Reginald M. [Michigan State University, East Lansing; Heilbronn, Lawrence [University of Tennessee, Knoxville (UTK)

    2011-01-01

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in designing and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  13. Benchmarks: WICHE Region 2012

    Science.gov (United States)

    Western Interstate Commission for Higher Education, 2013

    2013-01-01

    Benchmarks: WICHE Region 2012 presents information on the West's progress in improving access to, success in, and financing of higher education. The information is updated annually to monitor change over time and encourage its use as a tool for informed discussion in policy and education communities. To establish a general context for the…

  14. HPCS HPCchallenge Benchmark Suite

    Science.gov (United States)

    2007-11-02

    Measured HPCchallenge Benchmark performance on various HPC architectures — from Cray X1s to Beowulf clusters — is reported in the presentation and paper, using the updated results at http://icl.cs.utk.edu/hpcc/hpcc_results.cgi Even a small percentage of random…

  15. Surveys and Benchmarks

    Science.gov (United States)

    Bers, Trudy

    2012-01-01

    Surveys and benchmarks continue to grow in importance for community colleges in response to several factors. One is the press for accountability, that is, for colleges to report the outcomes of their programs and services to demonstrate their quality and prudent use of resources, primarily to external constituents and governing boards at the state…

  16. Calculations to compare different ways of modelling the plate geometry cells of the Zebra fast critical assembly, MZA

    Energy Technology Data Exchange (ETDEWEB)

    Rowlands, John, E-mail: rowlandsjl@aol.com

    2009-03-15

    The core region cells of the Zebra fast critical assembly MZA comprise 14 plates in a square steel tube, with 12 cells being stacked axially to form the core section of the assembly. (The word core is used to describe both the region of a plate containing the main material, such as plutonium, UO2 or sodium, and the region of the assembly containing fissile material cells.) The cells can be modelled in different levels of detail, ranging from a three-dimensional representation in which the core and canning regions of the plates and the void gaps between the edges of the plates and the steel tube, and between tubes, are represented. Simplified models include a three-dimensional representation in which the void regions are combined with the tube material. A further simplified three-dimensional model, called the MURAL model, represents the core regions of the plates but homogenises the canning, tube material and void regions. Two types of one-dimensional slab geometry model are found in the literature, one in which the materials are homogenised within each of the three axial slab regions of a canned plate (plate core and upper and lower canning regions) and a further simplified version in which the plate is modelled as a single region, the compositions being averaged over the whole thickness of the plate, comprising the plate core material, the canning and the tube material. MONK Monte Carlo calculations have been made for each of these models, and also for the fully homogenised cells, and the k-effective values, core sodium void reactivities and reaction rate ratios are compared.

  17. Full sphere hydrodynamic and dynamo benchmarks

    KAUST Repository

    Marti, P.

    2014-01-26

    Convection in planetary cores can generate fluid flow and magnetic fields, and a number of sophisticated codes exist to simulate the dynamic behaviour of such systems. We report on the first community activity to compare numerical results of computer codes designed to calculate fluid flow within a whole sphere. The flows are incompressible and rapidly rotating and the forcing of the flow is either due to thermal convection or due to moving boundaries. All problems defined have solutions that allow easy comparison, since they are either steady, slowly drifting or perfectly periodic. The first two benchmarks are defined based on uniform internal heating within the sphere under the Boussinesq approximation with boundary conditions that are uniform in temperature and stress-free for the flow. Benchmark 1 is purely hydrodynamic, and has a drifting solution. Benchmark 2 is a magnetohydrodynamic benchmark that can generate oscillatory, purely periodic, flows and magnetic fields. In contrast, Benchmark 3 is a hydrodynamic rotating bubble benchmark using no slip boundary conditions that has a stationary solution. Results from a variety of types of code are reported, including codes that are fully spectral (based on spherical harmonic expansions in angular coordinates and polynomial expansions in radius), mixed spectral and finite difference, finite volume, finite element and also a mixed Fourier-finite element code. There is good agreement between codes. It is found that in Benchmarks 1 and 2, the approximation of a whole sphere problem by a domain that is a spherical shell (a sphere possessing an inner core) does not represent an adequate approximation to the system, since the results differ from whole sphere results.

  18. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared performance of software-only to GPU-accelerated. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows

  19. Benchmarking i den offentlige sektor

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels

    2008-01-01

    In this article we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then explain in more detail what benchmarking is, based on four different applications of benchmarking. The regulation of utility companies is discussed, after which…

  20. Benchmarking of LSTM Networks

    OpenAIRE

    Breuel, Thomas M.

    2015-01-01

    LSTM (Long Short-Term Memory) recurrent neural networks have been highly successful in a number of application areas. This technical report describes the use of the MNIST and UW3 databases for benchmarking LSTM networks and explores the effect of different architectural and hyperparameter choices on performance. Significant findings include: (1) LSTM performance depends smoothly on learning rates, (2) batching and momentum have no significant effect on performance, (3) softmax training outperf...
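
    As a concrete illustration of the kind of setup being benchmarked, the sketch below (assuming PyTorch, not the author's own framework) treats each 28x28 MNIST image as a 28-step sequence of rows, the usual way MNIST is fed to an LSTM, and uses random tensors in place of the real dataset so it runs self-contained; learning rate and momentum are exposed as the knobs such a benchmark would sweep.

      import torch
      import torch.nn as nn

      class SeqClassifier(nn.Module):
          """LSTM over the 28 rows of an image, then a linear softmax readout."""
          def __init__(self, hidden=128, classes=10):
              super().__init__()
              self.lstm = nn.LSTM(input_size=28, hidden_size=hidden, batch_first=True)
              self.head = nn.Linear(hidden, classes)

          def forward(self, x):                 # x: (batch, 28 steps, 28 features)
              out, _ = self.lstm(x)
              return self.head(out[:, -1, :])   # logits from the last time step

      model = SeqClassifier()
      opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.0)
      loss_fn = nn.CrossEntropyLoss()           # cross-entropy = softmax training

      x = torch.randn(64, 28, 28)               # stand-in for an MNIST batch
      y = torch.randint(0, 10, (64,))
      for _ in range(10):                       # a benchmark would sweep lr/momentum here
          opt.zero_grad()
          loss = loss_fn(model(x), y)
          loss.backward()
          opt.step()
      print(float(loss))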

  1. A Benchmarking System for Domestic Water Use

    Directory of Open Access Journals (Sweden)

    Dexter V. L. Hunt

    2014-05-01

    The national demand for water in the UK is predicted to increase, exacerbated by a growing UK population, and home-grown demands for energy and food. When set against the context of overstretched existing supply sources vulnerable to droughts, particularly in increasingly dense city centres, the delicate balance of matching minimal demands with resource-secure supplies becomes critical. When making changes to "internal" demands, the role of technological efficiency and user behaviour cannot be ignored, yet existing benchmarking systems traditionally do not consider the latter. This paper investigates the practicalities of adopting a domestic benchmarking system (using a band rating) that allows individual users to assess their current water use performance against what is possible. The benchmarking system allows users to achieve higher benchmarks through any approach that reduces water consumption. The sensitivity of water use benchmarks is investigated by making changes to user behaviour and technology. The impact of adopting localised supplies (i.e., rainwater harvesting (RWH) and grey water (GW)) and of including "external" gardening demands is investigated. This includes the impacts (in isolation and combination) of the following: occupancy rates (1 to 4); roof size (12.5 m2 to 100 m2); garden size (25 m2 to 100 m2) and geographical location (North West, Midlands and South East, UK) with yearly temporal effects (i.e., rainfall and temperature). Lessons learnt from analysis of the proposed benchmarking system are presented throughout this paper, in particular its compatibility with the existing Code for Sustainable Homes (CSH) accreditation system. Conclusions are subsequently drawn on the robustness of the proposed system.
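
    To make the band-rating idea concrete, the sketch below maps per-person daily consumption to a band, with localised supplies (RWH/GW) entering as an offset; every figure, threshold and band label is invented for illustration and does not reproduce the paper's calibration or its CSH alignment.

      # Hypothetical per-person indoor demands in litres/person/day.
      INDOOR = {"shower": 35.0, "toilet": 30.0, "taps": 20.0, "washing": 15.0}
      GARDEN_TOTAL = 30.0          # litres/day for the whole household (invented)
      BANDS = [(80, "A"), (100, "B"), (120, "C"), (150, "D")]  # invented thresholds

      def water_benchmark(occupancy=2, rwh_gw_offset=0.0):
          """Return (litres/person/day, band); RWH/GW supply offsets the demand."""
          per_person = sum(INDOOR.values()) + GARDEN_TOTAL / occupancy - rwh_gw_offset
          per_person = max(per_person, 0.0)
          for limit, band in BANDS:
              if per_person <= limit:
                  return per_person, band
          return per_person, "E"

      # Higher occupancy spreads the garden demand; harvesting improves the band.
      print(water_benchmark(occupancy=3, rwh_gw_offset=12.0))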

  2. IAEA coordinated research project (CRP) on 'Analytical and experimental benchmark analyses of accelerator driven systems'

    Energy Technology Data Exchange (ETDEWEB)

    Abanades, Alberto [Universidad Politecnica de Madrid (Spain); Aliberti, Gerardo; Gohar, Yousry; Talamo, Alberto [ANL, Argonne (United States); Bornos, Victor; Kiyavitskaya, Anna [Joint Institute of Power Eng. and Nucl. Research ' Sosny' , Minsk (Belarus); Carta, Mario [ENEA, Casaccia (Italy); Janczyszyn, Jerzy [AGH-University of Science and Technology, Krakow (Poland); Maiorino, Jose [IPEN, Sao Paulo (Brazil); Pyeon, Cheolho [Kyoto University (Japan); Stanculescu, Alexander [IAEA, Vienna (Austria); Titarenko, Yury [ITEP, Moscow (Russian Federation); Westmeier, Wolfram [Wolfram Westmeier GmbH, Ebsdorfergrund (Germany)

    2008-07-01

    In December 2005, the International Atomic Energy Agency (IAEA) started a Coordinated Research Project (CRP) on 'Analytical and Experimental Benchmark Analyses of Accelerator Driven Systems'. The overall objective of the CRP, performed within the framework of the Technical Working Group on Fast Reactors (TWGFR) of IAEA's Nuclear Energy Department, is to increase the capability of interested Member States in developing and applying advanced reactor technologies in the area of long-lived radioactive waste utilization and transmutation. The specific objective of the CRP is to improve the present understanding of the coupling of an external neutron source (e.g. spallation source) with a multiplicative sub-critical core. The participants are performing computational and experimental benchmark analyses using integrated calculation schemes and simulation methods. The CRP aims at integrating some of the planned experimental demonstration projects of the coupling between a sub-critical core and an external neutron source (e.g. YALINA Booster in Belarus, and Kyoto University's Critical Assembly (KUCA)). The objective of these experimental programs is to validate computational methods, obtain high energy nuclear data, characterize the performance of sub-critical assemblies driven by external sources, and to develop and improve techniques for sub-criticality monitoring. The paper summarizes preliminary results obtained to date for some of the CRP benchmarks. (authors)

  3. Benchmark dose of lead inducing anemia at the workplace.

    Science.gov (United States)

    Karita, Kanae; Yano, Eiji; Dakeishi, Miwako; Iwata, Toyoto; Murata, Katsuyuki

    2005-08-01

    To estimate the critical dose of lead inducing anemia in humans, the effects of lead on hemoglobin (Hb) and hematocrit (Hct) levels and red blood cell (RBC) count were examined in 388 male lead-exposed workers with blood lead (BPb) levels of 0.05-5.5 (mean 1.3) micromol/L by using the benchmark dose (BMD) approach. The BPb level was significantly related to Hb (regression coefficient beta = -0.276), RBC (beta = -11.35), and Hct (beta = -0.563) among the workers; the BPb level was significantly higher in the workers with anemia (1.85 micromol/L), based on the WHO criteria, than in those without anemia (1.26 micromol/L). The benchmark dose levels of BPb (i.e., lower 95% confidence limits of BMD), calculated from the K-power model set at an abnormal probability of 5% in unexposed workers and an excess risk of 5% in exposed workers, were estimated to be 0.94 micromol/L (19.5 microg/dl) for Hb, 0.94 micromol/L (19.4 microg/dl) for RBC, and 1.43 micromol/L (29.6 microg/dl) for Hct. These findings suggest that reduction in hematopoietic indicators may be initiated at BPb levels below the level currently considered without effect.
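
    The hybrid BMD logic in the abstract can be illustrated with a deliberately simplified linear dose-response in place of the K-power model: fix a cutoff so that 5% of unexposed workers fall in the abnormal tail, then solve for the blood lead at which that probability rises by a further 5%. The data and model below are schematic stand-ins, not the study's analysis.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      bpb = rng.uniform(0.05, 5.5, 388)                  # blood lead, micromol/L
      hb = 15.0 - 0.276 * bpb + rng.normal(0, 1.0, 388)  # toy Hb using the reported slope

      b, a = np.polyfit(bpb, hb, 1)                      # fit Hb = a + b*BPb
      s = np.std(hb - (a + b * bpb), ddof=2)             # residual SD

      # Cutoff: 5% abnormal at zero exposure; BMD: abnormal probability 5% + 5%.
      cutoff = a + stats.norm.ppf(0.05) * s
      bmd = (cutoff - a - stats.norm.ppf(0.10) * s) / b
      print(f"BMD ~ {bmd:.2f} micromol/L (illustrative only)")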

  4. Nuclear criticality safety experiments, calculations, and analyses - 1958 to 1982. Volume 2. Summaries. Compilation of papers from the Transactions of the American Nuclear Society

    Energy Technology Data Exchange (ETDEWEB)

    Koponen, B.L.; Hampel, V.E.

    1982-10-21

    This compilation contains 688 complete summaries of papers on nuclear criticality safety as presented at meetings of the American Nuclear Society (ANS). The selected papers contain criticality parameters for fissile materials derived from experiments and calculations, as well as criticality safety analyses for fissile material processing, transport, and storage. The compilation was developed as a component of the Nuclear Criticality Information System (NCIS) now under development at the Lawrence Livermore National Laboratory. The compilation is presented in two volumes: Volume 1 contains a directory to the ANS Transaction volume and page number where each summary was originally published, the author concordance, and the subject concordance derived from the keyphrases in titles. Volume 2 contains, in chronological order, the full-text summaries, reproduced here by permission of the American Nuclear Society from their Transactions, volumes 1-41.

  5. Monte Carlo and deterministic simulations of activation ratio experiments for 238U(n,f), 238U(n,g) and 238U(n,2n) in the Big Ten benchmark critical assembly

    Energy Technology Data Exchange (ETDEWEB)

    Descalle, M; Clouse, C; Pruet, J

    2009-07-28

    The authors have compared calculations of critical assembly activation ratios using 3 different Monte Carlo codes and one deterministic code. There is excellent agreement. Discrepancies between the different Monte Carlo codes are at the 1-2% level. Notably, the deterministic calculations with 87 groups are also in good agreement with the continuous energy Monte Carlo results. The three codes underestimate the {sup 238}U(n,f) reaction, suggesting that there is room for improvement in the evaluation, or in the evaluations of other reactions influencing the spectrum in BigTen. Until statistical uncertainties are implemented in Mercury, they strongly advise long runs to guarantee sufficient convergence of the flux at high energies, and they strongly encourage comparing Mercury results to a well-developed and documented code such as MCNP5 and/or COG. It may be that ENDL2008 will be available for use in COG within a year. Finally, it may be worthwhile to add a 'standard' reaction rate tally similar to those implemented in COG and MCNP5, if the goal is to expand the central fission and activation ratios simulations to include isotopes that are not part of the specifications for the assembly material composition.
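
    For readers unfamiliar with the quantity being compared: a central activation ratio is a one-group collapse of two reaction rates over the local spectrum. The arithmetic, with made-up group fluxes and cross sections standing in for the evaluated data and the Big Ten spectrum, is simply:

      import numpy as np

      # Invented 4-group values purely to show the arithmetic (not evaluated data).
      flux        = np.array([0.10, 0.30, 0.40, 0.20])  # group fluxes, arbitrary units
      sig_u238_nf = np.array([0.00, 0.01, 0.30, 0.55])  # 238U(n,f), threshold reaction
      sig_u235_nf = np.array([30.0, 4.00, 1.50, 1.20])  # 235U(n,f), reference

      def activation_ratio(sig_x, sig_ref, phi):
          """Spectrum-averaged reaction-rate ratio <sig_x.phi> / <sig_ref.phi>."""
          return np.dot(sig_x, phi) / np.dot(sig_ref, phi)

      print(activation_ratio(sig_u238_nf, sig_u235_nf, flux))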

  6. KENO-VI Primer: A Primer for Criticality Calculations with SCALE/KENO-VI Using GeeWiz

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, Stephen M [ORNL

    2008-09-01

    The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory is widely used and accepted around the world for criticality safety analyses. The well-known KENO-VI three-dimensional Monte Carlo criticality computer code is one of the primary criticality safety analysis tools in SCALE. The KENO-VI primer is designed to help a new user understand and use the SCALE/KENO-VI Monte Carlo code for nuclear criticality safety analyses. It assumes that the user has a college education in a technical field. There is no assumption of familiarity with Monte Carlo codes in general or with SCALE/KENO-VI in particular. The primer is designed to teach by example, with each example illustrating two or three features of SCALE/KENO-VI that are useful in criticality analyses. The primer is based on SCALE 6, which includes the Graphically Enhanced Editing Wizard (GeeWiz) Windows user interface. Each example uses GeeWiz to provide the framework for preparing input data and viewing output results. Starting with a Quickstart section, the primer gives an overview of the basic requirements for SCALE/KENO-VI input and allows the user to quickly run a simple criticality problem with SCALE/KENO-VI. The sections that follow Quickstart include a list of basic objectives at the beginning that identifies the goal of the section and the individual SCALE/KENO-VI features that are covered in detail in the sample problems in that section. Upon completion of the primer, a new user should be comfortable using GeeWiz to set up criticality problems in SCALE/KENO-VI. The primer provides a starting point for the criticality safety analyst who uses SCALE/KENO-VI. Complete descriptions are provided in the SCALE/KENO-VI manual. Although the primer is self-contained, it is intended as a companion volume to the SCALE/KENO-VI documentation. (The SCALE manual is provided on the SCALE installation DVD.) The primer provides specific examples of

  7. Toward a molecular theory of homogeneous bubble nucleation: II. Calculation of the number density of critical nuclei and the rate of nucleation.

    Science.gov (United States)

    Torabi, Korosh; Corti, David S

    2013-10-17

    In the present paper, we develop a method to calculate the rate of homogeneous bubble nucleation within a superheated L-J liquid based on the (n,v) equilibrium embryo free energy surface introduced in the first paper (DOI: 10.1021/jp404149n). We express the nucleation rate as the product of the concentration of critical nuclei within the metastable liquid phase and the relevant forward rate coefficient. We calculate the forward rate coefficient of the critical nuclei from their average lifetime as determined from MD simulations of a large number of embryo trajectories initiated from the transitional region of the metastable liquid configuration space. Therefore, the proposed rate coefficient does not rely on any predefined reaction coordinate. In our model, the critical nuclei belong to the region of the configuration space where the committor probability is about one-half, guaranteeing the dynamical relevance of the proposed embryos. One novel characteristic of our approach is that we define a limit for the configuration space of the equilibrium metastable phase and do not include the configurations that have zero committor probability in the nucleation free energy surface. Furthermore, in order to take into account the transitional degrees of freedom of the critical nuclei, we develop a simulation-based approach for rigorously mapping the free energy of the (n,v) equilibrium embryos to the concentration of the critical nuclei within the bulk metastable liquid phase.
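
    Schematically, the rate expression the authors assemble is the product described above; in our notation (the exponential form for the concentration is the standard quasi-equilibrium expression, quoted only for orientation):

      J = C^{*}\, k_{f}, \qquad C^{*} \simeq \rho_{\mathrm{liq}}\, e^{-\beta \Delta G^{*}}, \qquad k_{f} \sim \langle \tau^{*} \rangle^{-1},

    where J is the nucleation rate per unit volume, C^{*} the concentration of critical (n,v) nuclei in the metastable liquid, and k_{f} the forward rate coefficient extracted from the average lifetime of critical embryos in the MD trajectories.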

  8. Benchmarking: applications to transfusion medicine.

    Science.gov (United States)

    Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M

    2012-10-01

    Benchmarking is a structured, continuous, collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institutional-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal.

  9. Critical endpoint behavior in an asymmetric Ising model: application of Wang-Landau sampling to calculate the density of states.

    Science.gov (United States)

    Tsai, Shan-Ho; Wang, Fugao; Landau, D P

    2007-06-01

    Using the Wang-Landau sampling method with a two-dimensional random walk we determine the density of states for an asymmetric Ising model with two- and three-body interactions on a triangular lattice, in the presence of an external field. With an accurate density of states we were able to map out the phase diagram accurately and perform quantitative finite-size analyses at, and away from, the critical endpoint. We observe a clear divergence of the curvature of the spectator phase boundary and of the magnetization coexistence diameter derivative at the critical endpoint, and the exponents for both divergences agree well with previous theoretical predictions.
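
    For orientation, the Wang-Landau iteration itself is short: random-walk in energy, accept moves with probability min(1, g(E_old)/g(E_new)), accumulate ln g, and refine the modification factor whenever the energy histogram is flat. The sketch below implements it for the standard two-body 2D Ising model, a simpler cousin of the asymmetric two- and three-body model studied here.

      import numpy as np

      L, rng = 8, np.random.default_rng(1)
      N = L * L
      spins = rng.choice([-1, 1], size=(L, L))

      def total_energy(s):
          return -int(np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1))))

      def idx(E):                       # energies run -2N, -2N+4, ..., 2N
          return (E + 2 * N) // 4

      lng, hist = np.zeros(N + 1), np.zeros(N + 1)   # ln g(E) and visit histogram
      f, E = 1.0, total_energy(spins)                # f is the ln-modification factor

      while f > 1e-3:                   # loosened from the usual 1e-8 for speed
          for _ in range(20000):
              i, j = rng.integers(L), rng.integers(L)
              nb = (spins[(i+1) % L, j] + spins[(i-1) % L, j]
                    + spins[i, (j+1) % L] + spins[i, (j-1) % L])
              dE = 2 * spins[i, j] * nb
              if rng.random() < np.exp(lng[idx(E)] - lng[idx(E + dE)]):
                  spins[i, j] *= -1
                  E += dE
              lng[idx(E)] += f
              hist[idx(E)] += 1
          visited = hist[hist > 0]      # levels near the band edges never occur
          if visited.min() > 0.8 * visited.mean():
              hist[:] = 0
              f /= 2                    # g -> sqrt(g) refinement
      print("relative ln g at the ground state:", lng[idx(-2 * N)] - lng[lng > 0].min())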

  11. 2001 benchmarking guide.

    Science.gov (United States)

    Hoppszallern, S

    2001-01-01

    Our fifth annual guide to benchmarking under managed care presents data that is a study in market dynamics and adaptation. New this year are financial indicators on HMOs exiting the market and those remaining. Hospital financial ratios and details on department performance are included. The physician group practice numbers show why physicians are scrutinizing capitated payments. Overall, hospitals in markets with high managed care penetration are more successful in managing labor costs and show productivity gains in imaging services, physical therapy and materials management.

  12. Measurement, Standards, and Peer Benchmarking: One Hospital's Journey.

    Science.gov (United States)

    Martin, Brian S; Arbore, Mark

    2016-04-01

    Peer-to-peer benchmarking is an important component of rapid-cycle performance improvement in patient safety and quality-improvement efforts. Institutions should carefully examine critical success factors before engagement in peer-to-peer benchmarking in order to maximize growth and change opportunities. Solutions for Patient Safety has proven to be a high-yield engagement for Children's Hospital of Pittsburgh of University of Pittsburgh Medical Center, with measurable improvement in both organizational process and culture.

  13. Evaluation of the benchmark dose for point of departure determination for a variety of chemical classes in applied regulatory settings.

    Science.gov (United States)

    Izadi, Hoda; Grundy, Jean E; Bose, Ranjan

    2012-05-01

    Repeated-dose studies received by the New Substances Assessment and Control Bureau (NSACB) of Health Canada are used to provide hazard information toward risk calculation. These studies provide a point of departure (POD), traditionally the NOAEL or LOAEL, which is used to extrapolate the quantity of substance above which adverse effects can be expected in humans. This project explored the use of benchmark dose (BMD) modeling as an alternative to this approach for studies with few dose groups. Continuous data from oral repeated-dose studies for chemicals previously assessed by NSACB were reanalyzed using U.S. EPA benchmark dose software (BMDS) to determine the BMD and BMD 95% lower confidence limit (BMDL(05)) for each endpoint critical to NOAEL or LOAEL determination for each chemical. Endpoint-specific benchmark dose-response levels, indicative of adversity, were consistently applied. An overall BMD and BMDL(05) were calculated for each chemical using the geometric mean. The POD obtained from benchmark analysis was then compared with the traditional toxicity thresholds originally used for risk assessment. The BMD and BMDL(05) generally were higher than the NOAEL, but lower than the LOAEL. BMDL(05) was generally constant at 57% of the BMD. The benchmark approach provided a clear advantage in health risk assessment when a LOAEL was the only POD identified, or when dose groups were widely distributed. Although the benchmark method cannot always be applied, in the selected studies with few dose groups it provided a more accurate estimate of the real no-adverse-effect level of a substance.

  14. Entropy-based benchmarking methods

    OpenAIRE

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of the Denton (1971) method and the growth preservation method of Causey and Trager (1981) may violate this principle, while its requirements are explicitly taken into account in the proposed entropy-based benchmarking methods. Our illustrati...

  15. Criticality calculation of the nuclear material warehouse of the ININ; Calculo de criticidad del almacen del material nuclear del ININ

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, T.; Angeles, A.; Flores C, J., E-mail: teodoro.garcia@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    In this work, the nuclear safety conditions of the nuclear fuel warehouse of the TRIGA Mark III reactor of the Instituto Nacional de Investigaciones Nucleares (ININ) were determined for both normal conditions and the accident event. The warehouse contains standard fuel elements LEU 8.5/20, a control rod with a follower of standard LEU 8.5/20 fuel, fuel elements LEU 30/20, and the reactor fuel Sur-100. To check the subcritical state of the warehouse, the effective multiplication factor (keff) was calculated. The keff calculation was carried out with the code MCNPX. (Author)
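
    The quantity being checked is the usual effective multiplication factor; for orientation, in one-group diffusion theory for a bare system it reduces to the textbook expression (not the continuous-energy MCNPX model actually used here):

      k_{\mathrm{eff}} = \frac{\nu \Sigma_f}{\Sigma_a + D B_g^2}, \qquad B_g^2 = (\pi / \tilde{R})^2 \ \text{for a bare sphere of extrapolated radius } \tilde{R},

    where \nu is the mean number of neutrons per fission, \Sigma_f and \Sigma_a are the macroscopic fission and absorption cross sections, and D is the diffusion coefficient; the warehouse is acceptably subcritical when k_eff remains below the applicable upper subcritical limit.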

  16. Calculation of critical index $\eta$ of the $\varphi^3$-theory in 4-loop approximation by the conformal bootstrap technique

    CERN Document Server

    Pismensky, Artem L

    2015-01-01

    A method of calculating the $\varepsilon$-expansion in the model of a scalar field with $\varphi^3$-interaction based on conformal bootstrap equations is proposed. This technique is based on self-consistent skeleton equations involving the full propagator and the full triple vertex. Analytical computations of Fisher's index $\eta$ are performed in the four-loop approximation. The three-loop result coincides with the one obtained previously by the renormalization group equations technique, which is based on the calculation of a larger number of Feynman diagrams. The four-loop result agrees with its numerical value obtained by other authors.

  17. Design and implementation of a web-based reporting and benchmarking center for inpatient glucometrics.

    Science.gov (United States)

    Maynard, Greg; Schnipper, Jeffrey Lawrence; Messler, Jordan; Ramos, Pedro; Kulasa, Kristen; Nolan, Ann; Rogers, Kendall

    2014-07-01

    Insulin is a top source of adverse drug events in the hospital, and glycemic control is a focus of improvement efforts across the country. Yet, the majority of hospitals have no data to gauge their performance on glycemic control, hypoglycemia rates, or hypoglycemic management. Current tools to outsource glucometrics reports are limited in availability or function. Society of Hospital Medicine (SHM) faculty designed and implemented a web-based data and reporting center that calculates glucometrics on blood glucose data files securely uploaded by users. Unit labels, care type (critical care, non-critical care), and unit type (eg, medical, surgical, mixed, pediatrics) are defined on upload allowing for robust, flexible reporting. Reports for any date range, care type, unit type, or any combination of units are available on demand for review or downloading into a variety of file formats. Four reports with supporting graphics depict glycemic control, hypoglycemia, and hypoglycemia management by patient day or patient stay. Benchmarking and performance ranking reports are generated periodically for all hospitals in the database. In all, 76 hospitals have uploaded at least 12 months of data for non-critical care areas and 67 sites have uploaded critical care data. Critical care benchmarking reveals wide variability in performance. Some hospitals achieve top quartile performance in both glycemic control and hypoglycemia parameters. This new web-based glucometrics data and reporting tool allows hospitals to track their performance with a flexible reporting system, and provides them with external benchmarking. Tools like this help to establish standardized glucometrics and performance standards.
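
    The glucometrics described reduce to aggregations over (patient, day, unit-type) groupings of glucose readings. A minimal sketch of the patient-day style of calculation follows; the column names, thresholds and sample values are invented, since the SHM tool's actual schema is not given in the abstract.

      import pandas as pd

      # Invented point-of-care glucose readings (mg/dL).
      df = pd.DataFrame({
          "patient":   ["p1", "p1", "p2", "p2", "p2"],
          "date":      ["2014-01-01"] * 5,
          "unit_type": ["critical", "critical", "non-critical",
                        "non-critical", "non-critical"],
          "glucose":   [250, 65, 140, 180, 320],
      })

      day = df.groupby(["patient", "date", "unit_type"])["glucose"]
      summary = pd.DataFrame({
          "mean_glucose": day.mean(),
          "hypo_lt70":    day.min() < 70,    # any hypoglycemic reading that day
          "hyper_gt299":  day.max() > 299,   # severe hyperglycemia flag
      })
      # Percent of patient-days affected, reported separately per care type:
      print(summary.groupby(level="unit_type")[["hypo_lt70", "hyper_gt299"]].mean() * 100)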

  18. A study of dynamic finite size scaling behavior of the scaling functions—calculation of dynamic critical index of Wolff algorithm

    Science.gov (United States)

    Gündüç, Semra; Dilaver, Mehmet; Aydın, Meral; Gündüç, Yiğit

    2005-02-01

    In this work we have studied the dynamic scaling behavior of two scaling functions and we have shown that scaling functions obey the dynamic finite size scaling rules. Dynamic finite size scaling of scaling functions opens possibilities for a wide range of applications. As an application we have calculated the dynamic critical exponent (z) of Wolff's cluster algorithm for 2-, 3- and 4-dimensional Ising models. Configurations with vanishing initial magnetization are chosen in order to avoid complications due to initial magnetization. The observed dynamic finite size scaling behavior during early stages of the Monte Carlo simulation yields z for Wolff's cluster algorithm for 2-, 3- and 4-dimensional Ising models with vanishing values which are consistent with the values obtained from the autocorrelations. Especially, the vanishing dynamic critical exponent we obtained for d=3 implies that the Wolff algorithm is more efficient in eliminating critical slowing down in Monte Carlo simulations than previously reported.
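
    For reference, the Wolff move whose dynamic exponent is being measured grows a single cluster of aligned spins with bond probability p = 1 - exp(-2*beta*J) and flips it in one step. A compact Python sketch for the 2D Ising model at its critical coupling (our own minimal implementation, not the authors' code):

      import numpy as np

      def wolff_update(spins, beta, rng):
          """One Wolff cluster flip on a periodic L x L Ising lattice (J = 1)."""
          L = spins.shape[0]
          p_add = 1.0 - np.exp(-2.0 * beta)
          i, j = int(rng.integers(L)), int(rng.integers(L))
          seed = spins[i, j]
          cluster, stack = {(i, j)}, [(i, j)]
          while stack:
              x, y = stack.pop()
              for nx, ny in (((x+1) % L, y), ((x-1) % L, y),
                             (x, (y+1) % L), (x, (y-1) % L)):
                  if (nx, ny) not in cluster and spins[nx, ny] == seed \
                          and rng.random() < p_add:
                      cluster.add((nx, ny))
                      stack.append((nx, ny))
          for x, y in cluster:
              spins[x, y] *= -1              # flip the whole cluster at once
          return len(cluster)

      rng = np.random.default_rng(0)
      spins = rng.choice([-1, 1], size=(32, 32))
      beta_c = 0.5 * np.log(1.0 + np.sqrt(2.0))   # 2D critical coupling
      sizes = [wolff_update(spins, beta_c, rng) for _ in range(1000)]
      print("mean cluster size:", np.mean(sizes))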

  19. HPC Benchmark Suite NMx Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  20. Critical Length Criterion and the Arc Chain Model for Calculating the Arcing Time of the Secondary Arc Related to AC Transmission Lines

    Science.gov (United States)

    Cong, Haoxi; Li, Qingmin; Xing, Jinyuan; Li, Jinsong; Chen, Qiang

    2015-06-01

    The prompt extinction of the secondary arc is critical to the single-phase reclosing of AC transmission lines, including half-wavelength power transmission lines. In this paper, a low-voltage physical experimental platform was established and the motion process of the secondary arc was recorded by a high-speed camera. It was found that the arcing time of the secondary arc showed a close relationship with its arc length. Through an input and output power energy analysis of the secondary arc, a new critical length criterion for the arcing time was proposed. The arc chain model was then adopted to calculate the arcing time with both the traditional and the proposed critical length criteria, and the simulation results were compared with the experimental data. The study showed that the arcing time calculated from the new critical length criterion gave more accurate results, which can provide a reliable criterion in terms of arcing time for modeling and simulation of the secondary arc related to power transmission lines. Supported by National Natural Science Foundation of China (Nos. 51277061 and 51420105011)

  1. Calculation of Theoretical and Empirical Nutrient N Critical Loads in the Mixed Conifer Ecosystems of Southern California

    OpenAIRE

    2007-01-01

    Edaphic, foliar, and hydrologic forest nutrient status indicators from 15 mixed conifer forest stands in the Sierra Nevada, San Gabriel Mountains, and San Bernardino National Forest were used to estimate empirical or theoretical critical loads (CL) for nitrogen (N) as a nutrient. Soil acidification response to N deposition was also evaluated. Robust empirical relationships were found relating N deposition to plant N uptake (N in foliage), N fertility (litter C/N ratio), and soil acidification...

  2. Differences between directly measured and calculated values for cardiac output in the dogfish: a criticism of the Fick method.

    Science.gov (United States)

    Metcalfe, J D; Butler, P J

    1982-08-01

    Cardiac output has been measured directly, and calculated by the Fick method, during normoxia and hypoxia in six artificially perfused dogfish (Scyliorhinus canicula) in an attempt to estimate the accuracy of this method in fish. The construction and operation of a simple extra-corporeal cardiac bypass pump is described. This pump closely mimics the flow pulse profiles of the fish's own heart and allows complete control of both cardiac stroke volume and systolic and diastolic periods. During normoxia (PO2 = 21 kPa) there was no significant difference between directly measured and calculated values for cardiac output. However, some shunting of blood past the respiratory surface of the gills may have been obscured by cutaneous oxygen uptake. In response to hypoxia (PO2 = 8.6 kPa) there is either a decrease in the amount of blood being shunted past the respiratory surface of the gills and/or an increase in cutaneous oxygen uptake such that the Fick calculated value for cardiac output is on average 38% greater than the measured value. It is proposed that the increase in the levels of circulating catecholamines that is reported to occur in response to hypoxia in this species may play an important role in the observed response to hypoxia. The results are discussed in terms of their implications for the calculation of cardiac output by the Fick principle in fish.
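
    For readers outside comparative physiology, the Fick estimate under test computes cardiac output from oxygen uptake and the arterio-venous oxygen content difference,

      \dot{Q} = \frac{\dot{V}_{O_2}}{C_{aO_2} - C_{\bar{v}O_2}},

    so any oxygen uptake that bypasses the gills (cutaneous uptake) or blood that bypasses the respiratory exchange surface (shunting) inflates the calculated \dot{Q}, which is precisely the 38% overestimate reported under hypoxia.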

  3. Evaluation of PWR and BWR pin cell benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Pilgroms, B.J.; Gruppelaar, H.; Janssen, A.J. (Netherlands Energy Research Foundation (ECN), Petten (Netherlands)); Hoogenboom, J.E.; Leege, P.F.A. de (Interuniversitair Reactor Inst., Delft (Netherlands)); Voet, J. van der (Gemeenschappelijke Kernenergiecentrale Nederland NV, Dodewaard (Netherlands)); Verhagen, F.C.M. (Keuring van Electrotechnische Materialen NV, Arnhem (Netherlands))

    1991-12-01

    Benchmark results of the Dutch PINK working group on the PWR and BWR pin cell calculational benchmark as defined by EPRI are presented and evaluated. The observed discrepancies are problem dependent: a part of the results is satisfactory, some other results require further analysis. A brief overview is given of the different code packages used in this analysis. (author). 14 refs.; 9 figs.; 30 tabs.

  4. Calculation and Research of Hydroplaning Critical Velocity%临界滑水速度的计算研究

    Institute of Scientific and Technical Information of China (English)

    李强; 张卓; 张立

    2011-01-01

    Using the momentum theorem, the hydroplaning critical velocity is calculated for two cases: a small wedge angle (less than 0.4°) and a larger wedge angle. Taking cars, medium-duty vehicles and trucks as examples, the relationships between tire inflation pressure, water film thickness and hydroplaning velocity are analysed. The results show that, whether the wedge angle is small or large, the hydroplaning velocity is proportional to the tire inflation pressure; when the wedge angle is large and the tire pressure is held constant, the hydroplaning velocity is inversely proportional to the water film thickness. The theoretically calculated hydroplaning velocities are verified against the NASA hydroplaning speed equation and found to be acceptably reliable.
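
    The NASA relation used for the verification is, in its commonly quoted form (Horne's equation, cited here from the general literature since the abstract does not restate it),

      V_p \approx 10.35 \sqrt{p},

    with V_p the hydroplaning speed in mph and p the tire inflation pressure in psi; that is, hydroplaning speed scales with the square root of inflation pressure.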

  5. Benchmarking foreign electronics technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bostian, C.W.; Hodges, D.A.; Leachman, R.C.; Sheridan, T.B.; Tsang, W.T.; White, R.M.

    1994-12-01

    This report has been drafted in response to a request from the Japanese Technology Evaluation Center's (JTEC) Panel on Benchmarking Select Technologies. Since April 1991, the Competitive Semiconductor Manufacturing (CSM) Program at the University of California at Berkeley has been engaged in a detailed study of quality, productivity, and competitiveness in semiconductor manufacturing worldwide. The program is a joint activity of the College of Engineering, the Haas School of Business, and the Berkeley Roundtable on the International Economy, under sponsorship of the Alfred P. Sloan Foundation, and with the cooperation of semiconductor producers from Asia, Europe and the United States. Professors David A. Hodges and Robert C. Leachman are the project's Co-Directors. The present report for JTEC is primarily based on data and analysis drawn from that continuing program. The CSM program is being conducted by faculty, graduate students and research staff from UC Berkeley's Schools of Engineering and Business, and Department of Economics. Many of the participating firms are represented on the program's Industry Advisory Board. The Board played an important role in defining the research agenda. A pilot study was conducted in 1991 with the cooperation of three semiconductor plants. The research plan and survey documents were thereby refined. The main phase of the CSM benchmarking study began in mid-1992 and will continue at least through 1997. Reports are presented on the manufacture of integrated circuits; data storage; wireless technology; human-machine interfaces; and optoelectronics. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  6. Benchmarking monthly homogenization algorithms

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve

  7. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    was its software assurance practices, which seemed to rate well in comparison to the other organizational groups and also seemed to include a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, are items that can be implemented to improve NASA's software engineering practices and to help address many of the items that were listed in the NASA top software engineering issues. 1. Develop and implement standard contract language for software procurements. 2. Advance accurate and trusted software cost estimates for both procured and in-house software and improve the capture of actual cost data to facilitate further improvements. 3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort. 4. Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA. 5

  8. A PWR Thorium Pin Cell Burnup Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Kevan Dean; Zhao, X.; Pilat, E. E; Hejzlar, P.

    2000-05-01

    As part of work to evaluate the potential benefits of using thorium in LWR fuel, a thorium fueled benchmark comparison was made in this study between state-of-the-art codes, MOCUP (MCNP4B + ORIGEN2), and CASMO-4 for burnup calculations. The MOCUP runs were done individually at MIT and INEEL, using the same model but with some differences in techniques and cross section libraries. Eigenvalue and isotope concentrations were compared on a PWR pin cell model up to high burnup. The eigenvalue comparison as a function of burnup is good: the maximum difference is within 2% and the average absolute difference less than 1%. The isotope concentration comparisons are better than a set of MOX fuel benchmarks and comparable to a set of uranium fuel benchmarks reported in the literature. The actinide and fission product data sources used in the MOCUP burnup calculations for a typical thorium fuel are documented. Reasons for code vs code differences are analyzed and discussed.

  9. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 2-Sequoyah Unit 2 Cycle 3

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations reported herein is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k_eff) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of three reactor critical configurations for the Sequoyah Unit 2 Cycle 3. This unit and cycle were chosen because of their relevance to spent fuel benchmark applications: (1) the unit had a significantly long downtime of 2.7 years during the middle of cycle (MOC) 3, and (2) the core consisted entirely of burned fuel at the MOC restart. The first benchmark critical calculation was the MOC restart at hot, full-power (HFP) critical conditions. The

  10. Benchmark job – Watch out!

    CERN Document Server

    Staff Association

    2017-01-01

    On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...

  11. Internal Benchmarking for Institutional Effectiveness

    Science.gov (United States)

    Ronco, Sharron L.

    2012-01-01

    Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multicampuses or a…

  12. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of the Denton (1971) method and the growth pre

  13. Specification for the VERA Depletion Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The CASL neutronics simulator MPACT is under development for the neutronics and T-H coupled simulation of the pressurized water reactor. MPACT includes the ORIGEN-API and an internal depletion module to perform depletion calculations based upon neutron-material reactions and radioactive decay. Validating the depletion capability is a challenge because measured data are insufficient. One indirect way to validate it is to perform code-to-code comparisons for benchmark problems. In this study a depletion benchmark suite has been developed and a detailed guideline has been provided to obtain meaningful computational outcomes which can be used in the validation of the MPACT depletion capability.

  14. Scale-4 analysis of pressurized water reactor critical configurations: Volume 5, North Anna Unit 1 Cycle 5

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M. [Oak Ridge National Lab., TN (United States); Suto, T. [Power Reactor and Nuclear Fuel Development Corp., Tokyo (Japan)]|[Oak Ridge National Lab., TN (United States)

    1996-10-01

    ANSI/ANS 8.1 requires that calculational methods for away-from-reactor (AFR) criticality safety analyses be validated against experiment. This report summarizes part of the ongoing effort to benchmark AFR criticality analysis methods using selected critical configurations from commercial PWRs. Codes and data in the SCALE-4 code system were used. This volume documents the SCALE system analysis of one reactor critical configuration for North Anna Unit 1 Cycle 5. The KENO V.a criticality calculations for the North Anna 1 Cycle 5 beginning-of-cycle model yielded a value for k_eff of 1.0040 ± 0.0005.

  15. INTEGRAL BENCHMARK DATA FOR NUCLEAR DATA TESTING THROUGH THE ICSBEP AND THE NEWLY ORGANIZED IRPHEP

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Lori Scott; Yolanda Rugama; Enrico Satori

    2007-04-01

    The status of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) was last reported to the nuclear data community at the International Conference on Nuclear Data for Science and Technology, ND-2004, in Santa Fe, New Mexico. Since that time the number and type of integral benchmarks have increased significantly. Included in the ICSBEP Handbook are criticality-alarm / shielding and fundamental physics benchmarks in addition to the traditional critical / subcritical benchmark data. Since ND-2004, a reactor physics counterpart to the ICSBEP, the International Reactor Physics Experiment Evaluation Project (IRPhEP), was initiated. The IRPhEP is patterned after the ICSBEP, but focuses on other integral measurements, such as buckling, spectral characteristics, reactivity effects, reactivity coefficients, kinetics measurements, reaction-rate and power distributions, nuclide compositions, and other miscellaneous-type measurements in addition to the critical configuration. The status of these two projects is discussed and selected benchmarks highlighted in this paper.

  16. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 3-Surry Unit 1 Cycle 2

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using selected critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations in this report is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and to provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k_eff) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of two reactor critical configurations for Surry Unit 1 Cycle 2. This unit and cycle were chosen for a previous analysis using a different methodology because detailed isotopics from multidimensional reactor calculations were available from the Virginia Power Company. These data permitted a direct comparison of criticality calculations using the utility-calculated isotopics with those using the isotopics generated by the SCALE-4

  17. Calculation of the absorbed dose for the overexposed patients at the JCO criticality accident in Tokai-mura.

    Science.gov (United States)

    Ishigure, N; Endo, A; Yamaguchi, Y; Kawachi, K

    2001-09-01

    The doses for the overexposed patients were estimated from the measured specific activity of 24Na in blood. The method is largely based on documents of the International Atomic Energy Agency (IAEA) and the Oak Ridge National Laboratory. The neutron energy spectrum obtained using the ANISN code (Multigroup One-Dimensional Discrete Ordinates Transport Code System with Anisotropic Scattering) was assumed. The values in ICRP Publication 74 were applied for the doses in each organ per unit neutron fluence. The gamma-ray dose was indirectly estimated based on (a) the result of environmental monitoring around the accident site and (b) a graph in the IAEA manual, which gives the kerma ratio of neutrons and gamma-rays as a function of the critical volume or the atomic ratio of hydrogen to 235U. The estimated neutron doses were 5.4 Gy for patient A, 2.9 Gy for patient B and 0.81 Gy for patient C. The estimated gamma-ray doses were 8.5 or 13 Gy for patient A, 4.5 or 6.9 Gy for patient B, and 1.3 or 2.0 Gy for patient C.
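
    The blood-sodium method rests on the standard activation relation: stable 23Na in the body captures neutrons, so the induced 24Na activity is a fluence monitor. Schematically, in our notation,

      A_{^{24}\mathrm{Na}} = N_{^{23}\mathrm{Na}} \left( \int \sigma_{(n,\gamma)}(E)\, \varphi(E)\, dE \right) \left( 1 - e^{-\lambda t_{\mathrm{irr}}} \right),

    so with the ANISN-calculated spectrum fixing the shape of \varphi(E) and the ICRP Publication 74 coefficients converting fluence to organ dose, the measured specific activity scales directly to the neutron dose.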

  18. BENCHMARKING ON-LINE SERVICES INDUSTRIES

    Institute of Scientific and Technical Information of China (English)

    John HAMILTON

    2006-01-01

    The Web Quality Analyser (WQA) is a new benchmarking tool for industry. It has been extensively tested across services industries. Forty-five critical success features are presented as measures that capture the user's perception of services industry websites. This tool differs from previous tools, in that it captures the information technology (IT) related driver sectors of website performance, along with the marketing-services related driver sectors. These driver sectors capture relevant structure, function and performance components. An 'on-off' switch measurement approach determines each component. Relevant component measures scale into a relative presence of the applicable feature, with a feature block delivering one of the sector drivers. Although it houses both measurable and a few subjective components, the WQA offers a proven and useful means to compare relevant websites. The WQA defines website strengths and weaknesses, thereby allowing for corrections to the website structure of the specific business. WQA benchmarking against services related business competitors delivers a position on the WQA index, facilitates specific website driver rating comparisons, and demonstrates where key competitive advantage may reside. This paper reports on the marketing-services driver sectors of this new benchmarking WQA tool.

  19. Benchmarking of energy time series

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, M.A.

    1990-04-01

    Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.
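
    A concrete instance of such an adjustment is Denton-type movement preservation: choose the adjusted monthly series x closest to the indicator s in first differences, subject to each year summing to its annual benchmark. A small self-contained sketch using plain linear algebra (our own illustration, not EIA's production procedure):

      import numpy as np

      def denton_additive(s, benchmarks, per=12):
          """Additive first-difference Denton benchmarking to annual totals."""
          s = np.asarray(s, dtype=float)
          n, k = len(s), len(benchmarks)
          D = (np.eye(n) - np.eye(n, k=-1))[1:]          # first-difference operator
          C = np.kron(np.eye(k), np.ones((1, per)))      # annual aggregation matrix
          A = D.T @ D
          # Equality-constrained least squares solved via the KKT system.
          KKT = np.block([[A, C.T], [C, np.zeros((k, k))]])
          rhs = np.concatenate([A @ s, np.asarray(benchmarks, dtype=float)])
          return np.linalg.solve(KKT, rhs)[:n]

      s = 10.0 + np.sin(np.linspace(0.0, 4.0, 24))       # 2 years of monthly data
      x = denton_additive(s, [s[:12].sum() + 3.0, s[12:].sum() - 2.0])
      print(np.round(x - s, 3))   # smooth adjustments summing to +3 and -2 per year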

  20. Benchmarking in academic pharmacy departments.

    Science.gov (United States)

    Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann

    2010-10-11

    Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.

  1. Benchmarking biofuels; Biobrandstoffen benchmarken

    Energy Technology Data Exchange (ETDEWEB)

    Croezen, H.; Kampman, B.; Bergsma, G.

    2012-03-15

    A sustainability benchmark for transport biofuels has been developed and used to evaluate the various biofuels currently on the market. For comparison, electric vehicles, hydrogen vehicles and petrol/diesel vehicles were also included. A range of studies as well as growing insight are making it ever clearer that biomass-based transport fuels may have just as big a carbon footprint as fossil fuels like petrol or diesel, or even bigger. At the request of Greenpeace Netherlands, CE Delft has brought together current understanding on the sustainability of fossil fuels, biofuels and electric vehicles, with particular focus on the performance of the respective energy carriers on three sustainability criteria, with the first weighing the heaviest: (1) Greenhouse gas emissions; (2) Land use; and (3) Nutrient consumption.

  2. Criticality Calculations of Fresh LEU and MOX Assemblies for Transport and Storage at the Balakovo Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Goluoglu, S.

    2001-01-11

    Transportation of low-enriched uranium (LEU) and mixed-oxide (MOX) assemblies to and within the VVER-1000-type Balakovo Nuclear Power Plant is investigated. Effective multiplication factors for fresh fuel assemblies on the railroad platform, fresh fuel assemblies in the fuel transportation vehicle, and fresh fuel assemblies in the spent fuel storage pool are calculated. If there is no absorber between the units, the configurations with all MOX assemblies result in higher effective multiplication factors than the configurations with all LEU assemblies when the system is dry. When the system is flooded, the configurations with all LEU assemblies result in higher effective multiplication factors. For normal operating conditions, effective multiplication factors for all configurations are below the presumed upper subcritical limit of 0.95. For an accident condition of a fully loaded fuel transportation vehicle that is flooded with low-density water (possibly from a fire suppression system), the presumed upper subcritical limit is exceeded by configurations containing LEU assemblies.

  3. Benchmark Solutions for Computational Aeroacoustics (CAA) Code Validation

    Science.gov (United States)

    Scott, James R.

    2004-01-01

    NASA has conducted a series of Computational Aeroacoustics (CAA) Workshops on Benchmark Problems to develop a set of realistic CAA problems that can be used for code validation. In the Third (1999) and Fourth (2003) Workshops, the single airfoil gust response problem, with real geometry effects, was included as one of the benchmark problems. Respondents were asked to calculate the airfoil RMS pressure and far-field acoustic intensity for different airfoil geometries and a wide range of gust frequencies. This paper presents the validated solutions that have been obtained for the benchmark problem and compares them with classical flat plate results. It is seen that airfoil geometry has a strong effect on the airfoil unsteady pressure, and a significant effect on the far-field acoustic intensity. Those parts of the benchmark problem that have not yet been adequately solved are identified and presented as a challenge to the CAA research community.

  4. Benchmarking in water project analysis

    Science.gov (United States)

    Griffin, Ronald C.

    2008-11-01

    The with/without principle of cost-benefit analysis is examined for the possible bias that it brings to water resource planning. Theory and examples for this question are established. Because benchmarking against the demonstrably low without-project hurdle can detract from economic welfare and can fail to promote efficient policy, improvement opportunities are investigated. In lieu of the traditional, without-project benchmark, a second-best-based "difference-making benchmark" is proposed. The project authorizations and modified review processes instituted by the U.S. Water Resources Development Act of 2007 may provide for renewed interest in these findings.

  5. Integral Benchmark Data for Nuclear Data Testing Through the ICSBEP & IRPhEP

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; John D. Bess; Jim Gulliford; Ian Hill

    2013-10-01

    The status of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and International Reactor Physics Experiment Evaluation Project (IRPhEP) was last discussed directly with the nuclear data community at ND2007. Since ND2007, integral benchmark data that are available for nuclear data testing have increased significantly. The status of the ICSBEP and the IRPhEP is discussed and selected benchmark configurations that have been added to the ICSBEP and IRPhEP Handbooks since ND2007 are highlighted.

  6. Benchmarking Is Associated With Improved Quality of Care in Type 2 Diabetes

    OpenAIRE

    Hermans, Michel; Elisaf, Moses; Michel, Georges; Muls, Erik; Nobels, Frank; Vandenberghe, Hans; Brotons, Carlos; OPTIMISE (OPtimal Type 2 dIabetes Management Including benchmarking and Standard trEatment) International Steering Committee.

    2013-01-01

    OBJECTIVE: To assess prospectively the effect of benchmarking on quality of primary care for patients with type 2 diabetes by using three major modifiable cardiovascular risk factors as critical quality indicators. RESEARCH DESIGN AND METHODS: Primary care physicians treating patients with type 2 diabetes in six European countries were randomized to give standard care (control group) or standard care with feedback benchmarked against other centers in each country (benchmarking group). In both...

  7. Assessing reactor physics codes capabilities to simulate fast reactors on the example of the BN-600 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Vladimir [Scientific and Engineering Centre for Nuclear and Radiation Safety (SES NRS), Moscow (Russian Federation); Bousquet, Jeremy [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    This work aims to assess the capabilities of reactor physics codes (initially validated for thermal reactors) to simulate fast sodium-cooled reactors. The BFS-62-3A critical experiment from the BN-600 Hybrid Core Benchmark Analyses was chosen for the investigation. Monte-Carlo codes (KENO from SCALE and SERPENT 2.1.23) and the deterministic diffusion code DYN3D-MG are applied to calculate the neutronic parameters. It was found that the multiplication factor and reactivity effects calculated by KENO and SERPENT using the ENDF/B-VII.0 continuous energy library are in good agreement with each other and with the measured benchmark values. Few-group macroscopic cross sections, required for DYN3D-MG, were prepared by applying different methods implemented in SCALE and SERPENT. The DYN3D-MG results for a simplified benchmark show reasonable agreement with the results from Monte-Carlo calculations and the measured values. These results justify the use of DYN3D-MG in coupled deterministic analyses of sodium-cooled fast reactors.

  8. Current Reactor Physics Benchmark Activities at the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess; Margaret A. Marshall; Mackenzie L. Gorham; Joseph Christensen; James C. Turnbull; Kim Clark

    2011-11-01

    The International Reactor Physics Experiment Evaluation Project (IRPhEP) [1] and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) [2] were established to preserve integral reactor physics and criticality experiment data for present and future research. These valuable assets provide the basis for recording, developing, and validating our integral nuclear data, and experimental and computational methods. These projects are managed through the Idaho National Laboratory (INL) and the Organisation for Economic Co-operation and Development Nuclear Energy Agency (OECD-NEA). Staff and students at the Department of Energy - Idaho (DOE-ID) and INL are engaged in the development of benchmarks to support ongoing research activities. These benchmarks include reactors or assemblies that support Next Generation Nuclear Plant (NGNP) research, space nuclear Fission Surface Power System (FSPS) design validation, and currently operational facilities in Southeastern Idaho.

  9. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

    In order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable...... tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly ‘sustainable transport......’ evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly policies are not directly comparable across space and context. For these reasons attempting to benchmark ‘sustainable transport policies’ against one another would be a highly complex task, which...

  10. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

    as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advances. However, whereas extant research primarily has focused on the importance and effects of using external benchmarks, less attention has been directed towards the conditions for the use of external benchmarks. We provide more insights into some of the issues and challenges that are related to using this mechanism for performance management and advancing competitiveness in organizations.

  11. Research on computer systems benchmarking

    Science.gov (United States)

    Smith, Alan Jay (Principal Investigator)

    1996-01-01

    This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance; the performance impact of optimization was studied in the context of our methodology for CPU performance characterization, based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the aforementioned accomplishments are summarized more specifically in this report, as well as those smaller in magnitude supported by this grant.
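
    The abstract-machine idea lends itself to a compact illustration. The sketch below (all operation names, costs, and counts are invented) estimates execution time by merging a machine characterization with a program characterization:

```python
# A minimal sketch of the abstract-machine approach described above: a machine
# is characterized by the cost of each abstract operation, a program by its
# operation counts, and run time is estimated as the dot product of the two.
# Operation names and numbers are illustrative, not from the original study.

machine_cost_ns = {"fadd": 1.5, "fmul": 2.0, "load": 4.0, "store": 4.5, "branch": 1.0}

program_counts = {"fadd": 2.0e9, "fmul": 1.5e9, "load": 3.1e9, "store": 0.9e9, "branch": 1.2e9}

est_seconds = sum(machine_cost_ns[op] * n for op, n in program_counts.items()) / 1e9
print(f"estimated execution time: {est_seconds:.2f} s")
```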

  12. SPOC Benchmark Case: SNRE Model

    Energy Technology Data Exchange (ETDEWEB)

    Vishal Patel; Michael Eades; Claude Russel Joyner II

    2016-02-01

    The Small Nuclear Rocket Engine (SNRE) was modeled in the Center for Space Nuclear Research’s (CSNR) Space Propulsion Optimization Code (SPOC). SPOC aims to create nuclear thermal propulsion (NTP) geometries quickly to perform parametric studies on design spaces of historic and new NTP designs. The SNRE geometry was modeled in SPOC and a critical core with a reasonable amount of criticality margin was found. The fuel, tie-tubes, reflector, and control drum masses were predicted rather well. These are all very important for neutronics calculations so the active reactor geometries created with SPOC can continue to be trusted. Thermal calculations of the average and hot fuel channels agreed very well. The specific impulse calculations used historically and in SPOC disagree so mass flow rates and impulses differed. Modeling peripheral and power balance components that do not affect nuclear characteristics of the core is not a feature of SPOC and as such, these components should continue to be designed using other tools. A full paper detailing the available SNRE data and comparisons with SPOC outputs will be submitted as a follow-up to this abstract.

  13. Benchmarks and Quality Assurance for Online Course Development in Higher Education

    Science.gov (United States)

    Wang, Hong

    2008-01-01

    As online education has entered the main stream of the U.S. higher education, quality assurance in online course development has become a critical topic in distance education. This short article summarizes the major benchmarks related to online course development, listing and comparing the benchmarks of the National Education Association (NEA),…

  14. Calculation of the critical velocity and the critical slope of tailings hydro-transport in open channels

    Institute of Scientific and Technical Information of China (English)

    胡去劣; 熊梦婕

    2011-01-01

    The minimum flow rate which can keep an open channel free of siltation is called the critical velocity; it is significant to the design of tailings hydro-transport. Starting from the energy-loss equation for muddy-water flow and the assumption of minimum energy loss under the critical condition, a formula for the critical velocity is derived by taking the mathematical extremum. Based on both theory and experiments, the derivation process is clear and correct. Many factors, such as the characteristic parameters of the open channel and of the two-phase flow, have been considered, which makes the formula comprehensive. It also agrees well with experiment: of 147 test records, about 94% of the calculated values lie within 10% relative error of the measurements, so the formula can be applied to the varied working conditions of tailings hydro-transport projects. In general, this research solves a specific engineering problem and will be useful for the design of the critical velocity and critical slope of tailings hydro-transport in open channels.
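
    The quoted agreement statistic can be reproduced mechanically; below is a minimal sketch with placeholder data (not the 147 experimental records of the study):

```python
# Sketch of the agreement statistic quoted above (about 94% of records within
# 10% relative error): compare computed critical velocities with measurements.
# The arrays below are hypothetical stand-ins for the experimental records.

measured   = [1.10, 1.25, 0.98, 1.40]  # m/s, hypothetical
calculated = [1.05, 1.30, 1.00, 1.55]  # m/s, hypothetical

within = sum(abs(c - m) / m < 0.10 for c, m in zip(calculated, measured))
print(f"{within}/{len(measured)} records within 10% relative error")
```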

  15. Climate Benchmark Missions: CLARREO

    Science.gov (United States)

    Wielicki, Bruce A.; Young, David F.

    2010-01-01

    CLARREO (Climate Absolute Radiance and Refractivity Observatory) is one of the four Tier 1 missions recommended by the recent NRC decadal survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to rigorously observe climate change on decade time scales and to use decadal change observations as the most critical method to determine the accuracy of climate change projections such as those used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4). A rigorously known accuracy of both decadal change observations as well as climate projections is critical in order to enable sound policy decisions. The CLARREO mission accomplishes this critical objective through highly accurate and SI traceable decadal change observations sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. The same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. The CLARREO breakthrough in decadal climate change observations is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. These accuracy levels are determined both by the projected decadal changes as well as by the background natural variability that such signals must be detected against. The accuracy for decadal change traceability to SI standards includes uncertainties of calibration, sampling, and analysis methods. Unlike most other missions, all of the CLARREO requirements are judged not by instantaneous accuracy, but instead by accuracy in large time/space scale average decadal changes. Given the focus on decadal climate change, the NRC Decadal Survey concluded that the single most critical issue for decadal change observations was their lack of accuracy and low confidence in

  16. Parallel computing method of Monte Carlo criticality calculation based on bi-directional traversal

    Institute of Scientific and Technical Information of China (English)

    李静; 宋婧; 龙鹏程; 刘鸿飞; 江平

    2015-01-01

    Background: Reactor simulations based on the Monte Carlo particle transport method, such as those of fission reactors and fusion-fission hybrid reactors, require large amounts of computational time to reach acceptable statistical errors; this has become one of the challenges of the Monte Carlo method and calls for parallel computing techniques. Purpose: This paper presents an efficient parallel computing method that resolves the communication deadlock and load-balancing problems of current methods. Methods: The parallel computing method for criticality calculation based on bi-directional traversal was implemented in the Super Monte Carlo simulation program for nuclear and radiation processes (SuperMC). The pool-type sodium-cooled fast reactor BN600 was used for benchmarking, with MCNP as the reference. Results: The serial and parallel calculations were in agreement with each other. Conclusion: The parallel efficiency of SuperMC is higher than that of MCNP, which demonstrates the accuracy and efficiency of the parallel computing method.
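
    Generation-wise parallelism of this kind can be illustrated generically. The sketch below splits the source histories of one fission generation across workers with Python's multiprocessing; it is a generic stand-in and does not reproduce the paper's bi-directional traversal scheme (the placeholder transport and weights are invented):

```python
# Generic sketch of parallelizing one fission generation of a Monte Carlo
# criticality calculation by splitting source histories across workers.
import random
from multiprocessing import Pool

def run_history(seed: int) -> int:
    """Placeholder transport: return the number of fission neutrons produced."""
    rng = random.Random(seed)
    return rng.choices([0, 1, 2, 3], weights=[30, 40, 20, 10])[0]

if __name__ == "__main__":
    n_histories = 100_000
    with Pool(4) as pool:
        offspring = pool.map(run_history, range(n_histories))
    k_est = sum(offspring) / n_histories  # keff estimate for this generation
    print(f"keff estimate for this generation: {k_est:.4f}")
```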

  17. Overview of the 2014 Edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook)

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess; J. Blair Briggs; Jim Gulliford; Ian Hill

    2014-10-01

    The International Reactor Physics Experiment Evaluation Project (IRPhEP) is a widely recognized world-class program. The work of the IRPhEP is documented in the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). Integral data from the IRPhEP Handbook are used by reactor safety and design, nuclear data, criticality safety, and analytical methods development specialists worldwide to perform necessary validations of their calculational techniques. The IRPhEP Handbook is among the most frequently cited references in the nuclear industry and is expected to be a valuable resource for future decades.

  18. Benchmarking of Proton Transport in Super Monte Carlo Simulation Program

    Science.gov (United States)

    Wang, Yongfeng; Li, Gui; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Wu, Yican

    2014-06-01

    The Monte Carlo (MC) method has been traditionally applied in nuclear design and analysis due to its capability of dealing with complicated geometries and multi-dimensional physics problems as well as obtaining accurate results. The Super Monte Carlo Simulation Program (SuperMC) is developed by FDS Team in China for fusion, fission, and other nuclear applications. The simulations of radiation transport, isotope burn-up, material activation, radiation dose, and biology damage could be performed using SuperMC. Complicated geometries and the whole physical process of various types of particles in broad energy scale can be well handled. Bi-directional automatic conversion between general CAD models and full-formed input files of SuperMC is supported by MCAM, which is a CAD/image-based automatic modeling program for neutronics and radiation transport simulation. Mixed visualization of dynamical 3D dataset and geometry model is supported by RVIS, which is a nuclear radiation virtual simulation and assessment system. Continuous-energy cross section data from hybrid evaluated nuclear data library HENDL are utilized to support simulation. Neutronic fixed-source and criticality calculations for reactors of complex geometry and material distribution, based on the transport of neutrons and photons, were already achieved in the former version of SuperMC. Recently, proton transport has also been integrated into SuperMC for energies up to 10 GeV. The physical processes considered for proton transport include electromagnetic processes and hadronic processes. The electromagnetic processes include ionization, multiple scattering, bremsstrahlung, and pair production processes. Public evaluated data from HENDL are used in some electromagnetic processes. In hadronic physics, the Bertini intra-nuclear cascade model with exitons, preequilibrium model, nucleus explosion model, fission model, and evaporation model are incorporated to treat the intermediate energy nuclear

  19. Assessment of Static Delamination Propagation Capabilities in Commercial Finite Element Codes Using Benchmark Analysis

    Science.gov (United States)

    Orifici, Adrian C.; Krueger, Ronald

    2010-01-01

    With capabilities for simulating delamination growth in composite materials becoming available, the need for benchmarking and assessing these capabilities is critical. In this study, benchmark analyses were performed to assess the delamination propagation simulation capabilities of the VCCT implementations in Marc and MD Nastran. Benchmark delamination growth results for Double Cantilever Beam, Single Leg Bending and End Notched Flexure specimens were generated using a numerical approach. This numerical approach was developed previously, and involves comparing results from a series of analyses at different delamination lengths to a single analysis with automatic crack propagation. Specimens were analyzed with three-dimensional and two-dimensional models, and compared with previous analyses using Abaqus. The results demonstrated that the VCCT implementation in Marc and MD Nastran was capable of accurately replicating the benchmark delamination growth results and that the use of the numerical benchmarks offers advantages over benchmarking using experimental and analytical results.

  20. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  1. Critical appraisal of assumptions in chains of model calculations used to project local climate impacts for adaptation decision support—the case of Baakse Beek

    Science.gov (United States)

    van der Sluijs, Jeroen P.; Arjan Wardekker, J.

    2015-04-01

    In order to enable anticipation and proactive adaptation, local decision makers increasingly seek detailed foresight about regional and local impacts of climate change. To this end, the Netherlands Models and Data-Centre implemented a pilot chain of sequentially linked models to project local climate impacts on hydrology, agriculture and nature under different national climate scenarios for a small region in the east of the Netherlands named Baakse Beek. The chain of models sequentially linked in that pilot includes a (future) weather generator and models of respectively subsurface hydrogeology, ground water stocks and flows, soil chemistry, vegetation development, crop yield and nature quality. These models typically have mismatching time step sizes and grid cell sizes. The linking of these models unavoidably involves the making of model assumptions that can hardly be validated, such as those needed to bridge the mismatches in spatial and temporal scales. Here we present and apply a method for the systematic critical appraisal of model assumptions that seeks to identify and characterize the weakest assumptions in a model chain. The critical appraisal of assumptions presented in this paper has been carried out ex-post. For the case of the climate impact model chain for Baakse Beek, the three most problematic assumptions were found to be: land use and land management kept constant over time; model linking of (daily) ground water model output to the (yearly) vegetation model around the root zone; and aggregation of daily output of the soil hydrology model into yearly input of a so called ‘mineralization reduction factor’ (calculated from annual average soil pH and daily soil hydrology) in the soil chemistry model. Overall, the method for critical appraisal of model assumptions presented and tested in this paper yields a rich qualitative insight in model uncertainty and model quality. It promotes reflectivity and learning in the modelling community, and leads to

  2. Benchmark experiment on vanadium assembly with D-T neutrons. In-situ measurement

    Energy Technology Data Exchange (ETDEWEB)

    Maekawa, Fujio; Kasugai, Yoshimi; Konno, Chikara; Wada, Masayuki; Oyama, Yukio; Ikeda, Yujiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Murata, Isao; Kokooo; Takahashi, Akito

    1998-03-01

    Fusion neutronics benchmark experimental data on vanadium were obtained for neutrons over almost the entire energy range, as well as for secondary gamma-rays. Benchmark calculations for the experiment were performed to investigate the validity of recent nuclear data files, i.e., JENDL Fusion File, FENDL/E-1.0 and EFF-3. (author)

  3. Randomized benchmarking of multiqubit gates.

    Science.gov (United States)

    Gaebler, J P; Meier, A M; Tan, T R; Bowler, R; Lin, Y; Hanneke, D; Jost, J D; Home, J P; Knill, E; Leibfried, D; Wineland, D J

    2012-06-29

    We describe an extension of single-qubit gate randomized benchmarking that measures the error of multiqubit gates in a quantum information processor. This platform-independent protocol evaluates the performance of Clifford unitaries, which form a basis of fault-tolerant quantum computing. We implemented the benchmarking protocol with trapped ions and found an error per random two-qubit Clifford unitary of 0.162±0.008, thus setting the first benchmark for such unitaries. By implementing a second set of sequences with an extra two-qubit phase gate inserted after each step, we extracted an error per phase gate of 0.069±0.017. We conducted these experiments with transported, sympathetically cooled ions in a multizone Paul trap, a system that can in principle be scaled to larger numbers of ions.
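
    For readers unfamiliar with how an error per Clifford is extracted, the sketch below fits the standard decay model F(m) = A·p^m + B to synthetic sequence fidelities and converts the decay parameter to an error rate. The data and starting values are invented, the paper's actual protocol is more involved, and scipy is assumed to be available:

```python
# Minimal randomized-benchmarking fit on synthetic data: fit the sequence
# fidelity F(m) = A * p**m + B over sequence length m, then convert the decay
# p to an average error per Clifford via r = (1 - p) * (d - 1) / d.
import numpy as np
from scipy.optimize import curve_fit

def decay(m, A, p, B):
    return A * p**m + B

m = np.arange(1, 21)
rng = np.random.default_rng(0)
F = 0.5 * 0.84**m + 0.25 + rng.normal(0, 0.01, m.size)  # synthetic fidelities

(A, p, B), _ = curve_fit(decay, m, F, p0=(0.5, 0.8, 0.25))
d = 4                      # Hilbert-space dimension for two qubits
r = (1 - p) * (d - 1) / d  # average error per Clifford
print(f"decay p = {p:.3f}, error per Clifford r = {r:.3f}")
```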

  4. Fast Neutron Fluence Calculation Benchmark Analysis Based on 3D MC-SN Bidirectional Coupling Method

    Institute of Scientific and Technical Information of China (English)

    袁龙军; 陈义学; 韩静茹

    2014-01-01

    The Monte Carlo (MC)-discrete ordinates (SN) bidirectional coupling method is an efficient approach to the shielding calculation of large, complex nuclear facilities. A test calculation was performed to validate the application of the 3D MC-SN bidirectional coupling method to the shielding calculation of a large PWR nuclear facility. Based on the NUREG/CR-6115 PWR benchmark model issued by the NRC, the 3D Monte Carlo code MCNP4C was employed to accurately simulate the structures from the core to the thermal shield and the dedicated models of the calculation parts located in the pressure vessel, while the 3D SN code TORT was used for the calculation from the thermal shield to the outer surface of the second down-comer region. The transformation between the particle probability distribution of MC and the angular flux density of SN was realized by an interface program to achieve the bidirectional coupling calculation. The calculation results were compared with the MCNP and DORT solutions of the benchmark report and satisfactory agreement was obtained, providing a preliminary validation of the feasibility of using the method to solve the shielding problems of large, complex nuclear facilities.

  5. Benchmark Analysis of Subcritical Noise Measurements on a Nickel-Reflected Plutonium Metal Sphere

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess; Jesson Hutchinson

    2009-09-01

    Subcritical experiments using californium source-driven noise analysis (CSDNA) and Feynman variance-to-mean methods were performed with an alpha-phase plutonium sphere reflected by nickel shells, up to a maximum thickness of 7.62 cm. Both methods provide means of determining the subcritical multiplication of a system containing nuclear material. A benchmark analysis of the experiments was performed for inclusion in the 2010 edition of the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Benchmark models have been developed that represent these subcritical experiments. An analysis of the computed eigenvalues and the uncertainty in the experiment and methods was performed. The eigenvalues computed using the CSDNA method were very close to those calculated using MCNP5; however, computed eigenvalues are used in the analysis of the CSDNA method. Independent calculations using KENO-VI provided similar eigenvalues to those determined using the CSDNA method and MCNP5. A slight trend with increasing nickel-reflector thickness was seen when comparing MCNP5 and KENO-VI results. For the 1.27-cm-thick configuration the MCNP eigenvalue was approximately 300 pcm greater. The calculated KENO eigenvalue was about 300 pcm greater for the 7.62-cm-thick configuration. The calculated results were approximately the same for a 5-cm-thick shell. The eigenvalues determined using the Feynman method are up to approximately 2.5% lower than those determined using either the CSDNA method or the Monte Carlo codes. The uncertainty in the results from either method was not large enough to account for the bias between the two experimental methods. An ongoing investigation is being performed to assess what potential uncertainties and/or biases exist that have yet to be properly accounted for. The dominant uncertainty in the CSDNA analysis was the uncertainty in selecting a neutron cross-section library for performing the analysis of the data. The uncertainty in the

  6. The Department of Energy Nuclear Criticality Safety Program

    Science.gov (United States)

    Felty, James R.

    2005-05-01

    This paper broadly covers key events and activities from which the Department of Energy Nuclear Criticality Safety Program (NCSP) evolved. The NCSP maintains fundamental infrastructure that supports operational criticality safety programs. This infrastructure includes continued development and maintenance of key calculational tools, differential and integral data measurements, benchmark compilation, development of training resources, hands-on training, and web-based systems to enhance information preservation and dissemination. The NCSP was initiated in response to Defense Nuclear Facilities Safety Board Recommendation 97-2, Criticality Safety, and evolved from a predecessor program, the Nuclear Criticality Predictability Program, that was initiated in response to Defense Nuclear Facilities Safety Board Recommendation 93-2, The Need for Critical Experiment Capability. This paper also discusses the role Dr. Sol Pearlstein played in helping the Department of Energy lay the foundation for a robust and enduring criticality safety infrastructure.

  7. Perceptual hashing algorithms benchmark suite

    Institute of Scientific and Technical Information of China (English)

    Zhang Hui; Schmucker Martin; Niu Xiamu

    2007-01-01

    Numerous perceptual hashing algorithms have been developed for identification and verification of multimedia objects in recent years. Many application schemes have been adopted for various commercial objects. Developers and users are looking for a benchmark tool to compare and evaluate their current algorithms or technologies. In this paper, a novel benchmark platform, PHABS, is presented. PHABS provides an open framework and lets its users define their own test strategy, perform tests, and collect and analyze test data. With PHABS, various performance parameters of algorithms can be tested, and different algorithms, or algorithms with different parameters, can be evaluated and compared easily.

  8. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.;

    2013-01-01

    and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work...... already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity. © IWA Publishing 2013....

  9. The contextual benchmark method: benchmarking e-government services

    NARCIS (Netherlands)

    Jansen, Jurjen; Vries, de Sjoerd; Schaik, van Paul

    2010-01-01

    This paper offers a new method for benchmarking e-Government services. Government organizations no longer doubt the need to deliver their services on line. Instead, the question that is more relevant is how well the electronic services offered by a particular organization perform in comparison with

  10. Catalog and history of the experiments of criticality Saclay (1958-1964) Valduc / Building 10 (1964-2003); Catalogue et historique des experiences de criticite Saclay (1958 - 1964) Valduc / Batiment 10 (1964-2003)

    Energy Technology Data Exchange (ETDEWEB)

    Poullot, G.; Dumont, V.; Anno, J.; Cousinou, P. [Institut de Radioprotection et de Surete Nucleaire (IRSN), 92 - Fontenay aux Roses (France); Grivot, P.; Girault, E.; Fouillaud, P.; Barbry, F. [CEA Valduc, 21 - Is-sur-Tille (France)

    2003-07-01

    The International Criticality Safety Benchmark Evaluation Project (ICSBEP) aims to supply the international community with criticality benchmark experiments of certified quality, used to guarantee the qualification of criticality calculation codes. The project has defined a structure for classifying experiments, a standard presentation format, and a working procedure comprising evaluation, internal and external checks, and presentation in plenary session. After a favourable opinion from the working group, the synthesis document, called an evaluation, is integrated into the general ICSBEP report. (N.C.)

  11. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Pevey, Ronald E.

    2005-09-15

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
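
    The interplay of bias, uncertainty, and margin described above can be made concrete. The following sketch (all numbers hypothetical) derives a USL from typical benchmarking outcomes and applies the comparison:

```python
# Minimal sketch (hypothetical numbers) of the quantities named above: derive
# an upper subcritical limit (USL) from benchmarking outcomes, then test an
# application calculation with n standard deviations added to keff.

k_bench_mean = 0.998                 # mean calculated keff over critical benchmarks
bias = min(k_bench_mean - 1.0, 0.0)  # calculational bias; positive bias not credited
sigma_bias = 0.004                   # standard deviation of the bias
margin = 0.05                        # administrative margin of subcriticality
USL = 1.0 + bias - 2.0 * sigma_bias - margin

k_calc, sigma_calc = 0.920, 0.002    # application result and its Monte Carlo sigma
n = 2                                # standard deviations added before comparison
print(f"USL = {USL:.4f}, acceptable: {k_calc + n * sigma_calc <= USL}")
```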

  12. The European Union benchmarking experience. From euphoria to fatigue?

    Directory of Open Access Journals (Sweden)

    Michael Zängle

    2004-06-01

    Full Text Available Even if one may agree with the possible criticism of the Lisbon process as being too vague in commitment or as lacking appropriate statistical techniques and indicators, the benchmarking system provided by EUROSTAT seems to be sufficiently effective in warning against imminent failure. The Lisbon objectives are very demanding. This holds true even if each of the objectives is looked at in isolation. But 'Lisbon' is more demanding than that, requiring a combination of several objectives to be achieved simultaneously (GDP growth, labour productivity, job-content of growth, higher quality of jobs and greater social cohesion). Even to countries like Ireland, showing exceptionally high performance in GDP growth and employment promotion during the period under investigation, achieving potentially conflicting objectives simultaneously seems to be beyond feasibility. The European Union benchmarking exercise is embedded in the context of the Open Method(s) of Co-ordination (OMC). This context makes the benchmarking approach part and parcel of an overarching philosophy, which relates the benchmarking indicators to each other and assigns to them their role in corroborating the increasingly dominating project of the 'embedded neo-liberalism'. Against this background, the present paper is focussed on the following point. With the EU benchmarking system being effective enough to make the imminent under-achievement visible, there is a danger of disillusionment and 'benchmarking fatigue', which may provoke an ideological crisis. The dominant project being so deeply rooted, however, chances are high that this crisis will be solved immanently in terms of embedded neo-liberalism by strengthening the neo-liberal branch of the European project. Confining itself to the Europe of Fifteen, the analysis draws on EUROSTAT's database of Structural Indicators. ...

  13. Effects of exposure imprecision on estimation of the benchmark dose

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe

    2004-01-01

    The benchmark dose approach is one of the most widely used methods for development of exposure limits. An important advantage of this approach is that it can be applied to observational data. However, in this type of data, exposure markers are seldom measured without error. It is shown that, if the exposure error is ignored, then the benchmark approach produces results that are biased toward higher and less protective levels. It is therefore important to take exposure measurement error into account when calculating benchmark doses. Methods that allow this adjustment are described and illustrated with data from an epidemiological study...
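
    As a reminder of the underlying calculation, the sketch below computes a benchmark dose from an assumed logistic dose-response model, solving for the dose that yields a 10% extra risk over background; the model form and coefficients are invented, not taken from the study:

```python
# Minimal illustration of the benchmark dose (BMD) idea: given a fitted
# dose-response model, the BMD is the dose producing a chosen benchmark
# response (BMR) above background. Model and numbers are hypothetical.
import math

def risk(d, beta0=-3.0, beta1=0.8):
    # logistic dose-response: P(effect | dose d)
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * d)))

bmr = 0.10                        # 10% extra risk over background
p0 = risk(0.0)                    # background risk at zero dose
target = p0 + bmr * (1 - p0)      # extra-risk definition of the target response

# bisection for the dose where the modeled risk reaches the target
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if risk(mid) < target else (lo, mid)
print(f"BMD ~ {0.5 * (lo + hi):.3f} (dose units)")
```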

  14. Simple mathematical law benchmarks human confrontations

    Science.gov (United States)

    Johnson, Neil F.; Medina, Pablo; Zhao, Guannan; Messinger, Daniel S.; Horgan, John; Gill, Paul; Bohorquez, Juan Camilo; Mattson, Whitney; Gangi, Devon; Qi, Hong; Manrique, Pedro; Velasquez, Nicolas; Morgenstern, Ana; Restrepo, Elvira; Johnson, Nicholas; Spagat, Michael; Zarama, Roberto

    2013-12-01

    Many high-profile societal problems involve an individual or group repeatedly attacking another - from child-parent disputes, sexual violence against women, civil unrest, violent conflicts and acts of terror, to current cyber-attacks on national infrastructure and ultrafast cyber-trades attacking stockholders. There is an urgent need to quantify the likely severity and timing of such future acts, shed light on likely perpetrators, and identify intervention strategies. Here we present a combined analysis of multiple datasets across all these domains which account for >100,000 events, and show that a simple mathematical law can benchmark them all. We derive this benchmark and interpret it, using a minimal mechanistic model grounded by state-of-the-art fieldwork. Our findings provide quantitative predictions concerning future attacks; a tool to help detect common perpetrators and abnormal behaviors; insight into the trajectory of a 'lone wolf'; identification of a critical threshold for spreading a message or idea among perpetrators; an intervention strategy to erode the most lethal clusters; and more broadly, a quantitative starting point for cross-disciplinary theorizing about human aggression at the individual and group level, in both real and online worlds.
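
    Benchmarks of this kind typically involve fitting a power-law exponent to event data. The sketch below applies the standard maximum-likelihood estimator (continuous-data approximation) to synthetic event severities; it illustrates a common ingredient of such analyses rather than the paper's exact law:

```python
# Sketch of fitting a power-law severity distribution P(s) ~ s**(-alpha) to
# event data with the standard maximum-likelihood estimator. Event sizes and
# the cutoff are synthetic stand-ins, not the study's datasets.
import math

events = [3, 7, 2, 15, 4, 28, 2, 6, 90, 5, 11, 3]  # severities, hypothetical
s_min = 2                                           # lower cutoff

tail = [s for s in events if s >= s_min]
alpha = 1.0 + len(tail) / sum(math.log(s / s_min) for s in tail)
print(f"estimated exponent alpha ~ {alpha:.2f}")
```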

  15. Toxicological benchmarks for screening potential contaminants of concern for effects on aquatic biota: 1994 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W. II [Oak Ridge National Lab., TN (United States); Mabrey, J.B. [University of West Florida, Pensacola, FL (United States)

    1994-07-01

    This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. The alternative benchmarks are based on different conceptual approaches to estimating concentrations causing significant effects. For the upper screening benchmark, there are the acute National Ambient Water Quality Criteria (NAWQC) and the Secondary Acute Values (SAV). The SAV concentrations are values estimated with 80% confidence not to exceed the unknown acute NAWQC for those chemicals with no NAWQC. The alternative chronic benchmarks are the chronic NAWQC, the Secondary Chronic Value (SCV), the lowest chronic values for fish and daphnids from chronic toxicity tests, the estimated EC20 for a sensitive species, and the concentration estimated to cause a 20% reduction in the recruit abundance of largemouth bass. It is recommended that ambient chemical concentrations be compared to all of these benchmarks. If NAWQC are exceeded, the chemicals must be contaminants of concern because the NAWQC are applicable or relevant and appropriate requirements (ARARs). If NAWQC are not exceeded, but other benchmarks are, contaminants should be selected on the basis of the number of benchmarks exceeded and the conservatism of the particular benchmark values, as discussed in the text. To the extent that toxicity data are available, this report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility.
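
    The screening logic described above reduces to counting exceedances across the benchmark set. A minimal sketch with placeholder benchmark values (not those tabulated in the report):

```python
# Minimal sketch of the screening comparison: test an ambient concentration
# against the full set of alternative benchmarks and report the exceedances.
# All benchmark values below are hypothetical placeholders.

benchmarks = {  # ug/L, hypothetical
    "acute NAWQC": 120.0, "chronic NAWQC": 14.0, "SAV": 95.0,
    "SCV": 9.0, "fish chronic value": 20.0, "daphnid chronic value": 11.0,
}

ambient = 16.0  # measured concentration, ug/L, hypothetical
exceeded = [name for name, value in benchmarks.items() if ambient > value]
print(f"exceeds {len(exceeded)} of {len(benchmarks)} benchmarks: {exceeded}")
```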

  16. Benchmarking Universiteitsvastgoed: Managementinformatie bij vastgoedbeslissingen

    NARCIS (Netherlands)

    Den Heijer, A.C.; De Vries, J.C.

    2004-01-01

    Before you lies the final report of the study "Benchmarking universiteitsvastgoed" (benchmarking university real estate). This report merges two partial products: the theory report (published in December 2003) and the practice report (published in January 2004). Topics in the theory part are the analysis of other

  17. Benchmarked Library Websites Comparative Study

    KAUST Repository

    Ramli, Rindra M.

    2015-01-01

    This presentation provides an analysis of the services provided by the benchmarked library websites. The exploratory study includes a comparison of these websites against a list of criteria and presents a list of services that are most commonly deployed by the selected websites. In addition, the investigators proposed a list of services that could be provided via the KAUST library website.

  18. 42 CFR 440.385 - Delivery of benchmark and benchmark-equivalent coverage through managed care entities.

    Science.gov (United States)

    2010-10-01

    Title 42 (Public Health), Vol. 4, 2010-10-01: GENERAL PROVISIONS, Benchmark Benefit and Benchmark-Equivalent Coverage. § 440.385 Delivery of benchmark and benchmark-equivalent coverage through managed care entities. In implementing benchmark or...

  19. Sustainable value assessment of farms using frontier efficiency benchmarks.

    Science.gov (United States)

    Van Passel, Steven; Van Huylenbroeck, Guido; Lauwers, Ludwig; Mathijs, Erik

    2009-07-01

    Appropriate assessment of firm sustainability facilitates actor-driven processes towards sustainable development. The methodology in this paper builds further on two proven methodologies for the assessment of sustainability performance: it combines the sustainable value approach with frontier efficiency benchmarks. The sustainable value methodology tries to relate firm performance to the use of different resources. This approach assesses contributions to corporate sustainability by comparing firm resource productivity with the resource productivity of a benchmark, and this for all resources considered. The efficiency is calculated by estimating the production frontier indicating the maximum feasible production possibilities. In this research, the sustainable value approach is combined with efficiency analysis methods to benchmark sustainability assessment. In this way, the production theoretical underpinnings of efficiency analysis enrich the sustainable value approach. The methodology is presented using two different functional forms: the Cobb-Douglas and the translog functional forms. The simplicity of the Cobb-Douglas functional form as benchmark is very attractive but it lacks flexibility. The translog functional form is more flexible but has the disadvantage that it requires a lot of data to avoid estimation problems. Using frontier methods for deriving firm specific benchmarks has the advantage that the particular situation of each company is taken into account when assessing sustainability. Finally, we showed that the methodology can be used as an integrative sustainability assessment tool for policy measures.
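
    As a toy version of the benchmarking step, the sketch below fits a Cobb-Douglas function by ordinary least squares and compares each farm's output with its benchmark value. All data are invented, numpy is assumed available, and a real frontier would be estimated with stochastic frontier analysis or DEA rather than plain OLS:

```python
# Toy Cobb-Douglas benchmark: fit ln(y) = b0 + sum_i b_i * ln(x_i) by least
# squares and use the fitted function as each farm's productivity benchmark.
import numpy as np

# columns: land (ha), labour (h), capital (eur); rows: farms -- invented data
X = np.array([[40, 3000, 9e4], [65, 4100, 1.5e5], [30, 2500, 7e4],
              [90, 5200, 2.2e5], [55, 3600, 1.2e5], [75, 4800, 1.8e5]])
y = np.array([180, 300, 120, 420, 260, 350])  # output, invented units

A = np.column_stack([np.ones(len(y)), np.log(X)])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)

benchmark_output = np.exp(A @ coef)   # benchmark (expected) output per farm
ratio = y / benchmark_output          # >1: the farm outperforms the benchmark
print(np.round(ratio, 3))
```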

  20. Calculation of benchmarks with a shear beam model

    NARCIS (Netherlands)

    Ferreira, D.

    2015-01-01

    Fiber models for beam and shell elements allow for relatively rapid finite element analysis of concrete structures and structural elements. This project aims at the development of the formulation of such elements and a pilot implementation. Standard nonlinear fiber beam formulations do not account

  1. Benchmarking clinical photography services in the NHS.

    Science.gov (United States)

    Arbon, Giles

    2015-01-01

    Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.

  2. Numerical Calculation of the Critical Water Head of Linings in High-Water-Level Tunnels

    Institute of Scientific and Technical Information of China (English)

    郑波; 吴剑; 王建宇

    2015-01-01

    Two typical tunnel sections are chosen for analyzing the critical water pressure of high-water-head tunnels by the numerical calculation method. The calculation results show that: (1) for a double-line tunnel with a design speed of 250 km/h, with lining thicknesses of 30 cm and 50 cm, a plain concrete lining cannot bear any water pressure load, while a C30 reinforced concrete lining can bear water head loads of 11.2 m and 34.2 m, respectively; (2) for a single-track railway tunnel with a design speed of 200 km/h, with lining thicknesses of 30 cm, 50 cm and 60 cm, a plain concrete lining can bear water heads of 0 m, 1.0 m and 5.5 m, and a C30 reinforced concrete lining 26.0 m, 55.2 m and 73.5 m, respectively; (3) the tunnel shape has a great influence on the critical water head: a single-track railway tunnel can bear a greater critical water pressure than a double-track tunnel; (4) for high-water-head tunnels, following the new concept of water control in tunnels ("sealing first, limited discharge"), the amount of water inflow should be limited so as to reduce the water pressure on the lining and make the design both safe and economical.

  3. The LDBC Social Network Benchmark: Interactive Workload

    NARCIS (Netherlands)

    Erling, O.; Averbuch, A.; Larriba-Pey, J.; Chafi, H.; Gubichev, A.; Prat, A.; Pham, M.D.; Boncz, P.A.

    2015-01-01

    The Linked Data Benchmark Council (LDBC) is now two years underway and has gathered strong industrial participation for its mission to establish benchmarks, and benchmarking practices for evaluating graph data management systems. The LDBC introduced a new choke-point driven methodology for developing

  4. How Benchmarking and Higher Education Came Together

    Science.gov (United States)

    Levy, Gary D.; Ronco, Sharron L.

    2012-01-01

    This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…

  5. Benchmarking: Achieving the best in class

    Energy Technology Data Exchange (ETDEWEB)

    Kaemmerer, L

    1996-05-01

    Oftentimes, people find the process of organizational benchmarking an onerous task, or, because they do not fully understand the nature of the process, end up with results that are less than stellar. This paper presents the challenges of benchmarking and reasons why benchmarking can benefit an organization in today's economy.

  6. Evaluation of Saxton critical experiments

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Hyung Kook; Noh, Jae Man; Jung, Hyung Guk; Kim, Young Il; Kim, Young Jin [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    As a part of the International Criticality Safety Benchmark Evaluation Project (ICSBEP), the SAXTON critical experiments were reevaluated. The effects on k{sub eff} of the uncertainties in experiment parameters, fuel rod characterization, soluble boron, critical water level, core structure, {sup 241}Am and {sup 241}Pu isotope number densities, random pitch error, duplicated experiments, axial fuel position, model simplification, etc., were evaluated and added to the benchmark-model k{sub eff}. In addition to the detailed model, a simplified model for the Saxton critical experiments was constructed by omitting the top, middle, and bottom grids and ignoring the fuel above water. 6 refs., 1 fig., 3 tabs. (Author)

  7. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection.
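
    One of the benchmarked combinations, forward selection wrapped around a tree ensemble, can be sketched with scikit-learn on synthetic data standing in for a QSAR descriptor set (scikit-learn is assumed available; the dataset sizes are arbitrary):

```python
# Sketch of forward variable selection around a random forest, in the spirit
# of one benchmarked method above, on synthetic stand-in data.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SequentialFeatureSelector

# synthetic stand-in for a QSAR dataset: 200 compounds, 20 descriptors
X, y = make_regression(n_samples=200, n_features=20, n_informative=5, random_state=0)

rf = RandomForestRegressor(n_estimators=50, random_state=0)
selector = SequentialFeatureSelector(
    rf, n_features_to_select=5, direction="forward", cv=3)
selector.fit(X, y)
print("selected descriptor indices:", list(selector.get_support().nonzero()[0]))
```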

  8. Methodology for Benchmarking IPsec Gateways

    Directory of Open Access Journals (Sweden)

    Adam Tisovský

    2012-08-01

    Full Text Available The paper analyses the forwarding performance of an IPsec gateway over the range of offered loads. It focuses on the forwarding rate and packet loss, particularly at the gateway's performance peak and in the state of gateway overload. It explains the possible performance degradation when the gateway is overloaded by excessive offered load. The paper further evaluates different approaches for obtaining forwarding performance parameters: the widely used throughput described in RFC 1242, the maximum forwarding rate with zero packet loss, and our proposed equilibrium throughput. According to our observations, equilibrium throughput might be the most universal parameter for benchmarking security gateways, as the others may depend on the duration of the test trials. Employing equilibrium throughput would also greatly shorten the time required for benchmarking. Lastly, the paper presents a methodology and a hybrid step/binary search algorithm for obtaining the value of equilibrium throughput.
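
    A hybrid step/binary search of this kind can be sketched as follows. Here measure_loss is a placeholder for a real trial against the device under test, the capacity figure is invented, and the sketch finds the zero-loss boundary rather than reproducing the paper's exact equilibrium-throughput procedure:

```python
# Sketch of a step/binary search for a gateway's zero-loss forwarding rate.
def measure_loss(offered_mbps: float) -> float:
    """Placeholder: fraction of packets lost at a given offered load."""
    capacity = 740.0  # hypothetical performance peak, Mbit/s
    return max(0.0, (offered_mbps - capacity) / offered_mbps)

def search_throughput(lo=0.0, hi=10_000.0, step=1000.0, tol=1.0):
    # coarse step phase: walk up in large steps until loss appears
    rate = lo
    while rate + step <= hi and measure_loss(rate + step) == 0.0:
        rate += step
    lo, hi = rate, rate + step
    # binary phase: narrow down the zero-loss boundary
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if measure_loss(mid) == 0.0 else (lo, mid)
    return lo

print(f"zero-loss rate ~ {search_throughput():.0f} Mbit/s")
```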

  9. Development of solutions to benchmark piping problems

    Energy Technology Data Exchange (ETDEWEB)

    Reich, M; Chang, T Y; Prachuktam, S; Hartzman, M

    1977-12-01

    Benchmark problems and their solutions are presented. The problems consist of calculating the static and dynamic response of selected piping structures subjected to a variety of loading conditions. The structures range from simple pipe geometries to a representative full-scale primary nuclear piping system, which includes the various components and their supports. These structures are assumed to behave in a linear elastic fashion only, i.e., they experience small deformations and small displacements with no existing gaps, and remain elastic through their entire response. The solutions were obtained by using the program EPIPE, which is a modification of the widely available program SAP IV. A brief outline of the theoretical background of this program and its verification is also included.

  10. A Benchmark for Management Effectiveness

    OpenAIRE

    Zimmermann, Bill; Chanaron, Jean-Jacques; Klieb, Leslie

    2007-01-01

    International audience; This study presents a tool to gauge managerial effectiveness in the form of a questionnaire that is easy to administer and score. The instrument covers eight distinct areas of organisational climate and culture of management inside a company or department. Benchmark scores were determined by administering sample-surveys to a wide cross-section of individuals from numerous firms in Southeast Louisiana, USA. Scores remained relatively constant over a seven-year timeframe...

  11. Restaurant Energy Use Benchmarking Guideline

    Energy Technology Data Exchange (ETDEWEB)

    Hedrick, R.; Smith, V.; Field, K.

    2011-07-01

    A significant operational challenge for food service operators is defining energy use benchmark metrics to compare against the performance of individual stores. Without metrics, multiunit operators and managers have difficulty identifying which stores in their portfolios require extra attention to bring their energy performance in line with expectations. This report presents a method whereby multiunit operators may use their own utility data to create suitable metrics for evaluating their operations.
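
    A minimal version of the suggested metric-building follows, using a chain's own (here invented) utility data to flag outlier stores by energy use intensity; the threshold and figures are illustrative assumptions:

```python
# Sketch of building a benchmark metric from a chain's own utility data:
# normalize each store's annual energy use by floor area and flag stores
# well above the portfolio median. All figures are invented.
stores = {  # store: (annual kWh, floor area m^2)
    "A": (310_000, 250), "B": (280_000, 240), "C": (455_000, 260), "D": (300_000, 255),
}

eui = {s: kwh / area for s, (kwh, area) in stores.items()}  # energy use intensity
median = sorted(eui.values())[len(eui) // 2]
for s, v in sorted(eui.items()):
    flag = "  <-- review" if v > 1.25 * median else ""
    print(f"store {s}: {v:,.0f} kWh/m^2{flag}")
```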

  12. Thermal Performance Benchmarking: Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Moreno, Gilbert

    2016-04-08

    The goal for this project is to thoroughly characterize the performance of state-of-the-art (SOA) automotive power electronics and electric motor thermal management systems. Information obtained from these studies will be used to: Evaluate advantages and disadvantages of different thermal management strategies; establish baseline metrics for the thermal management systems; identify methods of improvement to advance the SOA; increase the publicly available information related to automotive traction-drive thermal management systems; help guide future electric drive technologies (EDT) research and development (R&D) efforts. The performance results combined with component efficiency and heat generation information obtained by Oak Ridge National Laboratory (ORNL) may then be used to determine the operating temperatures for the EDT components under drive-cycle conditions. In FY15, the 2012 Nissan LEAF power electronics and electric motor thermal management systems were benchmarked. Testing of the 2014 Honda Accord Hybrid power electronics thermal management system started in FY15; however, due to time constraints it was not possible to include results for this system in this report. The focus of this project is to benchmark the thermal aspects of the systems. ORNL's benchmarking reports on electric and hybrid electric vehicle technologies provide detailed descriptions of the electrical and packaging aspects of these automotive systems.

  13. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM Cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as its operating system, and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  14. HS06 Benchmark for an ARM Server

    CERN Document Server

    Kluth, Stefan

    2013-01-01

    We benchmarked an ARM Cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as its operating system, and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  15. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 5 - North Anna Unit 1 Cycle 5

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1993-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor (AFR) criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark AFR criticality analysis methods using selected critical configurations from commercial pressurized-water reactors (PWRs). The analysis methodology selected for all calculations reported herein was the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and to provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k_eff) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of one reactor critical configuration for North Anna Unit 1 Cycle 5. This unit and cycle were chosen for a previous analysis using a different methodology because detailed isotopics from multidimensional reactor calculations were available from the Virginia Power Company. These data permitted comparison of criticality calculations directly using the utility-calculated isotopics to those using the isotopics generated by the SCALE-4 SAS2H

  16. Argonne Code Center: Benchmark problem book.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1977-06-01

    This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. This is the second supplement of the original benchmark book which was first published in February, 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, which was published in December, 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement. Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February, 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below followed by the contributors to the earlier editions of the benchmark book.

  17. Benchmarks

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  18. Critical experiments analyses by using 70 energy group library based on ENDF/B-VI

    Energy Technology Data Exchange (ETDEWEB)

    Tahara, Yoshihisa; Matsumoto, Hideki [Mitsubishi Heavy Industries Ltd., Yokohama (Japan). Nuclear Energy Systems Engineering Center; Huria, H.C.; Ouisloumen, M.

    1998-03-01

    The newly developed 70-group library has been validated by comparing kinf values from the continuous-energy Monte Carlo code MCNP with those from the two-dimensional spectrum calculation code PHOENIX-CP, which employs the Discrete Angular Flux Method based on collision probability. The library has also been validated against a large number of critical experiments and numerical benchmarks for assemblies with MOX and Gd fuels. (author)

  19. Accumulation of nitrogen - a critical parameter for the calculation of load limits from nitrogen in forests; Akkumulering av nitrogen - en kritisk parameter for beregning av taalegrenser for nitrogen i skog

    Energy Technology Data Exchange (ETDEWEB)

    Sogn, T.A.; Stuanes, A.O.; Abrahamsen, G.

    1996-01-01

    The conference paper deals with the accumulation of nitrogen in forests in Norway. The level of accumulation is a critical factor for the calculation of load limits. The paper compares the average rate of accumulation since the last glacial age with values calculated for shorter periods, based on data from the surveying programs of the State Pollution Control Authority, manuring experiments, and other relevant research programs in this field. 8 refs., 1 fig., 1 tab.

  20. An integrated data envelopment analysis-artificial neural network approach for benchmarking of bank branches

    Science.gov (United States)

    Shokrollahpour, Elsa; Hosseinzadeh Lotfi, Farhad; Zandieh, Mostafa

    2016-02-01

    Efficiency and quality of services are crucial to today's banking industry, where competition has become increasingly intense as a result of rapid improvements in technology. Performance analysis of the banking sector therefore attracts growing attention. Although data envelopment analysis (DEA) is a pioneering approach in the literature as an efficiency measurement and benchmark-finding tool, it is unable to point to possible future benchmarks; the benchmarks it provides may still be less efficient than more advanced future ones. To cover this weakness, an artificial neural network is integrated with DEA in this paper to calculate the relative efficiency and more reliable benchmarks of the branches of an Iranian commercial bank. Each branch could thus adopt a strategy to improve its efficiency and eliminate the causes of inefficiency based on a 5-year forecast.
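
    As a hedged sketch of the DEA half of such an approach only (the neural-network forecasting stage is omitted, and the branch input/output data are invented for illustration), an input-oriented CCR efficiency score can be computed with a small linear program:

```python
# Sketch: input-oriented CCR DEA efficiency scores via linear programming.
# Branch data are hypothetical; the ANN stage of the integrated approach
# described above is not reproduced here.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20., 300.], [30., 200.], [40., 100.], [20., 200.]])  # inputs per branch
Y = np.array([[1000.], [800.], [900.], [500.]])                     # outputs per branch
n = X.shape[0]

def ccr_efficiency(k):
    """Solve: min theta  s.t.  sum_j lam_j x_j <= theta x_k,  sum_j lam_j y_j >= y_k."""
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam_1..lam_n]
    A_in = np.c_[-X[k], X.T]                    # inputs:  X^T lam - theta x_k <= 0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # outputs: -Y^T lam <= -y_k
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]                             # theta = 1.0 means efficient

for k in range(n):
    print(f"branch {k}: efficiency = {ccr_efficiency(k):.3f}")
```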

  1. Benchmarking NMR experiments: a relational database of protein pulse sequences.

    Science.gov (United States)

    Senthamarai, Russell R P; Kuprov, Ilya; Pervushin, Konstantin

    2010-03-01

    Systematic benchmarking of multi-dimensional protein NMR experiments is a critical prerequisite for optimal allocation of NMR resources for structural analysis of challenging proteins, e.g. large proteins with limited solubility or proteins prone to aggregation. We propose a set of benchmarking parameters for essential protein NMR experiments organized into a lightweight (single XML file) relational database (RDB), which includes all the necessary auxiliaries (waveforms, decoupling sequences, calibration tables, setup algorithms and an RDB management system). The database is interfaced to the Spinach library (http://spindynamics.org), which enables accurate simulation and benchmarking of NMR experiments on large spin systems. A key feature is the ability to use a single user-specified spin system to simulate the majority of deposited solution-state NMR experiments, thus providing the (hitherto unavailable) unified framework for pulse sequence evaluation. This development enables predicting the relative sensitivity of deposited implementations of NMR experiments, thus providing a basis for comparison, optimization and, eventually, automation of NMR analysis. The benchmarking is demonstrated with two proteins: the 170-amino-acid I domain of alphaXbeta2 integrin and the 440-amino-acid NS3 helicase.

  2. PageRank Pipeline Benchmark: Proposal for a Holistic System Benchmark for Big-Data Platforms

    CERN Document Server

    Dreher, Patrick; Hill, Chris; Gadepally, Vijay; Kuszmaul, Bradley; Kepner, Jeremy

    2016-01-01

    The rise of big data systems has created a need for benchmarks to measure and compare the capabilities of these systems. Big data benchmarks present unique scalability challenges. The supercomputing community has wrestled with these challenges for decades and developed methodologies for creating rigorous scalable benchmarks (e.g., HPC Challenge). The proposed PageRank pipeline benchmark employs supercomputing benchmarking methodologies to create a scalable benchmark that is reflective of many real-world big data processing systems. The PageRank pipeline benchmark builds on existing prior scalable benchmarks (Graph500, Sort, and PageRank) to create a holistic benchmark with multiple integrated kernels that can be run together or independently. Each kernel is well defined mathematically and can be implemented in any programming environment. The linear algebraic nature of PageRank makes it well suited to being implemented using the GraphBLAS standard. The computations are simple enough that performance predictio...
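
    Since the abstract notes that each kernel is well defined mathematically, a toy version of the PageRank kernel is easy to state. The dense power iteration below is only a sketch of the kernel's definition (real pipeline implementations operate on large sparse matrices, e.g. through the GraphBLAS standard mentioned above); the damping value and toy graph are arbitrary choices.

```python
# Sketch: the PageRank kernel as a dense power iteration on a toy graph.
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10, max_iter=200):
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    # Column-stochastic transition matrix; dangling nodes spread uniformly.
    P = np.where(out_deg[:, None] > 0,
                 adj / np.maximum(out_deg[:, None], 1), 1.0 / n).T
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_next = damping * P @ r + (1 - damping) / n
        if np.abs(r_next - r).sum() < tol:
            break
        r = r_next
    return r

adj = np.array([[0, 1, 1], [0, 0, 1], [1, 0, 0]], dtype=float)  # toy 3-node graph
print(pagerank(adj))
```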

  3. Bias and Uncertainty of Critical Experiment Models with CSAS25 from SCALE4.4a for Criticality Safety Analyses On the HP J-5600 (CMODB) Workstation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.H.; Keener, H.J.; DeClue, J.F.; Krass, A.W.; Cain, V.R.

    2001-02-01

    This report documents the establishment of bias, bias trends and uncertainty for validation of the CSAS25 control module from the SCALE 4.4a computer code system for use in evaluating the criticality safety of uranium systems. The 27-group ENDF/B-IV, 44-group ENDF/B-V, and 238-group ENDF/B-V cross-section libraries were used. The criticality validation calculations were performed using over 500 benchmark cases from Volumes II and IV of the "International Handbook of Evaluated Criticality Safety Benchmark Experiments," published by the Nuclear Energy Agency Organization for Economic Cooperation and Development (NEA/OECD). Based on statistical analysis of the calculation results, the bias, bias trends and uncertainty of the benchmark calculations have been established for these benchmark experiments. Numerical methods for applying margins are briefly described, but the determination of an appropriate correlating parameter and values for additional margin, applicable to a particular analysis, must be made as part of process analysis. As such, this document does not specify upper subcritical limits as has been done in the past. A follow-on report will be written to assess the methods for determination of an upper safety limit in more detail, provide comparisons, and recommend a preferred method. Analysts using these results are responsible for exercising sound engineering judgment, using strong technical arguments, to develop a margin in k_eff or another correlating parameter that is sufficiently large to ensure that conditions (calculated by this method to be subcritical by this margin) will actually be subcritical. Documentation of the determination and justification of the appropriate margin in the analyst's evaluation, in conjunction with this report, will constitute the complete Validation Report in accordance with ANSI/ANS-8.1-1998, Section 4.3.6(4).
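
    As a deliberately simplified illustration of the kind of statistics involved (not the report's actual procedure), the sketch below computes a bias, its uncertainty, and a crude limit from a handful of synthetic benchmark k_eff results; the single-sided tolerance factor is a placeholder for a value a real analysis would take from statistical tables.

```python
# Sketch: bias and uncertainty from a set of benchmark calculations.
# keff values are synthetic; the tolerance factor (2.0) is a placeholder.
import numpy as np

calc_keff = np.array([0.9981, 1.0012, 0.9975, 1.0003, 0.9968, 0.9990])
exp_keff  = np.array([1.0000, 1.0000, 1.0000, 1.0000, 1.0000, 1.0000])

diff = calc_keff - exp_keff
bias = diff.mean()                 # positive bias is conventionally not credited
sigma = diff.std(ddof=1)           # scatter about the bias

tolerance_factor = 2.0             # placeholder single-sided factor
limit = 1.0 + min(bias, 0.0) - tolerance_factor * sigma
print(f"bias = {bias:+.4f}, sigma = {sigma:.4f}, limit (before margin) = {limit:.4f}")
```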

  4. NASA Software Engineering Benchmarking Effort

    Science.gov (United States)

    Godfrey, Sally; Rarick, Heather

    2012-01-01

    Benchmarking was very interesting and provided a wealth of information: (1) we saw potential solutions to some of our "top 10" issues; (2) we gained an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes; (2) many of the organizations were interested in future collaboration, such as sharing training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/partners: (1) desires to participate in our training and provide feedback on procedures; (2) a welcomed opportunity to provide feedback on working with NASA.

  5. Application of the PSI-NUSS Tool for the Estimation of Nuclear Data Related keff Uncertainties for the OECD/NEA WPNCS UACSA Phase I Benchmark

    Science.gov (United States)

    Zhu, T.; Vasiliev, A.; Ferroukhi, H.; Pautz, A.

    2014-04-01

    At the Paul Scherrer Institute (PSI), a methodology called PSI-NUSS is under development for the propagation of nuclear data uncertainties into Criticality Safety Evaluation (CSE) with the Monte Carlo code MCNPX. Its primary purpose is to provide a complementary option for the assessment of nuclear-data-related uncertainty, versus the traditional approach that estimates biases/uncertainties from validation studies against representative critical benchmark experiments. In the present paper, the PSI-NUSS methodology is applied to quantify nuclear data uncertainties for the OECD/NEA UACSA Exercise Phase I benchmark. One underlying reason is that PSI's CSE methodology developed so far, and previously applied to this benchmark, relied on a more conventional approach involving engineering estimates of the uncertainty in the calculated effective multiplication factor (keff). Because the PSI-NUSS methodology aims precisely at a more rigorous treatment of nuclear data uncertainties in CSE, it is applied here to the UACSA: the nuclear-data-related uncertainty component is estimated and compared to results obtained by other participants using different codes/libraries and methodologies.
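
    A minimal sketch of random-sampling uncertainty propagation of this general kind is given below; the covariance matrix, sensitivity vector, and the linear keff surrogate standing in for actual MCNPX runs are all invented for illustration and are not PSI-NUSS itself.

```python
# Sketch: random-sampling propagation of nuclear data uncertainty into keff.
# Covariance, sensitivities, and the linear surrogate model are placeholders.
import numpy as np

rng = np.random.default_rng(42)

cov = np.array([[4.0e-4, 1.0e-4, 0.0],
                [1.0e-4, 9.0e-4, 0.0],
                [0.0,    0.0,    1.0e-4]])   # toy relative covariance
sens = np.array([0.30, -0.10, 0.05])         # d(keff)/keff per unit relative change

def keff_model(rel_perturbation):
    """Stand-in for a transport calculation with perturbed nuclear data."""
    return 1.0000 * (1.0 + sens @ rel_perturbation)

samples = rng.multivariate_normal(np.zeros(3), cov, size=500)
keffs = np.array([keff_model(p) for p in samples])
print(f"keff = {keffs.mean():.5f} +/- {keffs.std(ddof=1):.5f} (nuclear data)")
```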

  6. Fast neutron flux calculation benchmark analysis of PWR pressure vessel based on 3D MC-SN coupled method

    Institute of Scientific and Technical Information of China (English)

    韩静茹; 陈义学; 石生春; 袁龙军; 陆道纲

    2012-01-01

    The Monte Carlo (MC)-discrete ordinates (SN) coupled method is an efficient approach for shielding calculations of nuclear devices that combine complex geometry with deep penetration. Here the 3D MC-SN coupled method is applied to PWR shielding calculation for the first time. Following the characteristics of the NUREG/CR-6115 PWR model, the thermal shield is specified as the common surface linking the geometrically complex Monte Carlo model and the deep-penetration SN model. A 3D Monte Carlo code is employed to accurately simulate the structure from the core to the thermal shield, and the neutron tracks crossing the thermal shield inner surface are recorded by the MC code. An interface program converts these tracks into the boundary source needed by the 3D SN code, which treats the calculation from the thermal shield to the pressure vessel. The results include the circumferential distributions of fast neutron flux (E > 1.0 MeV) at the pressure vessel inner wall, at T/4 of the vessel thickness, and at the weld locations. The results are compared with the MCNP and DORT solutions of the benchmark report, and satisfactory agreement is obtained, proving the validity of the method and the correctness of the programs.

  7. Benchmarking the financial performance of local councils in Ireland

    Directory of Open Access Journals (Sweden)

    Robbins Geraldine

    2016-05-01

    Full Text Available It was over a quarter of a century ago that information from the financial statements was used to benchmark the efficiency and effectiveness of local government in the US. With the global adoption of New Public Management ideas, benchmarking practice spread to the public sector and has been employed to drive reforms aimed at improving performance and, ultimately, service delivery and local outcomes. The manner in which local authorities in OECD countries compare and benchmark their performance varies widely. The methodology developed in this paper to rate the relative financial performance of Irish city and county councils is adapted from an earlier assessment tool used to measure the financial condition of small cities in the US. Using our financial performance framework and the financial data in the audited annual financial statements of Irish local councils, we calculate composite scores for each of the thirty-four local authorities for the years 2007–13. This paper contributes composite scores that measure the relative financial performance of local councils in Ireland, as well as a full set of yearly results for a seven-year period in which local governments witnessed significant changes in their financial health. The benchmarking exercise is useful in highlighting those councils that, in relative financial performance terms, are the best/worst performers.
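
    As an illustration only of how such composite scores can be assembled (the actual indicators, normalization, and weights used for the Irish councils are not reproduced here), consider:

```python
# Sketch: a composite financial-performance score from normalized indicators.
# Council names, indicator values, and equal weighting are all hypothetical.
from statistics import mean

indicators = {  # e.g. liquidity ratio, collection rate, surplus per capita
    "council_a": [1.20, 0.86, 35.0],
    "council_b": [0.95, 0.91, 12.0],
    "council_c": [1.45, 0.78, 48.0],
}

# Min-max normalize each indicator across councils, then average (equal weights).
cols = list(zip(*indicators.values()))
lo, hi = [min(c) for c in cols], [max(c) for c in cols]
scores = {
    name: mean((v - l) / (h - l) for v, l, h in zip(vals, lo, hi))
    for name, vals in indicators.items()
}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: composite score = {s:.2f}")
```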

  8. [Benchmarking in health care: conclusions and recommendations].

    Science.gov (United States)

    Geraedts, Max; Selbmann, Hans-Konrad

    2011-01-01

    The German Health Ministry funded 10 demonstration projects and accompanying research of benchmarking in health care. The accompanying research work aimed to infer generalisable findings and recommendations. We performed a meta-evaluation of the demonstration projects and analysed national and international approaches to benchmarking in health care. It was found that the typical benchmarking sequence is hardly ever realised. Most projects lack a detailed analysis of structures and processes of the best performers as a starting point for the process of learning from and adopting best practice. To tap the full potential of benchmarking in health care, participation in voluntary benchmarking projects should be promoted that have been demonstrated to follow all the typical steps of a benchmarking process.

  9. Criticality calculations of a generic fuel container for fuel assemblies PWR, by means of the code MCNP; Calculos de criticidad de un contenedor de combustible generico para ensambles combustibles PWR, mediante el codigo MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Vargas E, S.; Esquivel E, J.; Ramirez S, J. R., E-mail: samuel.vargas@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    The purpose of the burnup credit concept is to determine the capability of calculation codes, and of the associated nuclear data, to predict the isotopic composition and the corresponding effective neutron multiplication factor in a generic spent fuel container over relevant storage times. The objective of the present work is to determine this capability of the MCNP calculation code in predicting the effective neutron multiplication factor for an arrangement of PWR-type fuel assemblies inside a generic storage container. The calculations are divided into two parts: first, decay calculations with nuclide concentrations specified by the reference for a pressurized water reactor (PWR) with fuel enriched to 4.5% and a discharge burnup of 50 GWd/MTU; second, criticality calculations with time-dependent isotopic compositions for actinides and important fission products, taking 30 time steps, for two groups of actinides and fission products. (Author)

  10. Benchmarking i eksternt regnskab og revision

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Kiertzner, Lars

    2001-01-01

    ...continuously in a benchmarking process. This chapter broadly examines the extent to which the concept of benchmarking can reasonably be linked to external financial reporting and auditing. Section 7.1 deals with the external annual accounts, while Section 7.2 takes up the field of auditing. The final section of the chapter summarises the considerations on benchmarking in relation to both areas.

  11. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  12. An Effective Approach for Benchmarking Implementation

    OpenAIRE

    B. M. Deros; Tan, J.; M.N.A. Rahman; N. A.Q.M. Daud

    2011-01-01

    Problem statement: The purpose of this study is to present a benchmarking guideline, conceptual framework and computerized mini program to assist companies in achieving better performance in terms of quality, cost, delivery and supply chain, and eventually to increase their competitiveness in the market. The study begins with a literature review on benchmarking definitions, barriers and advantages from implementation, and a study of benchmarking frameworks. Approach: Thirty res...

  13. Applicability of 3D Monte Carlo simulations for local values calculations in a PWR core

    Science.gov (United States)

    Bernard, Franck; Cochet, Bertrand; Jinaphanh, Alexis; Jacquet, Olivier

    2014-06-01

    As technical support of the French Nuclear Safety Authority, IRSN has been developing the MORET Monte Carlo code for many years in the framework of criticality safety assessment and is now working to extend its application to reactor physics. For that purpose, besides the validation for criticality safety (more than 2000 benchmarks from the ICSBEP Handbook have been modeled and analyzed), a complementary validation phase for reactor physics has been started, with benchmarks from the IRPhEP Handbook and others. In particular, to evaluate the applicability of MORET and other Monte Carlo codes to local flux or power density calculations in large power reactors, it was decided to contribute to the "Monte Carlo Performance Benchmark" (hosted by OECD/NEA). The aim of this benchmark is to monitor, over the forthcoming decades, the performance progress of detailed Monte Carlo full-core calculations. More precisely, it measures their advancement towards achieving high statistical accuracy in reasonable computation time for local power at the fuel pellet level. A full PWR reactor core is modeled to compute local power densities for more than 6 million fuel regions. This paper presents results obtained at IRSN for this benchmark with MORET and comparisons with MCNP. The number of fuel elements is so large that source convergence as well as statistical convergence issues could cause large errors in local tallies, especially in peripheral zones. Various sampling and tracking methods have been implemented in MORET, and their operational effects on such a complex case have been studied. Beyond convergence issues, computing local values in so many fuel regions could prohibitively slow down neutron tracking. To avoid this, energy grid unification and tally preparation before tracking have been implemented, tested and proved to be successful. In this particular case, IRSN obtained promising results with MORET compared to MCNP, in terms of local power densities, standard

  14. Benchmarking for controllere: Metoder, teknikker og muligheder

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Sandalgaard, Niels; Dietrichson, Lars

    2008-01-01

    The article brings the concept of benchmarking into focus by presenting and discussing its various facets. Four different uses of benchmarking are described to show the breadth of the concept and the importance of clarifying the purpose of a benchmarking project before starting. The difference between results benchmarking and process benchmarking is treated, after which the use of internal and external benchmarking, respectively, is discussed. Finally, the use of benchmarking in budgeting and budget follow-up is introduced.

  15. The Zoo, Benchmarks & You: How To Reach the Oregon State Benchmarks with Zoo Resources.

    Science.gov (United States)

    2002

    This document aligns Oregon state educational benchmarks and standards with Oregon Zoo resources. Benchmark areas examined include English, mathematics, science, social studies, and career and life roles. Brief descriptions of the programs offered by the zoo are presented. (SOE)

  16. Benchmarking Implementations of Functional Languages with ``Pseudoknot'', a Float-Intensive Benchmark

    NARCIS (Netherlands)

    Hartel, P.H.; Feeley, M.; Alt, M.; Augustsson, L.

    1996-01-01

    Over 25 implementations of different functional languages are benchmarked using the same program, a floating-point intensive application taken from molecular biology. The principal aspects studied are compile time and execution time for the various implementations that were benchmarked. An important

  17. Systematic Benchmarking of Diagnostic Technologies for an Electrical Power System

    Science.gov (United States)

    Kurtoglu, Tolga; Jensen, David; Poll, Scott

    2009-01-01

    Automated health management is a critical functionality for complex aerospace systems. A wide variety of diagnostic algorithms have been developed to address this technical challenge. Unfortunately, the lack of support to perform large-scale V&V (verification and validation) of diagnostic technologies continues to create barriers to effective development and deployment of such algorithms for aerospace vehicles. In this paper, we describe a formal framework developed for benchmarking of diagnostic technologies. The diagnosed system is the Advanced Diagnostics and Prognostics Testbed (ADAPT), a real-world electrical power system (EPS), developed and maintained at the NASA Ames Research Center. The benchmarking approach provides a systematic, empirical basis to the testing of diagnostic software and is used to provide performance assessment for different diagnostic algorithms.

  18. Benchmark of SCALE (SAS2H) isotopic predictions of depletion analyses for San Onofre PWR MOX fuel

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, O.W.

    2000-02-01

    The isotopic composition of mixed-oxide (MOX) fuel, fabricated with both uranium and plutonium, after discharge from reactors is of significant interest to the Fissile Materials Disposition Program. The validation of the SCALE (SAS2H) depletion code for use in the prediction of isotopic compositions of MOX fuel, similar to previous validation studies on uranium-only fueled reactors, has corresponding significance. The EEI-Westinghouse Plutonium Recycle Demonstration Program examined the use of MOX fuel in the San Onofre PWR, Unit 1, during cycles 2 and 3. Isotopic analyses of the MOX spent fuel were conducted on 13 actinides and Nd-148 by either mass or alpha spectrometry. Six fuel pellet samples were taken from four different fuel pins of an irradiated MOX assembly. The measured actinide inventories from those samples have been used to benchmark SAS2H for MOX fuel applications. The average percentage differences between the code results and the measurements were -0.9% for U-235 and 5.2% for Pu-239. The differences for most of the isotopes were significantly larger than in the cases of uranium-only fueled reactors. In general, comparisons of code results with alpha spectrometer data showed extreme differences, although the differences relative to mass spectrometer analyses were not much larger than those for uranium-only fueled reactors. This benchmark study should be useful in estimating uncertainties of inventory, criticality and dose calculations for MOX spent fuel.

  19. Evaluation of the applicability of the Benchmark approach to existing toxicological data. Framework: Chemical compounds in the working place

    OpenAIRE

    Appel MJ; Bouman HGM; Pieters MN; Slob W; Adviescentrum voor chemische arbeidsomstandigheden (ACCA) TNO; CSR

    2001-01-01

    Five chemicals used in the workplace, for which a risk assessment had already been carried out, were selected and the relevant critical studies re-analyzed by the Benchmark approach. The endpoints involved included continuous and ordinal data. Dose-response modeling could be reasonably applied to the dose-response data encountered, and Critical Effect Doses (CEDs) could be derived for almost all of the endpoints considered. The resulting benchmark dose for the study as a whole was close to the NO...

  20. General benchmarks for quantum repeaters

    CERN Document Server

    Pirandola, Stefano

    2015-01-01

    Using a technique based on quantum teleportation, we simplify the most general adaptive protocols for key distribution, entanglement distillation and quantum communication over a wide class of quantum channels in arbitrary dimension. Thanks to this method, we bound the ultimate rates for secret key generation and quantum communication through single-mode Gaussian channels and several discrete-variable channels. In particular, we derive exact formulas for the two-way assisted capacities of the bosonic quantum-limited amplifier and the dephasing channel in arbitrary dimension, as well as the secret key capacity of the qubit erasure channel. Our results establish the limits of quantum communication with arbitrary systems and set the most general and precise benchmarks for testing quantum repeaters in both discrete- and continuous-variable settings.
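
    As one concrete example of the closed forms this abstract refers to, the repeaterless secret-key bound for the pure-loss channel is often quoted in the form below; it is reproduced here from memory and should be checked against the paper itself rather than treated as a verbatim citation.

```latex
% Two-way assisted secret-key capacity of the pure-loss bosonic channel with
% transmissivity \eta (the "PLOB" repeaterless bound associated with this
% line of work; quoted from memory, illustrative only):
K(\eta) = -\log_2\!\left(1 - \eta\right) \quad \text{secret bits per channel use.}
```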

  1. Benchmarking Asteroid-Deflection Experiment

    Science.gov (United States)

    Remington, Tane; Bruck Syal, Megan; Owen, John Michael; Miller, Paul L.

    2016-10-01

    An asteroid impacting Earth could have devastating consequences. In preparation to deflect or disrupt one before it reaches Earth, it is imperative to have modeling capabilities that adequately simulate the deflection actions. Code validation is key to ensuring full confidence in simulation results used in an asteroid-mitigation plan. We are benchmarking well-known impact experiments using Spheral, an adaptive smoothed-particle hydrodynamics code, to validate our modeling of asteroid deflection. We describe our simulation results, compare them with experimental data, and discuss what we have learned from our work. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-695540

  2. Benchmarking ICRF simulations for ITER

    Energy Technology Data Exchange (ETDEWEB)

    R. V. Budny, L. Berry, R. Bilato, P. Bonoli, M. Brambilla, R.J. Dumont, A. Fukuyama, R. Harvey, E.F. Jaeger, E. Lerche, C.K. Phillips, V. Vdovin, J. Wright, and members of the ITPA-IOS

    2010-09-28

    Benchmarking of full-wave solvers for ICRF simulations is performed using plasma profiles and equilibria obtained from integrated self-consistent modeling predictions of four ITER plasmas. One is for a high-performance baseline (5.3 T, 15 MA) DT H-mode plasma. The others are for half-field, half-current plasmas of interest for the pre-activation phase, with the bulk plasma ion species being either hydrogen or He4. The predicted profiles are used by seven groups to predict the ICRF electromagnetic fields and heating profiles. Approximate agreement is achieved for the predicted heating power partitions for the DT and He4 cases. Profiles of the heating powers and electromagnetic fields are compared.

  3. COG validation: SINBAD Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Lent, E M; Sale, K E; Buck, R M; Descalle, M

    2004-02-23

    We validated COG, a 3D Monte Carlo radiation transport code, against experimental data and MCNP4C simulations from the Shielding Integral Benchmark Archive Database (SINBAD) compiled by RSICC. We modeled three experiments: the Osaka nickel and aluminum sphere experiments conducted at the OKTAVIAN facility, and the liquid oxygen experiment conducted at the FNS facility. COG results are in good agreement with experimental data and generally within a few percent of MCNP results. There are several possible sources of discrepancy between MCNP and COG results: (1) the cross-section database versions are different, MCNP uses ENDF/B-VI 1.1 while COG uses ENDF/B-VI R7; (2) the code implementations are different; and (3) the models may differ slightly. We also limited the use of variance reduction methods when running the COG version of the problems.

  4. Comparisons of Neutron Cross Sections and Isotopic Composition Calculations for Fission-Product Evaluations

    Science.gov (United States)

    Kim, Do Heon; Gil, Choong-Sup; Chang, Jonghwa; Lee, Yong-Deok

    2005-05-01

    The neutron absorption cross sections for 18 fission products evaluated within the framework of the KAERI (Korea Atomic Energy Research Institute)-BNL (Brookhaven National Laboratory) international collaboration have been compared with ENDF/B-VI.7. Also, the influence of the new evaluations on the isotopic composition calculations of the fission products has been estimated through the OECD/NEA burnup credit criticality benchmarks (Phase 1B) and the LWR/Pu recycling benchmarks. These calculations were performed by WIMSD-5B with the 69-group libraries prepared from three evaluated nuclear data libraries: ENDF/B-VI.7, ENDF/B-VI.8 including the new evaluations in the resonance region covering the thermal region, and the expected ENDF/B-VII including those in the upper resonance region up to 20 MeV. For Xe-131, the composition calculated with ENDF/B-VI.8 shows a maximum difference of 5.02% compared to ENDF/B-VI.7. However, the isotopic compositions of all the fission products calculated with the expected ENDF/B-VII show no differences when compared to ENDF/B-VI.7 for the thermal reactor benchmark cases.

  5. Verification of the code DYN3D/R with the help of international benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Grundmann, U.; Rohde, U.

    1997-10-01

    Different benchmarks for reactors with square fuel assemblies were calculated with the code DYN3D/R. In this report, comparisons with the results of the reference solutions are carried out. The results of DYN3D/R and the reference calculation for the eigenvalue k_eff and the power distribution are shown for the steady-state 3-dimensional IAEA benchmark. The results of the NEACRP benchmarks on control rod ejections in a standard PWR were compared with the reference solutions published by the NEA Data Bank. For assessing the accuracy of DYN3D/R results in comparison to other codes, the deviations from the reference solutions are considered. Detailed comparisons with the published reference solutions of the NEA-NSC benchmarks on uncontrolled withdrawal of control rods are made, and the influence of the axial nodalization is also investigated. All in all, good agreement of the DYN3D/R results with the reference solutions can be seen for the considered benchmark problems. (orig.)

  6. 42 CFR 440.330 - Benchmark health benefits coverage.

    Science.gov (United States)

    2010-10-01

    Title 42, Public Health (2010-10-01 edition): Medical Assistance Programs, Services: General Provisions, Benchmark Benefit and Benchmark-Equivalent Coverage, § 440.330 Benchmark health benefits coverage. Benchmark coverage is...

  7. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report outlines techniques for extending benchmark generation products so that they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  8. Benchmark Assessment for Improved Learning. AACC Report

    Science.gov (United States)

    Herman, Joan L.; Osmundson, Ellen; Dietel, Ronald

    2010-01-01

    This report describes the purposes of benchmark assessments and provides recommendations for selecting and using benchmark assessments--addressing validity, alignment, reliability, fairness and bias and accessibility, instructional sensitivity, utility, and reporting issues. We also present recommendations on building capacity to support schools'…

  9. Benchmark analysis of railway networks and undertakings

    NARCIS (Netherlands)

    Hansen, I.A.; Wiggenraad, P.B.L.; Wolff, J.W.

    2013-01-01

    Benchmark analysis of railway networks and companies has been stimulated by the European policy of deregulation of transport markets, the opening of national railway networks and markets to new entrants and separation of infrastructure and train operation. Recent international railway benchmarking s

  10. Machines are benchmarked by code, not algorithms

    NARCIS (Netherlands)

    Poss, R.

    2013-01-01

    This article highlights how small modifications to either the source code of a benchmark program or the compilation options may impact its behavior on a specific machine. It argues that for evaluating machines, benchmark providers and users be careful to ensure reproducibility of results based on th

  11. An Effective Approach for Benchmarking Implementation

    Directory of Open Access Journals (Sweden)

    B. M. Deros

    2011-01-01

    Full Text Available Problem statement: The purpose of this study is to present a benchmarking guideline, conceptual framework and computerized mini program to assist companies in achieving better performance in terms of quality, cost, delivery and supply chain, and eventually to increase their competitiveness in the market. The study begins with a literature review on benchmarking definitions, barriers and advantages from implementation, and a study of benchmarking frameworks. Approach: Thirty respondents were involved in the case study. They comprise industrial practitioners who assessed the usability and practicability of the guideline, conceptual framework and computerized mini program. Results: A guideline and template were proposed to simplify the adoption of benchmarking techniques. A conceptual framework was proposed by integrating the Deming PDCA and Six Sigma DMAIC theories; it provides a step-by-step method to simplify implementation and to optimize benchmarking results. A computerized mini program was suggested to assist users in adopting the technique as part of an improvement project. In the assessment test, the respondents found that the implementation method gives a company an idea of how to initiate benchmarking and guides it towards the desired goal as set in a benchmarking project. Conclusion: The results obtained and discussed in this study can be applied to implementing benchmarking in a more systematic way and ensuring its success.

  12. Synergetic effect of benchmarking competitive advantages

    Directory of Open Access Journals (Sweden)

    N.P. Tkachova

    2011-12-01

    Full Text Available The essence of synergetic competitive benchmarking is analyzed. A classification of the types of synergy is developed. The sources of synergy in benchmarking of competitive advantages are determined. A methodological framework for defining synergy in the formation of competitive advantage is proposed.

  13. Synergetic effect of benchmarking competitive advantages

    OpenAIRE

    N.P. Tkachova; P.G. Pererva

    2011-01-01

    The essence of synergetic competitive benchmarking is analyzed. A classification of the types of synergy is developed. The sources of synergy in benchmarking of competitive advantages are determined. A methodological framework for defining synergy in the formation of competitive advantage is proposed.

  14. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    2007-01-01

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price elasticit

  15. Benchmarking Learning and Teaching: Developing a Method

    Science.gov (United States)

    Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah

    2006-01-01

    Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…

  16. RESRAD benchmarking against six radiation exposure pathway models

    Energy Technology Data Exchange (ETDEWEB)

    Faillace, E.R.; Cheng, J.J.; Yu, C.

    1994-10-01

    A series of benchmarking runs were conducted so that results obtained with the RESRAD code could be compared against those obtained with six pathway analysis models used to determine the radiation dose to an individual living on a radiologically contaminated site. The RESRAD computer code was benchmarked against five other computer codes - GENII-S, GENII, DECOM, PRESTO-EPA-CPG, and PATHRAE-EPA - and the uncodified methodology presented in the NUREG/CR-5512 report. Estimated doses for the external gamma pathway; the dust inhalation pathway; and the soil, food, and water ingestion pathways were calculated for each methodology by matching, to the extent possible, input parameters such as occupancy, shielding, and consumption factors.

  17. Benchmarking of neutron production of heavy-ion transport codes

    Energy Technology Data Exchange (ETDEWEB)

    Remec, I. [Oak Ridge National Laboratory, Oak Ridge, TN 37831-6172 (United States); Ronningen, R. M. [Michigan State Univ., National Superconductiong Cyclotron Laboratory, East Lansing, MI 48824-1321 (United States); Heilbronn, L. [Univ. of Tennessee, 1004 Estabrook Rd., Knoxville, TN 37996-2300 (United States)

    2011-07-01

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required. (authors)

  18. RZBENCH: Performance evaluation of current HPC architectures using low-level and application benchmarks

    CERN Document Server

    Hager, Georg; Zeiser, Thomas; Wellein, Gerhard

    2007-01-01

    RZBENCH is a benchmark suite that was specifically developed to reflect the requirements of scientific supercomputer users at the University of Erlangen-Nuremberg (FAU). It comprises a number of application and low-level codes under a common build infrastructure that fosters maintainability and expandability. This paper reviews the structure of the suite and briefly introduces the most relevant benchmarks. In addition, some widely known standard benchmark codes are reviewed in order to emphasize the need for a critical review of often-cited performance results. Benchmark data is presented for the HLRB-II at LRZ Munich and a local InfiniBand Woodcrest cluster as well as two uncommon system architectures: A bandwidth-optimized InfiniBand cluster based on single socket nodes ("Port Townsend") and an early version of Sun's highly threaded T2 architecture ("Niagara 2").

  19. Validation of MCNP6.1 for Criticality Safety of Pu-Metal, -Solution, and -Oxide Systems

    Energy Technology Data Exchange (ETDEWEB)

    Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kahler, III, Albert C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kersting, Alyssa R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parsons, Donald K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Walker, Jessie L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-13

    Guidance is offered to the Los Alamos National Laboratory Nuclear Criticality Safety division towards developing an Upper Subcritical Limit (USL) for MCNP6.1 calculations with ENDF/B-VII.1 nuclear data for three classes of problems: Pu-metal, -solution, and -oxide systems. A benchmark suite containing 1,086 benchmarks is prepared, and a sensitivity/uncertainty (S/U) method with a generalized linear least squares (GLLS) data adjustment is used to reject outliers, bringing the total to 959 usable benchmarks. For each class of problem, S/U methods are used to select relevant experimental benchmarks, and the calculational margin is computed using extreme value theory. A portion of the margin of subcriticality is defined considering both a detection limit for errors in codes and data and uncertainty/variability in the nuclear data library. The latter employs S/U methods with a GLLS data adjustment to find representative nuclear data covariances constrained by integral experiments, which are then used to compute uncertainties in keff from nuclear data. The USLs for the classes of problems are as follows: Pu metal, 0.980; Pu solutions, 0.973; dry Pu oxides, 0.978; dilute Pu oxide-water mixes, 0.970; and intermediate-spectrum Pu oxide-water mixes, 0.953.

  20. Benchmarking of small-signal dynamics of single-phase PLLs

    DEFF Research Database (Denmark)

    Zhang, Chong; Wang, Xiongfei; Blaabjerg, Frede;

    2015-01-01

    Phase-locked loop (PLL) is a critical component for the control and grid synchronization of grid-connected power converters. This paper presents a benchmarking study on the small-signal dynamics of three commonly used PLLs for single-phase converters, including the enhanced PLL and the second-order generali...

  1. Effect of noise correlations on randomized benchmarking

    Science.gov (United States)

    Ball, Harrison; Stace, Thomas M.; Flammia, Steven T.; Biercuk, Michael J.

    2016-02-01

    Among the most popular and well-studied quantum characterization, verification, and validation techniques is randomized benchmarking (RB), an important statistical tool used to characterize the performance of physical logic operations useful in quantum information processing. In this work we provide a detailed mathematical treatment of the effect of temporal noise correlations on the outcomes of RB protocols. We provide a fully analytic framework capturing the accumulation of error in RB expressed in terms of a three-dimensional random walk in "Pauli space." Using this framework we derive the probability density function describing RB outcomes (averaged over noise) for both Markovian and correlated errors, which we show is generally described by a Γ distribution with shape and scale parameters depending on the correlation structure. Long temporal correlations impart large nonvanishing variance and skew in the distribution towards high-fidelity outcomes—consistent with existing experimental data—highlighting potential finite-sampling pitfalls and the divergence of the mean RB outcome from worst-case errors in the presence of noise correlations. We use the filter-transfer function formalism to reveal the underlying reason for these differences in terms of effective coherent averaging of correlated errors in certain random sequences. We conclude by commenting on the impact of these calculations on the utility of single-metric approaches to quantum characterization, verification, and validation.
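
    The standard zeroth-order RB analysis that such outcome distributions feed into fits an exponential decay to sequence-averaged survival probabilities. The sketch below performs that fit on synthetic single-qubit data; the decay model and the error-rate conversion are the textbook forms, not anything specific to this paper.

```python
# Sketch: extract an average error rate from randomized benchmarking data
# by fitting the standard decay model F(m) = A p^m + B. Data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def rb_decay(m, A, p, B):
    return A * p**m + B

lengths = np.array([1, 4, 16, 64, 256])                     # sequence lengths
survival = np.array([0.998, 0.990, 0.962, 0.863, 0.639])    # synthetic averages

(A, p, B), _ = curve_fit(rb_decay, lengths, survival, p0=(0.5, 0.99, 0.5))
d = 2                                    # single-qubit Hilbert space dimension
r = (1 - p) * (d - 1) / d                # average error rate per gate
print(f"p = {p:.5f}, r = {r:.2e}")
```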

  2. Gaia FGK Benchmark Stars: New Candidates At Low-Metallicities

    CERN Document Server

    Hawkins, Keith; Heiter, Ulrike; Soubiran, Caroline; Blanco-Cuaresma, Sergi; Casagrande, Luca; Gilmore, Gerry; Lind, Karin; Magrini, Laura; Masseron, Thomas; Pancino, Elena; Randich, Sofia; Worley, Clare C

    2016-01-01

    We have entered an era of large spectroscopic surveys in which we can measure, through automated pipelines, the atmospheric parameters and chemical abundances for large numbers of stars. Calibrating these survey pipelines using a set of "benchmark stars", in order to evaluate the accuracy and precision of the provided parameters and abundances, is of utmost importance. The recently proposed set of Gaia FGK benchmark stars of Heiter et al. (2015) has no recommended stars within the critical metallicity range of $-2.0 <$ [Fe/H] $< -1.0$ dex. In this paper, we aim to add candidate Gaia benchmark stars inside this metal-poor gap. We began with a sample of 21 metal-poor stars, which was reduced to 10 stars by requiring accurate photometry and parallaxes, and high-resolution archival spectra. The procedure used to determine the stellar parameters was similar to Heiter et al. (2015) and Jofre et al. (2014) for consistency. The effective temperature (T$_{\mathrm{eff}}$) of all candidate stars was determined using...

  3. The results of the pantograph-catenary interaction benchmark

    Science.gov (United States)

    Bruni, Stefano; Ambrosio, Jorge; Carnicero, Alberto; Cho, Yong Hyeon; Finner, Lars; Ikeda, Mitsuru; Kwon, Sam Young; Massat, Jean-Pierre; Stichel, Sebastian; Tur, Manuel; Zhang, Weihua

    2015-03-01

    This paper describes the results of a voluntary benchmark initiative concerning the simulation of pantograph-catenary interaction, which was proposed and coordinated by Politecnico di Milano with the participation of 10 research institutions established in 9 different countries across Europe and Asia. The aims of the benchmark are to assess the dispersion of results on the same simulation study cases, to demonstrate the accuracy of numerical methodologies and simulation models, and to identify the modelling approaches best suited to studying pantograph-catenary interaction. One static and three dynamic simulation cases were defined for a non-existing but realistic high-speed pantograph-catenary couple. These cases were run using 10 of the major simulation codes presently in use for the study of pantograph-catenary interaction, and the results are presented and critically discussed here. All input data required to run the study cases are also provided, allowing the use of this benchmark as a term of comparison for other simulation codes.

  4. Lecture Notes on Criticality Safety Validation Using MCNP & Whisper

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-03-11

    Training classes for nuclear criticality safety, MCNP documentation. The need for, and problems surrounding, validation of computer codes and data are considered first. Then some background for MCNP & Whisper is given: best practices for Monte Carlo criticality calculations, neutron spectra, S(α,β) thermal neutron scattering data, nuclear data sensitivities, covariance data, and correlation coefficients. Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the Monte Carlo radiation transport package MCNP. Whisper's methodology (benchmark selection – Ck's, weights; extreme value theory – bias, bias uncertainty; MOS for nuclear data uncertainty – GLLS) and usage are discussed.
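
    A small sketch of the correlation-coefficient ingredient of such a methodology is given below; the sensitivity vectors and covariance matrix are toy numbers, not real library data, and this is not Whisper's implementation.

```python
# Sketch: the nuclear-data correlation coefficient c_k between an application
# and a benchmark, from sensitivity vectors and a covariance matrix.
import numpy as np

cov = np.array([[2.5e-5, 0.5e-5, 0.0],
                [0.5e-5, 1.6e-5, 0.0],
                [0.0,    0.0,    0.9e-5]])   # toy nuclear data covariance

s_app   = np.array([0.25, -0.05, 0.10])      # application keff sensitivities
s_bench = np.array([0.22, -0.04, 0.02])      # benchmark keff sensitivities

def ck(sa, sb, cov):
    """Shared-uncertainty correlation: sa C sb / sqrt((sa C sa)(sb C sb))."""
    return (sa @ cov @ sb) / np.sqrt((sa @ cov @ sa) * (sb @ cov @ sb))

print(f"c_k = {ck(s_app, s_bench, cov):.3f}")  # near 1.0 => representative benchmark
```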

  5. Benchmarks and statistics of entanglement dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Tiersch, Markus

    2009-09-04

    In the present thesis we investigate how the quantum entanglement of multicomponent systems evolves under realistic conditions. More specifically, we focus on open quantum systems coupled to the (uncontrolled) degrees of freedom of an environment. We identify key quantities that describe the entanglement dynamics, and provide efficient tools for its calculation. For quantum systems of high dimension, entanglement dynamics can be characterized with high precision. In the first part of this work, we derive evolution equations for entanglement. These formulas determine the entanglement after a given time in terms of a product of two distinct quantities: the initial amount of entanglement and a factor that merely contains the parameters that characterize the dynamics. The latter is given by the entanglement evolution of an initially maximally entangled state. A maximally entangled state thus benchmarks the dynamics, and hence allows for the immediate calculation or, under more general conditions, estimation of the change in entanglement. Thereafter, a statistical analysis supports that the derived (in-)equalities describe the entanglement dynamics of the majority of weakly mixed, and thus experimentally highly relevant, states with high precision. The second part of this work approaches entanglement dynamics from a topological perspective. This allows for a quantitative description with a minimum amount of assumptions about Hilbert space (sub-)structure and environment coupling. In particular, we investigate the limit of increasing system size and density of states, i.e. the macroscopic limit. In this limit, a universal behaviour of entanglement emerges, following a "reference trajectory", similar to the central role of the entanglement dynamics of a maximally entangled state found in the first part of the present work. (orig.)
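
    The evolution equation alluded to here is, in its best-known special case, a factorization law for the concurrence of a two-qubit state under a one-sided channel; the form below is quoted from memory and should be treated as illustrative rather than as the thesis's exact statement.

```latex
% Concurrence factorization for a one-sided channel \Lambda acting on one
% qubit of a two-qubit pure state |\chi\rangle; for mixed initial states the
% right-hand side becomes an upper bound.
C\bigl[(\mathbb{1}\otimes\Lambda)\,|\chi\rangle\langle\chi|\bigr]
  = C\bigl(|\chi\rangle\langle\chi|\bigr)\,
    C\bigl[(\mathbb{1}\otimes\Lambda)\,|\phi^{+}\rangle\langle\phi^{+}|\bigr]
```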

  6. Benchmarking Measures of Network Influence

    Science.gov (United States)

    Bramson, Aaron; Vandermarliere, Benjamin

    2016-01-01

    Identifying key agents for the transmission of diseases (ideas, technology, etc.) across social networks has predominantly relied on measures of centrality on a static base network or a temporally flattened graph of agent interactions. Various measures have been proposed as the best trackers of influence, such as degree centrality, betweenness, and k-shell, depending on the structure of the connectivity. We consider SIR and SIS propagation dynamics on a temporally-extruded network of observed interactions and measure the conditional marginal spread as the change in the magnitude of the infection given the removal of each agent at each time: its temporal knockout (TKO) score. We argue that this TKO score is an effective benchmark measure for evaluating the accuracy of other, often more practical, measures of influence. We find that none of the network measures applied to the induced flat graphs are accurate predictors of network propagation influence on the systems studied; however, temporal networks and the TKO measure provide the requisite targets for the search for effective predictive measures. PMID:27670635
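
    A minimal sketch of the temporal knockout idea follows. It treats the dynamics as SI (recovery omitted) on a time-ordered contact list, which is a simplification of the SIR/SIS settings studied in the paper; all names and parameters are illustrative.

      import random

      def mean_spread(contacts, seed, beta=0.3, removed=None, trials=200):
          """Mean outbreak size over stochastic runs on a temporal contact
          list [(t, i, j), ...] sorted by time; `removed` is knocked out
          of every interaction."""
          rng = random.Random(42)
          total = 0
          for _ in range(trials):
              infected = {seed} - {removed}
              for _t, i, j in contacts:
                  if removed in (i, j):
                      continue
                  if (i in infected) != (j in infected) and rng.random() < beta:
                      infected |= {i, j}
              total += len(infected)
          return total / trials

      def tko_score(contacts, seed, agent):
          """Temporal knockout: loss of expected spread when `agent` is removed."""
          return mean_spread(contacts, seed) - mean_spread(contacts, seed, removed=agent)

      contacts = [(0, "a", "b"), (1, "b", "c"), (2, "c", "d")]
      print(tko_score(contacts, seed="a", agent="b"))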

  7. Developing integrated benchmarks for DOE performance measurement

    Energy Technology Data Exchange (ETDEWEB)

    Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.

    1992-09-30

    The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome data in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which could then become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazard and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.

  8. Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method

    CERN Document Server

    2002-01-01

    This report condenses basic theories and advanced applications of neutron/gamma ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and cross section libraries used in continuous energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, design of ITER, experiment analyses of the fast critical assembly, core analyses of JMTR, simulation of a pulsed neutron experiment, core analyses of HTTR, duct streaming calculations, bulk shielding calculations, and neutron/gamma ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes, and parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.

  9. Solution of the neutronics code dynamic benchmark by finite element method

    Science.gov (United States)

    Avvakumov, A. V.; Vabishchevich, P. N.; Vasilev, A. O.; Strizhov, V. F.

    2016-10-01

    The objective is to analyze the dynamic benchmark developed by Atomic Energy Research for the verification of best-estimate neutronics codes. The benchmark scenario includes asymmetrical ejection of a control rod in a water-type hexagonal reactor at hot zero power. A simple Doppler feedback mechanism assuming adiabatic fuel temperature heating is proposed. The finite element method on triangular calculation grids is used to solve the three-dimensional neutron kinetics problem. The software has been developed using the engineering and scientific calculation library FEniCS. The matrix spectral problem is solved using the scalable and flexible toolkit SLEPc. The solution accuracy of the dynamic benchmark is analyzed by refining the calculation grid and varying the degree of the finite elements.
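
    The spectral problem mentioned above is, at its core, a generalized eigenvalue problem for the multiplication factor. The toy sketch below solves the one-group, one-dimensional analogue with finite differences and dense linear algebra instead of FEniCS/SLEPc, purely to show the structure; all constants are invented.

      import numpy as np

      # One-group 1-D slab: -D phi'' + Sigma_a phi = (1/k) nu*Sigma_f phi,
      # zero-flux boundaries. The benchmark itself is 3-D, hexagonal, multigroup.
      D, sig_a, nu_sig_f = 1.0, 0.07, 0.079     # cm, 1/cm, 1/cm
      width, n = 100.0, 200                     # slab width (cm), interior nodes
      h = width / (n + 1)

      A = (np.diag(np.full(n, 2 * D / h**2 + sig_a))
           + np.diag(np.full(n - 1, -D / h**2), 1)
           + np.diag(np.full(n - 1, -D / h**2), -1))
      F = nu_sig_f * np.eye(n)

      # k_eff is the dominant eigenvalue of A^-1 F.
      k_eff = np.linalg.eigvals(np.linalg.solve(A, F)).real.max()
      print(f"k_eff ~ {k_eff:.4f}")             # ~1.11 for these numbers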

  10. Standardized benchmarking in the quest for orthologs

    DEFF Research Database (Denmark)

    Altenhoff, Adrian M; Boeckmann, Brigitte; Capella-Gutierrez, Salvador;

    2016-01-01

    -recall trade-offs. As a result, it is difficult to assess the performance of orthology inference methods. Here, we present a community effort to establish standards and an automated web-based service to facilitate orthology benchmarking. Using this service, we characterize 15 well-established inference methods...... and resources on a battery of 20 different benchmarks. Standardized benchmarking provides a way for users to identify the most effective methods for the problem at hand, sets a minimum requirement for new tools and resources, and guides the development of more accurate orthology inference methods....

  11. Benchmarking Attosecond Physics with Atomic Hydrogen

    Science.gov (United States)

    2015-05-25

    Final report for AOARD Grant FA2386-12-1-4025, "Benchmarking attosecond physics with atomic hydrogen"; period covered: 12 Mar 2012 to 11 Mar 2015.

  12. On the nature of lithium biphenyl in ethereal solvents. A critical analysis unifying DFT calculations, physicochemical data in solution, and a X-ray structure.

    Science.gov (United States)

    de la Viuda, Mónica; Yus, Miguel; Guijarro, Albert

    2011-12-15

    The lithium ion is an important type of electrolyte that has technological applications in the manufacture of lithium ion cells; therefore, a better understanding of the nature of its solutions is desirable. When associated with the radical anion of biphenyl in an organic solvent, it forms conducting solutions comparable to strong electrolytes such as lithium perchlorate. We have studied the lithium biphenyl solution in dimethoxyethane using DFT calculations. The nature of these ionic solutions is described in terms of a dynamic equilibrium between different types of ionic associations, the composition of which depends on the solvent and the temperature. The X-ray structure of [Li(+)·4C(5)H(10)O][C(12)H(10)(•-)], a solvent-separated ion pair of lithium biphenyl complexed with tetrahydropyran, is reported. Its main structural characteristics coincide with those of the calculated structure, which we think is the dominant species at room temperature, in agreement with the available physicochemical data.

  13. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 4-Three Mile Island Unit 1 Cycle 5

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.

    1995-01-01

    The requirements of ANSI/ANS-8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original "fresh" composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using relevant and well-documented critical configurations from commercial pressurized water reactors. The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Isotopic densities for spent fuel assemblies in the core were calculated using the SCALE-4 SAS2H analytical sequence. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code family was used to extract the necessary isotopic densities from SAS2H results and to provide the data in the format required for SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k_eff) for the critical configuration. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all calculations. This volume of the report documents a reactor critical calculation for GPU Nuclear Corporation's Three Mile Island Unit 1 (TMI-1) during hot, zero-power startup testing for the beginning of cycle 5. This unit and cycle were selected because of their relevance in spent fuel benchmark applications: (1) cycle 5 startup occurred after an especially long downtime of 6.6 years; and (2) the core consisted primarily

  14. Benchmark ultra-cool dwarfs in widely separated binary systems

    Directory of Open Access Journals (Sweden)

    Jones H.R.A.

    2011-07-01

    Full Text Available Ultra-cool dwarfs as wide companions to subgiants, giants, white dwarfs and main sequence stars can be very good benchmark objects, for which we can infer physical properties with minimal reference to theoretical models, through association with the primary stars. We have searched for benchmark ultra-cool dwarfs in widely separated binary systems using SDSS, UKIDSS, and 2MASS. We then estimate spectral types using SDSS spectroscopy and multi-band colors, place constraints on distance, and perform proper motion calculations for all candidates which have sufficient epoch baseline coverage. Analysis of the proper motion and distance constraints shows that eight of our ultra-cool dwarfs are members of widely separated binary systems. Another L3.5 dwarf, SDSS 0832, is shown to be a companion to the bright K3 giant η Cancri. Such primaries can provide age and metallicity constraints for any companion objects, yielding excellent benchmark objects. This is the first wide ultra-cool dwarf + giant binary system identified.

  15. A critical study of different Monte Carlo scoring methods of dose average linear-energy-transfer maps calculated in voxelized geometries irradiated with clinical proton beams.

    Science.gov (United States)

    Cortés-Giraldo, M A; Carabe, A

    2015-04-07

    We compare unrestricted dose average linear energy transfer (LET) maps calculated with three different Monte Carlo scoring methods in voxelized geometries irradiated with proton therapy beams. Simulations were done with the Geant4 (Geometry ANd Tracking) toolkit. The first method corresponds to a step-by-step computation of LET which has been reported previously in the literature. We found that this scoring strategy is influenced by spurious high-LET components, whose relative contribution to the dose average LET calculations increases significantly as the voxel size becomes smaller. Dose average LET values calculated for primary protons in water with a voxel size of 0.2 mm were a factor ~1.8 higher than those obtained with a size of 2.0 mm at the plateau region for a 160 MeV beam. Such high-LET components are a consequence of proton steps in which the condensed-history algorithm determines an energy transfer to an electron of the material close to the maximum value, while the step length remains limited due to voxel boundary crossing. Two alternative methods were derived to overcome this problem. The second scores LET along the entire path described by each proton within the voxel. The third followed the same approach as the first method, but the LET was evaluated at each step from stopping power tables according to the proton kinetic energy value. We carried out microdosimetry calculations with the aim of deriving reference dose average LET values from microdosimetric quantities. Significant differences between the methods were reported either with pristine or spread-out Bragg peaks (SOBPs). The first method reported values systematically higher than the other two at depths proximal to the SOBP, by about 15% for a 5.9 cm wide SOBP and about 30% for a 11.0 cm one. At the distal SOBP, the second method gave values about 15% lower than the others. Overall, we found that the third method gave the most consistent
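
    For reference, the quantity all three methods estimate is the dose-weighted mean of the stopping power. One common defining expression over simulation steps i, with ε_i the energy deposited in step i and L_i the corresponding electronic stopping power, is

      \mathrm{LET}_d \;=\; \frac{\sum_i \varepsilon_i\, L_i}{\sum_i \varepsilon_i}\,,

    and the scoring methods above differ precisely in how L_i and the step weighting are evaluated.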

  16. Plutonium Critical Mass Curve Comparison to Mass at Upper Subcritical Limit (USL) Using Whisper

    Energy Technology Data Exchange (ETDEWEB)

    Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Codes; Zhang, Ning [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Nuclear Criticality Safety Division

    2016-09-27

    Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the MCNP® Monte Carlo radiation transport package. Standard approaches to validation rely on the selection of benchmarks based upon expert judgment. Whisper uses sensitivity/uncertainty (S/U) methods to select benchmarks relevant to a particular application or set of applications being analyzed. Using these benchmarks, Whisper computes a calculational margin. Whisper attempts to quantify the margin of subcriticality (MOS) from errors in software and uncertainties in nuclear data. The combination of the Whisper-derived calculational margin and the MOS comprises the baseline upper subcritical limit (USL), to which an additional margin may be applied by the nuclear criticality safety analyst as appropriate to ensure subcriticality. A series of critical mass curves for plutonium, similar to those found in Figure 31 of LA-10860-MS, have been generated using MCNP6.1.1 and the iterative parameter study software WORM_Solver. The baseline USL for each of the data points of the curves was then computed using Whisper 1.1. The USL was then used to determine the equivalent mass for the plutonium metal-water system. ANSI/ANS-8.1 states that it is acceptable to use handbook data, such as the data directly from LA-10860-MS, as it is already considered validated (Section 4.3.4: “Use of subcritical limit data provided in ANSI/ANS standards or accepted reference publications does not require further validation.”). This paper takes a novel approach to visualizing traditional critical mass curves, allowing comparison with the mass for which keff equals the USL (calculational margin + margin of subcriticality). However, the intent is to plot the critical mass data along with the USL, not to suggest that already accepted handbook data should have new and more rigorous requirements for validation.
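
    A sketch of the comparison step follows: given a parameter-study curve of keff versus mass and a baseline USL, the mass at which keff reaches the USL can be read off by interpolation. All numbers are invented for illustration.

      import numpy as np

      # Invented parameter-study output: keff of a Pu system vs. mass
      mass = np.array([4.0, 5.0, 6.0, 7.0, 8.0])      # kg
      keff = np.array([0.91, 0.95, 0.98, 1.00, 1.02])
      usl = 0.97                                      # baseline USL (illustrative)

      # Mass at which keff equals the USL (keff is monotone here).
      m_usl = np.interp(usl, keff, mass)
      print(f"mass at USL={usl}: {m_usl:.2f} kg")     # between 5 and 6 kg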

  17. Reevaluation of JACS code system benchmark analyses of the heterogeneous system. Fuel rods in U+Pu nitric acid solution system

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Tomoyuki; Miyoshi, Yoshinori; Katakura, Jun-ichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    In order to evaluate the accuracy of criticality calculations performed with the combination of the multigroup constant library MGCL and the 3-dimensional Monte Carlo code KENO-IV within the criticality safety evaluation code system JACS, benchmark calculations were carried out from 1980 to 1982. Some heterogeneous-system cases were seen in which the calculated neutron multiplication factor was less than 0.95. In this report, the heterogeneous U+Pu nitric acid solution systems containing neutron poison shown in JAERI-M 9859 are recalculated in order to identify the cause. The present study has shown that the k_eff value less than 0.95 given in JAERI-M 9859 is caused by the fact that the water reflector below a cylindrical container was not taken into consideration in the KENO-IV calculation model. By taking the water reflector into account, the KENO-IV calculation gives a k_eff value greater than 0.95 and good agreement with the experiment. (author)

  18. Benchmarking – A tool for judgment or improvement?

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2010-01-01

    these issues, and describes how effects are closely connected to the perception of benchmarking, the intended users of the system and the application of the benchmarking results. The fundamental basis of this paper is taken from the development of benchmarking in the Danish construction sector. Two distinct...... perceptions of benchmarking will be presented; public benchmarking and best practice benchmarking. These two types of benchmarking are used to characterize and discuss the Danish benchmarking system and to enhance which effects, possibilities and challenges that follow in the wake of using this kind...... of benchmarking. In conclusion it is argued that clients and the Danish government are the intended users of the benchmarking system. The benchmarking results are primarily used by the government for monitoring and regulation of the construction sector and by clients for contractor selection. The dominating use...

  19. Medicare Contracting - Redacted Benchmark Metric Reports

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services has compiled aggregate national benchmark cost and workload metrics using data submitted to CMS by the AB MACs and the...

  20. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    provision to the chief physician of the respective department. Professional performance is publicly disclosed due to regulatory requirements. At the same time, chief physicians typically receive bureaucratic benchmarking information from the administration. We find that more frequent bureaucratic...

  1. XWeB: The XML Warehouse Benchmark

    Science.gov (United States)

    Mahboubi, Hadj; Darmont, Jérôme

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.

  2. XWeB: the XML Warehouse Benchmark

    CERN Document Server

    Mahboubi, Hadj

    2011-01-01

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.

  3. In Silico Calculation of Infinite Dilution Activity Coefficients of Molecular Solutes in Ionic Liquids: Critical Review of Current Methods and New Models Based on Three Machine Learning Algorithms.

    Science.gov (United States)

    Paduszyński, Kamil

    2016-08-22

    The aim of the paper is to address the disadvantages of currently available models for calculating infinite dilution activity coefficients (γ(∞)) of molecular solutes in ionic liquids (ILs), a relevant property from the point of view of many applications of ILs, particularly in separations. Three new models are proposed, each of them based on a distinct machine learning algorithm: stepwise multiple linear regression (SWMLR), feed-forward artificial neural network (FFANN), and least-squares support vector machine (LSSVM). The models were established based on the most comprehensive γ(∞) data bank reported so far (>34 000 data points for 188 ILs and 128 solutes). Following the paper published previously [J. Chem. Inf. Model 2014, 54, 1311-1324], the ILs were treated in terms of group contributions, whereas the Abraham solvation parameters were used to quantify the impact of solute structure. Temperature is also included in the input data of the models so that they can be used to obtain temperature-dependent data and thus related thermodynamic functions. Both internal and external validation techniques were applied to assess the statistical significance and explanatory power of the final correlations. A comparative study of the overall performance of the investigated SWMLR/FFANN/LSSVM approaches is presented in terms of root-mean-square error and average absolute relative deviation between calculated and experimental γ(∞), evaluated for different families of ILs and solutes, as well as between calculated and experimental infinite dilution selectivity for the separation problems of benzene from n-hexane and thiophene from n-heptane. LSSVM is shown to be the method with the lowest values of both training and generalization errors. It is finally demonstrated that the established models exhibit an improved accuracy compared to the state-of-the-art model, namely, the temperature-dependent group contribution linear solvation energy relationship, published in 2011 [J. Chem
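
    The sketch below shows the general shape of such a descriptor-based regression. Kernel ridge regression with an RBF kernel is used as a stand-in for LSSVM (the two solve closely related least-squares problems), and the descriptor matrix is random stand-in data, not the paper's γ(∞) data bank.

      import numpy as np
      from sklearn.kernel_ridge import KernelRidge
      from sklearn.model_selection import train_test_split
      from sklearn.preprocessing import StandardScaler

      # Stand-in for the real inputs: IL group counts, Abraham solute
      # parameters (E, S, A, B, L) and temperature -> ln(gamma_inf).
      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 10))
      y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=500)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      scaler = StandardScaler().fit(X_tr)

      model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.1)
      model.fit(scaler.transform(X_tr), y_tr)

      rmse = np.sqrt(np.mean((model.predict(scaler.transform(X_te)) - y_te) ** 2))
      print(f"test RMSE: {rmse:.3f}")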

  4. Gas phase NMR and ab initio molecular orbital calculations of 5-methoxy-1,3-dioxanes: a critical survey of the Gauche effect

    Science.gov (United States)

    Abe, Akihiro; Furuya, Hidemine; Ichimura, Noriko; Kawauchi, Susumu

    1997-02-01

    The gas-phase NMR analysis of 5-methoxy-1,3-dioxanes was carried out. The conformational energies estimated from the observed coupling constant data were compared with the results of ab initio MO calculations using the D95+(2df,p) basis set at the MP2 level. While the energy difference between the axial-out and equatorial-out forms was in reasonable agreement, the 1,5 interaction energy between the methoxy methyl and the ring oxygens was not in accord.

  5. Benchmarking Danish Vocational Education and Training Programmes

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Wittrup, Jesper

    This study paper discusses methods whereby Danish vocational education and training colleges can be benchmarked, and presents results from a number of models. It is conceptually complicated to benchmark vocational colleges, as the various colleges in Denmark offer a wide range of course programmes...... attempt to summarise the various effects that the colleges have in two relevant figures, namely retention rates of students and employment rates among students who have completed training programmes....

  6. Aerodynamic Benchmarking of the Deepwind Design

    DEFF Research Database (Denmark)

    Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge;

    2015-01-01

    The aerodynamic benchmarking for the DeepWind rotor is conducted comparing different rotor geometries and solutions and keeping the comparison as fair as possible. The objective for the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize...... NACA airfoil family. (C) 2015 Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license...

  7. Simple Benchmark Specifications for Space Radiation Protection

    Science.gov (United States)

    Singleterry, Robert C. Jr.; Aghara, Sukesh K.

    2013-01-01

    This report defines space radiation benchmark specifications. The specification starts with simple, monoenergetic, mono-directional particles on slabs and progresses to human models in spacecraft. This report specifies the models and sources needed for the benchmarks and what the team performing a benchmark needs to produce in a report. Also included are brief descriptions of how OLTARIS, the NASA Langley website for space radiation analysis, performs its analysis.

  8. Neutrophil and Monocyte CD64 and CD163 Expression in Critically Ill Neonates and Children with Sepsis: Comparison of Fluorescence Intensities and Calculated Indexes

    Directory of Open Access Journals (Sweden)

    Mojca Groselj-Grenc

    2008-01-01

    Full Text Available Objective. To evaluate the expression of CD64 and CD163 on neutrophils and monocytes in SIRS with/without sepsis and to compare the diagnostic accuracy of CD64 and CD163 expression determined as (1) mean fluorescence intensities (MFI) of CD64 and CD163; and (2) the ratio (index) of linearized MFI to the fluorescence signal of standardized beads. Patients and methods. Fifty-six critically ill neonates and children with systemic inflammatory response syndrome (SIRS) and suspected sepsis, classified into two groups: SIRS with sepsis (n=29) and SIRS without sepsis (n=27). Results. CD64 and CD163 MFI measured on neutrophils and monocytes were elevated in patients with SIRS with sepsis. Diagnostic accuracy of indexes was equal to the diagnostic accuracy of MFI for CD64 on neutrophils (0.833 versus 0.854 for day 0 and 0.975 versus 0.983 for day 1) and monocytes (0.811 versus 0.865 for day 0 and 0.825 versus 0.858 for day 1), and for CD163 on neutrophils (0.595 versus 0.655 for day 0 and 0.677 versus 0.750 for day 1), but not for CD163 on monocytes. Conclusion. CD64 MFI, CD163 MFI, CD64 indexes for neutrophils and monocytes, and the CD163 index for neutrophils can all be used for discrimination of SIRS and sepsis in critically ill neonates and children. The CD64 index for neutrophils, however, is superior to all other markers.
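
    The calculated index referred to above is simply the bead-normalized ratio

      \mathrm{index} \;=\; \frac{\mathrm{MFI}_{\mathrm{lin}}(\text{cell population})}{\mathrm{MFI}_{\mathrm{lin}}(\text{standardized beads})}\,,

    intended to make readouts comparable across instruments and measurement days.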

  9. Benchmarking for Cost Improvement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The US Department of Energy's (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.

  10. Several Light Nulcie Evaluations Testing With LLNL Pulsed Sphere Benchmarks

    Institute of Scientific and Technical Information of China (English)

    ZHANG; Huan-yu

    2012-01-01

    In this work, Lawrence Livermore pulsed sphere experiments were modeled using the Monte Carlo N-Particle code (MCNP) for the purpose of benchmarking new releases of nuclear data libraries (CENDL-3[1], ENDF/B-VII.1[2], JENDL-4.0[3]). The program consisted of 12 different spheres, including 6Li, 7Li, Be, C, N, O, LiD, air, H2O, D2O, polythene and teflon. The calculated results were compared to experimental results[4-5].

  11. Synthesis of the OECD/NEA-PSI CFD benchmark exercise

    Energy Technology Data Exchange (ETDEWEB)

    Andreani, Michele, E-mail: Michele.andreani@psi.ch; Badillo, Arnoldo; Kapulla, Ralf

    2016-04-01

    Highlights: • A benchmark exercise on stratification erosion in containment was conducted using a test in the PANDA facility. • Blind calculations were provided by nineteen participants. • Results were compared with experimental data. • A ranking was made. • A large spread of results was observed, with very few simulations providing accurate results for the most important variables, though not for velocities. - Abstract: The third International Benchmark Exercise (IBE-3) conducted under the auspices of OECD/NEA is based on the comparison of blind CFD simulations with experimental data addressing the erosion of a stratified layer by an off-axis buoyant jet in a large vessel. The numerical benchmark exercise is based on a dedicated experiment in the PANDA facility conducted at the Paul Scherrer Institut (PSI) in Switzerland, using only one vessel. The use of non-prototypical fluids (i.e. helium as simulant for hydrogen, and air as simulant for steam), and the consequent absence of the complex physical effects produced by steam condensation enhanced the suitability of the data for CFD validation purposes. The test started with a helium–air layer at the top of the vessel and air in the lower part. The helium-rich layer was gradually eroded by a low-momentum air/helium jet emerging at a lower elevation. Blind calculation results were submitted by nineteen participants, and the calculation results have been compared with the PANDA data. This report, adopting the format of the reports for the two previous exercises, includes a ranking of the contributions, where the largest weight is given to the time progression of the erosion of the helium-rich layer. In accordance with the limited scope of the benchmark exercise, this report is more a collection of comparisons between calculated results and data than a synthesis. Therefore, the few conclusions are based on the mere observation of the agreement of the various submissions with the test result, and do not

  12. National Energy Software Center: benchmark problem book. Revision

    Energy Technology Data Exchange (ETDEWEB)

    None

    1985-12-01

    Computational benchmarks are given for the following problems: (1) finite-difference, diffusion theory calculation of a highly nonseparable reactor; (2) iterative solutions for a multigroup two-dimensional neutron diffusion HTGR problem; (3) reference solution to the two-group diffusion equation; (4) one-dimensional neutron transport transient solutions; (5) a test of the capabilities of multigroup multidimensional kinetics codes in a heavy water reactor; (6) a test of the capabilities of multigroup neutron diffusion codes in an LMFBR; and (7) two-dimensional PWR models.

  13. Critical evaluation of the LDA + U approach for band gap corrections in point defect calculations: The oxygen vacancy in ZnO case study

    Energy Technology Data Exchange (ETDEWEB)

    Boonchun, Adisak; Lambrecht, Walter R.L. [Department of Physics, Case Western Reserve University, Cleveland, OH 444106-7079 (United States)

    2011-05-15

    The LDA + U approach can be used essentially as a shift potential to open up the band gap of a semiconductor. This approach was previously applied to the oxygen vacancy in ZnO by Paudel and Lambrecht (PL) [Phys. Rev. B 77, 205202 (2008)]. Here, we review the results of that approach and introduce additional refinements of the LDA + U model. Good agreement is obtained with recent hybrid functional calculations on the position of the ε(2+/0) transition state. A comparison of various approaches to the oxygen vacancy in ZnO is provided. The relevance of the one-electron levels to the experiments is discussed. (Copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  14. Benchmarking Equity in Transfer Policies for Career and Technical Associate's Degrees

    Science.gov (United States)

    Chase, Megan M.

    2011-01-01

    Using critical policy analysis, this study considers state policies that impede technical credit transfer from public 2-year colleges to 4-year institutions of higher education. The states of Ohio, Texas, Washington, and Wisconsin are considered, and seven policy benchmarks for facilitating the transfer of technical credits are proposed. (Contains…

  15. Benchmarking von Krankenhausinformationssystemen – eine vergleichende Analyse deutschsprachiger Benchmarkingcluster

    Directory of Open Access Journals (Sweden)

    Jahn, Franziska

    2015-08-01

    Full Text Available Benchmarking is a method of strategic information management used by many hospitals today. During the last years, several benchmarking clusters have been established within the German-speaking countries. They support hospitals in comparing and positioning their information system's and information management's costs, performance and efficiency against other hospitals. In order to differentiate between these benchmarking clusters and to provide decision support in selecting an appropriate benchmarking cluster, a classification scheme is developed. The classification scheme observes both general conditions and examined contents of the benchmarking clusters. It is applied to seven benchmarking clusters which have been active in the German-speaking countries within the last years. Currently, performance benchmarking is the most frequent benchmarking type, whereas the observed benchmarking clusters differ in the number of benchmarking partners and their cooperation forms. The benchmarking clusters also deal with different benchmarking subjects. Assessing the costs and quality of application systems, physical data-processing systems, organizational structures of information management and IT service processes are the most frequent benchmarking subjects. There is still potential for further activities within the benchmarking clusters to measure strategic and tactical information management, IT governance and the quality of data and data-processing processes. Based on the classification scheme and the comparison of the benchmarking clusters, we derive general recommendations for benchmarking of hospital information systems.

  16. Benchmarking study and its application for shielding analysis of large accelerator facilities

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee-Seock; Kim, Dong-hyun; Oranj, Leila Mokhtari; Oh, Joo-Hee; Lee, Arim; Jung, Nam-Suk [POSTECH, Pohang (Korea, Republic of)

    2015-10-15

    Shielding analysis is one of the subjects that are indispensable for constructing a large accelerator facility. Several methods, such as Monte Carlo, discrete ordinates, and simplified calculations, have been used for this purpose. The calculation precision can be improved by increasing the number of trials (histories), but accuracy is still a big issue in shielding analysis. To secure accuracy in Monte Carlo calculations, benchmarking studies using experimental data and code-to-code comparisons are fundamental. In this paper, benchmarking results for electrons, protons, and heavy ions are presented, and the proper application of the results is discussed. The benchmarking calculations, which are indispensable in shielding analysis, were performed for different particles: proton, heavy ion and electron. Four different multi-particle Monte Carlo codes, MCNPX, FLUKA, PHITS, and MARS, were examined for the higher energy range corresponding to large accelerator facilities. The degree of agreement between the experimental data, including the SINBAD database, and the calculated results was estimated in terms of secondary neutron production and attenuation through concrete and iron shields. The degree of discrepancy and the features of the Monte Carlo codes were investigated, and the way to apply the benchmarking results is discussed in view of the safety margin and of selecting the code for the shielding analysis. In most cases, the tested Monte Carlo codes give credible results, except for a few limitations of each code.
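
    In such studies the comparison is usually summarized as calculated-to-experimental (C/E) ratios. A minimal sketch, with invented attenuation data, follows.

      import numpy as np

      def c_over_e(calculated, experimental, exp_rel_unc):
          """C/E ratios plus a flag for points within the 1-sigma
          experimental uncertainty band."""
          ratio = np.asarray(calculated) / np.asarray(experimental)
          within = np.abs(ratio - 1.0) <= np.asarray(exp_rel_unc)
          return ratio, within

      # Invented neutron dose attenuation data behind a concrete shield
      calc = [1.02e-3, 3.4e-4, 1.2e-4]    # calculated dose rates
      meas = [1.00e-3, 3.1e-4, 1.4e-4]    # measured dose rates
      print(c_over_e(calc, meas, [0.10, 0.10, 0.15]))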

  17. Neutron Monitors and muon detectors for solar modulation studies: Interstellar flux, yield function, and assessment of critical parameters in count rate calculations

    CERN Document Server

    Maurin, D; Derome, L; Ghelfi, A; Hubert, G

    2014-01-01

    Particle count rates at a given Earth location and altitude result from the convolution of (i) the interstellar (IS) cosmic-ray fluxes outside the solar cavity, (ii) the time-dependent modulation of IS into top-of-atmosphere (TOA) fluxes, (iii) the rigidity cut-off (or geomagnetic transmission function) and grammage at the counter location, (iv) the atmosphere response to incoming TOA cosmic rays (shower development), and (v) the counter response to the various particles/energies in the shower. Count rates from neutron monitors or muon counters are therefore a proxy for solar activity. In this paper, we review all these ingredients, discuss how their uncertainties impact count rate calculations, and how they translate into variations/uncertainties in the level of solar modulation $\phi$ (in the simple Force-Field approximation). The main uncertainty for neutron monitors is related to the yield function. However, many other effects have a significant impact, at the 5-10% level, on $\phi$ values. We find no clear ranking...
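
    For reference, in the force-field approximation mentioned above the TOA flux of a nucleus of charge Z and mass number A is obtained from the IS flux through the single parameter $\phi$, with E the kinetic energy per nucleon and $E_0$ the nucleon rest energy:

      J_{\mathrm{TOA}}(E) \;=\; J_{\mathrm{IS}}(E+\Phi)\,
          \frac{E\,(E+2E_0)}{(E+\Phi)\,(E+\Phi+2E_0)},
      \qquad \Phi \;=\; \frac{|Z|\,e}{A}\,\phi .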

  18. Gauge Couplings calculated from Multiple Point Criticality yield $\\alpha^{-1}=136._8\\pm 9$ : At Last the Elusive Case of $U(1)$

    CERN Document Server

    Bennett, D L

    1996-01-01

    We calculate the $U(1)$ continuum gauge coupling using the values of the action parameters at the multiple point in the phase diagram of a lattice gauge theory. The multiple point is where a maximum number of phases convene. We obtain values for the running inverse fine-structure constant at the Planck scale and the $M_Z$ scale. The gauge group underlying the phase diagram in which we seek multiple point values of the action parameters is what we call the Anti Grand Unified Theory (AGUT) gauge group $SMG^3$, which is the Cartesian product of 3 standard model groups (SMGs). There is one SMG factor for each of the $N_{gen}=3$ generations of quarks and leptons. In our model, this gauge group $SMG^3$ is the predecessor of the usual standard model group. The latter arises as the diagonal subgroup surviving the Planck scale breakdown of $SMG^3$. This breakdown leads to a weakening of the $U(1)$ coupling by a $N_{gen}$-related factor. The most important correction obtained from using multiple point parameter values (in a multi-parameter...

  19. Criticality and shielding calculations for containers in dry of spent fuel of TRIGA Mark III reactor of ININ; Calculos de criticidad y blindaje para contenedores en seco de combustible gastado del reactor Triga Mark III del ININ

    Energy Technology Data Exchange (ETDEWEB)

    Barranco R, F.

    2015-07-01

    In this thesis, criticality and shielding calculations were made to evaluate the design of a container for the dry storage of spent nuclear fuel generated in research reactors. The design of such a container was originally proposed by Argentina and Brazil, and by the Instituto Nacional de Investigaciones Nucleares (ININ) of Mexico. Additionally, it is proposed to modify the design of this container to store the 120 spent fuel elements that are currently in the pool of the TRIGA Mark III reactor at the Nuclear Center of Mexico, and calculations and analyses are made to verify that the arrangement of these fuel elements remains within subcritical limits and that dose rate limits for workers and the general public are not exceeded. These calculations are part of the design criteria for the safety protection systems of the dry storage system (DSS) specified by the Nuclear Regulatory Commission (NRC) of the United States. To carry out these calculations, the Monte Carlo particle transport codes MCNPX and MCNP5 were used. The initial design (design 1) was intended to store 78 spent fuel elements, with a maximum of 115. The ININ has 120 spent fuel elements and 3 control rods (currently stored in the reactor pool). This would require the construction of two containers of the original design, but for economic reasons it was decided to modify the design (design 2) so as to store them in a single container. Criticality calculations were performed for 78, 115 and 124 fresh fuel elements inside the container, for the two arrangements described in Chapter 4, modeling the three-dimensional geometry assuming normal operating and accident conditions. These calculations are focused on demonstrating that the container will remain subcritical, that is, that the effective multiplication factor is less than 1, and in particular not greater than 0.95 (as specified by the NRC). For 78 and 124 spent fuel elements inside the container, both gamma and neutron shielding calculations were simulated for only two cases. First, the actinides and fission products generated

  20. Criteria of benchmark selection for efficient flexible multibody system formalisms

    Directory of Open Access Journals (Sweden)

    Valášek M.

    2007-10-01

    Full Text Available The paper deals with the selection process of benchmarks for testing and comparing efficient flexible multibody formalisms. The existing benchmarks are briefly summarized. The purposes of benchmark selection are investigated. The result of this analysis is the formulation of criteria of benchmark selection for flexible multibody formalisms. Based on these criteria, an initial set of suitable benchmarks is described. Besides that, the evaluation measures are revised and extended.

  1. Benchmarking local healthcare-associated infections: available benchmarks and interpretation challenges.

    Science.gov (United States)

    El-Saed, Aiman; Balkhy, Hanan H; Weber, David J

    2013-10-01

    Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI), which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude) HAI surveillance metrics without accounting or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restriction. The characteristics of recognized benchmarks worldwide, including their advantages and limitations, are described. The choice of the right benchmark for data from the Gulf Cooperation Council (GCC) states is challenging. The chosen benchmark should have similar data collection and presentation methods. Additionally, differences in surveillance environments, including regulations, should be taken into consideration when considering such a benchmark. The GCC center for infection control has taken some steps to unify HAI surveillance systems in the region. GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable healthcare workers and researchers to obtain more accurate and realistic comparisons.

  2. The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example

    Science.gov (United States)

    Steyn, H. J.

    2015-01-01

    Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…

  3. Test Nationally, Benchmark Locally: Using Local DIBELS Benchmarks to Predict Performance on the Pssa

    Science.gov (United States)

    Ferchalk, Matthew R.

    2013-01-01

    The Dynamic Indicators of Basic Early Literacy Skills (DIBELS) benchmarks are frequently used to make important decisions regarding student performance. More information, however, is needed to understand whether the nationally-derived benchmarks created by the DIBELS system provide the most accurate criterion for evaluating reading proficiency. The…

  4. Multi-level Iteration Optimization for Diffusive Critical Calculation

    Institute of Scientific and Technical Information of China (English)

    李云召; 吴宏春; 曹良志; 郑友琦

    2013-01-01

    In nuclear reactor core neutron diffusion calculations there are usually at least three levels of iteration, namely the fission source iteration, the multigroup scattering source iteration and the within-group iteration. Converging the inner iterations extremely tightly wastes computation, while converging them insufficiently tightly can degrade the convergence of the outer iteration. Thus, a common scheme suited to most problems was proposed in this work to find optimized settings automatically. The basic idea is to set the relative error tolerance of an inner iteration in proportion to the observed convergence rate of the corresponding outer iteration. Numerical results for a typical thermal neutron reactor core problem and a fast neutron reactor core problem demonstrate the effectiveness of this algorithm in the variational nodal method code NODAL with the Gauss-Seidel left-preconditioned multigroup GMRES algorithm. The multi-level iteration optimization scheme reduces the number of multigroup and within-group iterations by factors of about 1-2 and 5-21, respectively.
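
    A toy sketch of the idea follows: a fission-source (power) iteration in which the tolerance intended for the inner solver is tied to the observed outer convergence behaviour. The coupling constant and the rate estimate are illustrative choices, not the NODAL/GMRES implementation described above.

      import numpy as np

      def power_iteration_adaptive(A, F, outer_tol=1e-6, c=0.1, max_outer=200):
          """Fission-source iteration for A*phi = (1/k) F*phi. The inner
          solve here is exact for brevity, so `inner_tol` only records the
          schedule the scheme would pass to an iterative inner solver."""
          phi = np.ones(A.shape[0])
          k, err_prev, inner_tol = 1.0, 1.0, 1e-2
          for _ in range(max_outer):
              psi = np.linalg.solve(A, F @ phi) / k          # inner solve
              k_new = k * (F @ psi).sum() / (F @ phi).sum()  # eigenvalue update
              err = abs(k_new - k) / k_new
              if err > 0:
                  # Tighten the inner tolerance in proportion to the
                  # outer error and its decay rate.
                  inner_tol = max(c * err * (err / err_prev), 0.01 * outer_tol)
              phi, k, err_prev = psi / np.linalg.norm(psi), k_new, err
              if err < outer_tol:
                  break
          return k, phi

      # 1-D diffusion-like toy operators (invented constants)
      n = 30
      A = 0.01 * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) + 0.07 * np.eye(n)
      F = 0.079 * np.eye(n)
      print(power_iteration_adaptive(A, F)[0])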

  5. Critical analysis of spectral solvent shifts calculated by the contemporary PCM approaches of a representative series of charge-transfer complexes between tetracyanoethylene and methylated benzenes.

    Science.gov (United States)

    Budzák, Šimon; Mach, Pavel; Medved', Miroslav; Kysel', Ondrej

    2015-07-21

    Applications of contemporary polarisable continuum model (PCM) quantum chemical approaches to account for the solvent shifts of UV-Vis absorption charge transfer (CT) transitions in electron donor-acceptor (EDA) complexes (as well as to account for their stability and other properties in solvents) have been rather rare until now. In this study, we systematically applied different - mainly state-specific - PCM approaches to examine excited state properties, namely, solvatochromic excitation energy shifts in a series of EDA complexes of a tetracyanoethylene (TCNE) acceptor with methyl substituted benzenes with different degrees of methylation N (NMB). For these complexes, representative and reliable experimental data exist both for the gas phase and in solution (dichloromethane). We have found that the linear response (LR) solvent shifts are too small compared to the experimental values, while self-consistent SS approaches give values that are too large. The best agreement with experimental values was obtained by corrected LR (cLR). The transition energies were calculated by means of TD-DFT methodology with PBE0, CAM-B3LYP and M06-2X functionals as well as the wave function CC2 method for the gas phase, and the PCM solvent shifts were added to account for the solvent effects. The best results for transition energies in solvents were obtained using the CC2 method complemented by CAM-B3LYP/cLR for the gas phase transition energy red solvent shift, while all three TD-DFT approaches used gave insufficient values (ca. 50%) of the slope of the dependence of the transition energies on N compared to experimental values.

  6. Features and technology of enterprise internal benchmarking

    Directory of Open Access Journals (Sweden)

    A.V. Dubodelova

    2013-06-01

    Full Text Available The aim of the article. The aim of the article is to generalize the characteristics, objectives and advantages of internal benchmarking, and to formulate the sequence of stages of internal benchmarking technology, focused on continuous improvement of enterprise processes by implementing existing best practices. The results of the analysis. Business activity of domestic enterprises in a crisis business environment has to focus on the best success factors of their structural units, using standard research assessment of their performance and their innovative experience in practice. A modern method of satisfying those needs is internal benchmarking. According to Bain & Co, internal benchmarking is one of the three most common methods of business management. The features and benefits of benchmarking are defined in the article. The sequence and methodology of implementation of the individual stages of benchmarking technology projects are formulated. The authors define benchmarking as a strategic orientation towards the best achievement by comparing performance and working methods with a standard. It covers the processes of research, organization of production and distribution, management and marketing methods with respect to reference objects, in order to identify innovative practices and implement them in a particular business. The development of benchmarking at domestic enterprises requires analysis of its theoretical bases and practical experience. Choosing the best experience helps to develop recommendations for its application in practice. It is also essential to classify its types, identify its characteristics, study appropriate areas of use and develop a methodology of implementation. The structure of internal benchmarking objectives includes: promoting research and the establishment of minimum acceptable levels of efficiency of the processes and activities available at the enterprise; identification of current problems and areas that need improvement without involvement of foreign experience

  7. Toxicological benchmarks for wildlife: 1994 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Opresko, D.M.; Sample, B.E.; Suter, G.W. II

    1994-09-01

    The process by which ecological risks of environmental contaminants are evaluated is two-tiered. The first tier is a screening assessment where concentrations of contaminants in the environment are compared to toxicological benchmarks which represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) that are presumed to be nonhazardous to the surrounding biota. The second tier is a baseline ecological risk assessment where toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. The report presents toxicological benchmarks for assessment of effects of 76 chemicals on 8 representative mammalian wildlife species and 31 chemicals on 9 avian wildlife species. The chemicals are some of those that occur at United States Department of Energy waste sites; the wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. Further descriptions of the chosen wildlife species and chemicals are provided in the report. The benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species. These benchmarks only consider contaminant exposure through oral ingestion of contaminated media; exposure through inhalation or direct dermal exposure are not considered in this report.

  8. Organizational and economic aspects of benchmarking innovative products at the automobile industry enterprises

    Directory of Open Access Journals (Sweden)

    L.M. Taraniuk

    2016-06-01

    Full Text Available The aim of the article. The aim of the article is to determine the nature and characteristics of the use of benchmarking in the activity of domestic enterprises of automobile industry under current economic conditions. The results of the analysis. The article identified the concept of benchmarking, examining the stages of benchmarking, determination the efficiency of benchmarking in work automakers. It is considered the historical aspects of the emergence of benchmarking method in world economics. It is determined the economic aspects of the benchmarking in the work of enterprise automobile industry. The analysis on the stages of benchmarking of innovative products in the modern development of the productive forces and the impact of market factors on the economic activities of companies, including in the enterprise of automobile industry. The attention is focused on the specifics of implementing benchmarking at companies of automobile industry. It is considered statistics number of owners of electric vehicles worldwide. The authors researched market of electric vehicles in Ukraine. Also, it is considered the need of benchmarking using to improve the competitiveness of the national automobile industry especially CJSC “Zaporizhia Automobile Building Plant”. Authors suggested reasonable steps for its improvement. The authors improved methodical approach to assessing the selection of vehicles with the best technical parameters based on benchmarking, which, unlike the existing ones, based on the calculation of the integral factor of technical specifications of vehicles in order to establish better competitive products of companies automobile industry among evaluated. The main indicators of the national production of electric vehicles are shown. Attention is paid to the development of important ways of CJSC “Zaporizhia Automobile Building Plant”, where authors established the aspects that need to pay attention in the management of the

  9. Actinides transmutation - a comparison of results for PWR benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Claro, Luiz H. [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)], e-mail: luizhenu@ieav.cta.br

    2009-07-01

    The physical aspects involved in the partitioning and transmutation (P and T) of minor actinides (MA) and fission products (FP) generated by PWR reactors are of great interest in the nuclear industry. Besides this, the reduction in the storage of radioactive wastes is related to the acceptability of nuclear electric power. Among the several concepts for partitioning and transmutation suggested in the literature, one involves PWR reactors burning fuel containing plutonium and minor actinides reprocessed from the UO{sub 2} used in previous stages. In this work, the results of the calculations of a P and T benchmark carried out with the WIMSD5B program, using its new cross-section library generated from ENDF/B-VII, are presented and compared with the results published in the literature from other calculations. For the comparison, the benchmark transmutation concept based on a typical PWR cell was used, and the analyzed results were k∞ and the atomic densities of the isotopes Np-239, Pu-241, Pu-242 and Am-242m as functions of burnup, considering a discharge of 50 GWd/tHM. (author)

  10. Status of international benchmark experiment for effective delayed neutron fraction ({beta}eff)

    Energy Technology Data Exchange (ETDEWEB)

    Okajima, S.; Sakurai, T.; Mukaiyama, T. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    To improve the prediction accuracy of the {beta}eff, the program of the international benchmark experiment (Beta Effect Reactor Experiment for a New International Collaborative Evaluation: BERNICE) was planned. This program is composed of two parts: BERNICE-MASURCA and BERNICE-FCA. The former was carried out in the fast critical facility MASURCA of CEA, France, between 1993 and 1994. The latter started in the FCA, JAERI, in 1995 and is still ongoing. In these benchmark experiments, various experimental techniques have been applied for in-pile measurements of the {beta}eff. The accuracy of the measurements was better than 3%. (author)

  11. CFD Modeling of Thermal Manikin Heat Loss in a Comfort Evaluation Benchmark Test

    DEFF Research Database (Denmark)

    Nilsson, Håkan O.; Brohus, Henrik; Nielsen, Peter V.

    2007-01-01

    and companies still use several in-house codes for their calculations. The validation and association with human perception and heat losses in reality are consequently very difficult to establish. This paper provides requirements for the design and development of computer manikins and CFD benchmark tests...

  12. Clinically Significant Change to Establish Benchmarks in Residential Drug and Alcohol Treatment Services

    Science.gov (United States)

    Billingham, Daniel D.; Kelly, Peter J.; Deane, Frank P.; Crowe, Trevor P.; Buckingham, Mark S.; Craig, Fiona L.

    2012-01-01

    There is increasing emphasis on the use of routine outcome assessment measures to inform quality assurance initiatives. The calculation of reliable and clinically significant change indices is one strategy that organizations could use to develop both internally and externally focused benchmarking processes. The current study aimed to develop reliable…
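
    The reliable change index referred to above is conventionally computed following Jacobson and Truax. A minimal Python sketch, assuming illustrative pre/post scores, a normative standard deviation and a test-retest reliability (all values invented):

    from math import sqrt

    def reliable_change_index(pre, post, sd_norm, reliability):
        sem = sd_norm * sqrt(1.0 - reliability)   # standard error of measurement
        s_diff = sqrt(2.0 * sem ** 2)             # SE of the difference score
        return (post - pre) / s_diff

    rci = reliable_change_index(pre=28.0, post=16.0, sd_norm=7.5, reliability=0.85)
    print(rci, abs(rci) > 1.96)   # |RCI| > 1.96 is conventionally a reliable change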

  13. Energy benchmarking of South Australian WWTPs.

    Science.gov (United States)

    Krampe, J

    2013-01-01

    Optimising the energy consumption and energy generation of wastewater treatment plants (WWTPs) is a topic with increasing importance for water utilities in times of rising energy costs and pressures to reduce greenhouse gas (GHG) emissions. Assessing the energy efficiency and energy optimisation of a WWTP are difficult tasks as most plants vary greatly in size, process layout and other influencing factors. To overcome these limits it is necessary to compare energy efficiency with a statistically relevant base to identify shortfalls and optimisation potential. Such energy benchmarks have been successfully developed and used in central Europe over the last two decades. This paper demonstrates how the latest available energy benchmarks from Germany have been applied to 24 WWTPs in South Australia. It shows how energy benchmarking can be used to identify shortfalls in current performance, prioritise detailed energy assessments and help inform decisions on capital investment.
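
    The German-style energy benchmarks applied here are typically expressed as kWh per population equivalent (PE) per year, with PE derived from the influent pollutant load. A toy calculation under the commonly used 120 g COD/(PE·d) convention, with invented plant figures:

    annual_energy_kwh = 2_400_000           # metered plant consumption
    cod_load_kg_per_day = 6_000             # mean influent COD load
    pe = cod_load_kg_per_day * 1000 / 120   # population equivalents (COD based)
    kpi = annual_energy_kwh / pe            # kWh/(PE*a)
    print(f"{kpi:.1f} kWh/(PE*a)")          # compared against the benchmark values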

  14. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt;

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks' and the consultancy house's data stays confidential; the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping...
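
    The multiparty computation layer is the paper's contribution and is not reproduced here; the sketch below shows only a conventional, non-private analogue of an LP-based efficiency benchmark — a standard input-oriented DEA model solved with scipy — on invented data. The paper's actual linear program may differ.

    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs, one row per unit
    Y = np.array([[1.0], [1.0], [1.0]])                  # outputs

    def dea_efficiency(k):
        n = X.shape[0]
        c = np.r_[1.0, np.zeros(n)]                      # minimize theta
        A_in = np.c_[-X[k][:, None], X.T]                # sum_j lam_j x_ij <= theta x_ik
        A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]   # sum_j lam_j y_rj >= y_rk
        res = linprog(c, A_ub=np.r_[A_in, A_out],
                      b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                      bounds=[(0, None)] * (n + 1))
        return res.fun                                   # efficiency score theta

    print([round(dea_efficiency(k), 3) for k in range(3)])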

  15. FGK Benchmark Stars A new metallicity scale

    CERN Document Server

    Jofre, Paula; Soubiran, C; Blanco-Cuaresma, S; Pancino, E; Bergemann, M; Cantat-Gaudin, T; Hernandez, J I Gonzalez; Hill, V; Lardo, C; de Laverny, P; Lind, K; Magrini, L; Masseron, T; Montes, D; Mucciarelli, A; Nordlander, T; Recio-Blanco, A; Sobeck, J; Sordo, R; Sousa, S G; Tabernero, H; Vallenari, A; Van Eck, S; Worley, C C

    2013-01-01

    In the era of large spectroscopic surveys of stars of the Milky Way, atmospheric parameter pipelines require reference stars to evaluate and homogenize their values. We provide a new metallicity scale for the FGK benchmark stars based on their corresponding fundamental effective temperature and surface gravity. This was done by analyzing homogeneously with up to seven different methods a spectral library of benchmark stars. Although our direct aim was to provide a reference metallicity to be used by the Gaia-ESO Survey, the fundamental effective temperatures and surface gravities of benchmark stars of Heiter et al. 2013 (in prep) and their metallicities obtained in this work can also be used as reference parameters for other ongoing surveys, such as Gaia, HERMES, RAVE, APOGEE and LAMOST.

  16. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    Professionals are often expected to be reluctant with regard to bureaucratic controls because of the assumed conflicting values and goals of the organization vis-à-vis the profession. We suggest, however, that the provision of bureaucratic benchmarking information is positively associated with professional performance. To test our hypotheses, we rely on a sample of archival public professional performance data for 191 orthopaedics departments of German hospitals matched with survey data on bureaucratic benchmarking information provision to the chief physician of the respective department. Professional performance is publicly disclosed due to regulatory requirements. At the same time, chief physicians typically...

  17. Shielding Integral Benchmark Archive and Database (SINBAD)

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [ORNL; Grove, Robert E [ORNL; Kodeli, I. [International Atomic Energy Agency (IAEA); Sartori, Enrico [ORNL; Gulliford, J. [OECD Nuclear Energy Agency

    2011-01-01

    The Shielding Integral Benchmark Archive and Database (SINBAD) collection of benchmarks was initiated in the early 1990s. SINBAD is an international collaboration between the Organization for Economic Cooperation and Development's Nuclear Energy Agency Data Bank (OECD/NEADB) and the Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL). SINBAD is a major attempt to compile experiments and corresponding computational models with the goal of preserving institutional knowledge and expertise that need to be handed down to future scientists. SINBAD is also a learning tool for university students and scientists who need to design experiments or gain expertise in modeling and simulation. The SINBAD database is currently divided into three categories: fission, fusion, and accelerator benchmarks. Where possible, each experiment is described and analyzed using deterministic or probabilistic (Monte Carlo) radiation transport software.

  18. Benchmarking optimization solvers for structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    The purpose of this article is to benchmark different optimization solvers when applied to various finite element based structural topology optimization problems. An extensive and representative library of minimum compliance, minimum volume, and mechanism design problem instances for different sizes is developed for this benchmarking. The problems are based on a material interpolation scheme combined with a density filter. Different optimization solvers including Optimality Criteria (OC), the Method of Moving Asymptotes (MMA) and its globally convergent version GCMMA, the interior point… The performance profiles conclude that general solvers are as efficient and reliable as classical structural topology optimization solvers. Moreover, the use of exact Hessians in SAND formulations generally produces designs with better objective function values. However, with the benchmarked implementations, solving...

  19. Benchmark Evaluation of the HTR-PROTEUS Absorber Rod Worths (Core 4)

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess; Leland M. Montierth

    2014-06-01

    PROTEUS was a zero-power research reactor at the Paul Scherrer Institute (PSI) in Switzerland. The critical assembly was constructed from a large graphite annulus surrounding a central cylindrical cavity. Various experimental programs were investigated in PROTEUS; during the years 1992 through 1996, it was configured as a pebble-bed reactor and designated HTR-PROTEUS. Various critical configurations were assembled, each accompanied by an assortment of reactor physics experiments including differential and integral absorber rod measurements, kinetics, reaction rate distributions, water ingress effects, and small sample reactivity effects [1]. Four benchmark reports were previously prepared and included in the March 2013 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook) [2] evaluating eleven critical configurations. A summary of that effort was previously provided [3], and an analysis of absorber rod worth measurements for Cores 9 and 10 had been performed prior to this analysis and included in PROTEUS-GCR-EXP-004 [4]. In the current benchmark effort, absorber rod worths measured for Core Configuration 4, which was the only core with a randomly-packed pebble loading, have been evaluated for inclusion as a revision to the HTR-PROTEUS benchmark report PROTEUS-GCR-EXP-002.
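
    For orientation, an absorber rod worth is the reactivity difference between the rod-withdrawn and rod-inserted states, often quoted in dollars by dividing by beta-eff. A minimal sketch with invented keff values:

    k_out, k_in, beta_eff = 1.00120, 0.99650, 0.0072
    rho = lambda k: (k - 1.0) / k                 # reactivity from keff
    worth = rho(k_out) - rho(k_in)                # rod worth in dk/k
    print(f"rod worth = {worth:.5f} dk/k = {worth / beta_eff:.2f} $")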

  20. Calculation method of critical buckling stress for stiffened plate with closed ribs

    Institute of Scientific and Technical Information of China (English)

    张茜; 狄谨; 周绪红

    2012-01-01

    Using the energy method, a calculation method of the critical buckling stress was derived for a four-edge simply supported stiffened plate with closed ribs under unidirectional uniform compression. The influence of the torsional rigidity of the stiffening ribs was considered, and the flexural rigidity of the ribs and the mother plate was calculated from the actual centroid position of the section. Taking the stiffened plates with closed trapezoidal ribs used in the steel box girder of the Sutong (Suzhou-Nantong) Bridge as an example, the critical buckling stresses calculated by the Timoshenko method, the Ichiro Konishi method, the shell finite element method, and the proposed energy method were compared. The analysis shows that when the length-width ratio of the stiffened plate is less than 1, the ratio of critical buckling stress to steel yield strength calculated by the Timoshenko and Ichiro Konishi methods is greater than the value calculated by the energy method; when the length-width ratio is between 1 and 6, the ratios calculated by the Timoshenko and Ichiro Konishi methods are less than the energy method value; and when the length-width ratio is between 3 and 6, the energy method results are closest to the finite element results, with deviations between 9% and 25%. It follows that the energy method is feasible for the elastic stability analysis of the top and bottom plates of orthotropic steel box girders.
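
    For orientation, the classical energy-method result for the simpler case of an unstiffened, isotropic plate, simply supported on four edges under uniaxial compression, reads as follows; the paper's contribution generalizes the plate rigidity to the stiffened orthotropic section, including the torsional rigidity of the closed ribs:

    \[
      \sigma_{\mathrm{cr}} \;=\; \min_{m}\;
      \frac{\pi^{2} D}{b^{2} t}\left(\frac{m b}{a}+\frac{a}{m b}\right)^{2},
      \qquad
      D = \frac{E t^{3}}{12\,(1-\nu^{2})},
    \]

    where a and b are the plate length and width, t the thickness, and m the number of buckle half-waves along the loaded direction.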

  1. Benchmark of the bootstrap current simulation in helical plasmas

    CERN Document Server

    Huang, Botsz; Kanno, Ryutaro; Sugama, Hideo; Goto, Takuya

    2016-01-01

    The importance of parallel momentum conservation in the bootstrap current evaluation in nonaxisymmetric systems is demonstrated by benchmarks among local drift-kinetic equation solvers, i.e., the Zero-Orbit-Width (ZOW), DKES, and PENTA codes. The ZOW model is extended to include the effect of the ion parallel mean flow on the electron-ion parallel friction. Compared to the DKES model, in which only the pitch-angle-scattering term is included in the collision operator, the PENTA model employs the Sugama-Nishimura method to correct the momentum balance. The ZOW and PENTA models agree well with each other on the calculations of the bootstrap current. The DKES results without parallel momentum conservation deviate significantly from those of the ZOW and PENTA models. This work verifies the reliability of the bootstrap current calculation with the ZOW and PENTA models for helical plasmas.

  2. Modelling and Critical Speed Calculation of Large-scale Canned Motor Pump Rotor System

    Institute of Scientific and Technical Information of China (English)

    师名林; 王德忠; 张继革

    2012-01-01

    The lumped parameter model of a large-scale canned motor pump rotor system is established, and the critical speeds and mode shapes are calculated based on the Riccati transfer matrix method. The calculated results show that: (1) the critical speed solution program based on the Riccati transfer matrix method is computationally stable and sufficiently accurate; (2) the critical speed of the rotor system is 1.2 times the design overspeed, so it effectively avoids the operating speed and the design margin is sufficient; (3) in normal operation, the locations of the upper and lower flywheels are the vibration-sensitive parts of the rotor system and should be closely monitored to avoid rubbing against the pressure casing.
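
    A minimal sketch of the underlying transfer-matrix idea (the plain Prohl form rather than the numerically more stable Riccati variant used in the paper): propagate the state vector [y, theta, M, V] through field and point matrices, and scan the speed for a zero of the boundary-condition residual. The rotor below — a pinned-pinned massless shaft with one central disk, all dimensions invented — has the known solution omega^2 = 48*EI/(m*L^3).

    import numpy as np

    def field(l, EI):              # massless elastic shaft segment
        return np.array([[1, l, l**2/(2*EI), l**3/(6*EI)],
                         [0, 1, l/EI,        l**2/(2*EI)],
                         [0, 0, 1,           l],
                         [0, 0, 0,           1]])

    def point(m, omega):           # lumped disk: shear jump m*omega^2*y
        P = np.eye(4)
        P[3, 0] = m * omega**2
        return P

    def residual(omega, segments, masses, EI):
        T = np.eye(4)
        for l, m in zip(segments, masses + [0.0]):
            T = field(l, EI) @ T
            if m:
                T = point(m, omega) @ T
        # pinned-pinned: y = M = 0 at both ends, unknowns theta and V at the left
        return T[0, 1] * T[2, 3] - T[0, 3] * T[2, 1]

    segments, masses, EI = [1.0, 1.0], [1.0], 1.0
    grid = np.linspace(0.1, 10.0, 2000)
    r = [residual(w, segments, masses, EI) for w in grid]
    w_cr = next(w for w, a, b in zip(grid[1:], r, r[1:]) if a * b < 0)
    print(f"first critical speed ~ {w_cr:.3f} rad/s (exact 6**0.5 = 2.449)")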

  3. Benchmarking of the municipalities' disability pension practice

    DEFF Research Database (Denmark)

    Gregersen, Ole

    Every year, the Danish National Social Appeals Board (Den Sociale Ankestyrelse) publishes statistics on decisions in disability pension cases. In connection with the annual statistics, results are published from a benchmarking model in which the number of awards in each municipality is compared with the expected number of awards if the municipality had had the same decision practice as the "average municipality", correcting for the social structure of the municipality. The benchmarking model used to date is documented in Ole Gregersen (1994): Kommunernes Pensionspraksis, Servicerapport, Socialforskningsinstituttet. This note documents a...

  4. Toxicological benchmarks for wildlife: 1996 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W., II

    1996-06-01

    The purpose of this report is to present toxicological benchmarks for assessment of effects of certain chemicals on mammalian and avian wildlife species. Publication of this document meets a milestone for the Environmental Restoration (ER) Risk Assessment Program. This document provides the ER Program with toxicological benchmarks that may be used as comparative tools in screening assessments as well as lines of evidence to support or refute the presence of ecological effects in ecological risk assessments. The chemicals considered in this report are some that occur at US DOE waste sites, and the wildlife species evaluated herein were chosen because they represent a range of body sizes and diets.

  5. Detailed Burnup Calculations for Research Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Leszczynski, F. [Centro Atomico Bariloche (CNEA), 8400 S. C. de Bariloche (Argentina)

    2011-07-01

    tasks for each burnup step: 1) Monte Carlo criticality calculation of the full system, tallying the spatial power distribution for each spatial region of interest. 2) Preparation of the depletion code input and cross-section libraries from the Monte Carlo calculation output with an auxiliary code, including the normalized power density of each spatial zone. The one-group cross-section library needed for the depletion calculations can be obtained as a function of burnup with a cell code such as DRAGON4. 3) Depletion calculation of the isotope concentrations over the given burnup time step. 4) Preparation of the Monte Carlo calculation input with the new isotope concentrations from the depletion calculation output, using another auxiliary program. This sequence is implemented in an automatic way. In the first stages of RRMCQ development, a simplified version was tested with a set of numerical and experimental benchmarks using standard nuclear data libraries at the lattice cell level. A full core model has since been developed and is today used for the RA6 reactor of the Bariloche Atomic Centre. (author)
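
    The four-step sequence reads directly as a coupling loop. The toy Python sketch below mirrors its structure only: a stand-in function plays the role of the Monte Carlo criticality run, and the depletion step integrates dN/dt = A(phi)·N with a matrix exponential; all nuclide data are invented.

    import numpy as np
    from scipy.linalg import expm

    N = np.array([1.0e21, 0.0])          # [fuel, fission products] (toy)
    sigma_f = 5.0e-22                    # one-group fission xs, cm^2 (toy)

    def transport_solve(N):              # stand-in for the Monte Carlo step (1)
        return 1.0e13 * N[0] / 1.0e21    # flux shrinks as the fuel depletes

    for step in range(5):                # automatic burnup-step loop
        phi = transport_solve(N)
        A = np.array([[-sigma_f * phi, 0.0],      # depletion matrix (2)-(3)
                      [ sigma_f * phi, 0.0]])
        N = expm(A * 86400 * 30) @ N              # 30-day burnup step
        print(f"step {step}: fuel = {N[0]:.3e}")  # new concentrations fed back (4)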

  6. TREAT Transient Analysis Benchmarking for the HEU Core

    Energy Technology Data Exchange (ETDEWEB)

    Kontogeorgakos, D. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Connaway, H. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Wright, A. E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-05-01

    This work was performed to support the feasibility study on the potential conversion of the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory from the use of high enriched uranium (HEU) fuel to the use of low enriched uranium (LEU) fuel. The analyses were performed by the GTRI Reactor Conversion staff at the Argonne National Laboratory (ANL). The objective of this study was to benchmark the transient calculations against temperature-limited transients performed in the final operating HEU TREAT core configuration. The MCNP code was used to evaluate steady-state neutronics behavior, and the point kinetics code TREKIN was used to determine core power and energy during transients. The first part of the benchmarking process was to calculate with MCNP all the neutronic parameters required by TREKIN to simulate the transients: the transient rod-bank worth, the prompt neutron generation lifetime, the temperature reactivity feedback as a function of total core energy, and the core-average temperature and peak temperature as functions of total core energy. The results of these calculations were compared against measurements or against reported values as documented in the available TREAT reports. The heating of the fuel was simulated as an adiabatic process. The reported values were extracted from ANL reports, intra-laboratory memos and experiment logsheets, and in some cases it was not clear if the values were based on measurements, on calculations or a combination of both. Therefore, it was decided to use the term “reported” values when referring to such data. The methods and results from the HEU core transient analyses will be used for the potential LEU core configurations to predict the converted (LEU) core’s performance.

  7. World-Wide Benchmarking of ITER Nb3Sn Strand Test Facilities

    CERN Document Server

    Jewell, MC; Takahashi, Yoshikazu; Shikov, Alexander; Devred, Arnaud; Vostner, Alexander; Liu, Fang; Wu, Yu; Jewell, Matthew C; Boutboul, Thierry; Bessette, Denis; Park, Soo-Hyeon; Isono, Takaaki; Vorobieva, Alexandra; Martovetsky, Nicolai; Seo, Kazutaka

    2010-01-01

    The world-wide procurement of Nb3Sn and NbTi for the ITER superconducting magnet systems will involve eight to ten strand suppliers from six Domestic Agencies (DAs) on three continents. To ensure accurate and consistent measurement of the physical and superconducting properties of the composite strand, a strand test facility benchmarking effort was initiated in August 2008. The objectives of this effort are to assess and improve the superconducting strand test and sample preparation technologies at each DA and supplier, in preparation for the more than ten thousand samples that will be tested during ITER procurement. The present benchmarking includes tests for critical current (Ic), n-index, hysteresis loss (Qhys), residual resistivity ratio (RRR), strand diameter, Cu fraction, twist pitch, twist direction, and metal plating thickness (Cr or Ni). Nineteen participants from six parties (China, EU, Japan, South Korea, Russia, and the United States) have participated in the benchmarking. This round, conducted...

  8. Implementation and analysis of ITER strand test of CNDA for world-wide benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Liu Fang, E-mail: fangliu@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Science (ASIPP), Hefei 230031 (China); Long Feng; Liu Bo; Lei Lei; Wu Yu; Liu Huajun [Institute of Plasma Physics, Chinese Academy of Science (ASIPP), Hefei 230031 (China)

    2013-01-15

    Highlights: ► The activities of CNDA for superconducting strand benchmarking are summarized. ► Details of the test facility and process are described and analyzed. ► The test items include room temperature and low temperature measurements. ► The results of the two rounds of benchmarking are shown and the deviation of each item is summarized. - Abstract: According to the International Thermonuclear Experimental Reactor (ITER) Procurement Arrangement (PA) of Cable-In-Conduit Conductor unit lengths for the magnet systems, at the start of process qualification each Domestic Agency (DA) shall be required to conduct a benchmarking of the room and low temperature acceptance tests carried out at the strand suppliers and/or at its reference laboratories designated by the ITER Organization (IO). The first benchmarking round was carried out successfully in 2009 and the second round in 2010. In the first round, Bronze-Route (BR) Nb{sub 3}Sn strand and samples prepared by CERN were sent out to each participant. The second round concerned Internal-Tin (IT) Nb{sub 3}Sn and NbTi strand. The two rounds of benchmarking included tests for critical current, hysteresis loss, residual resistance ratio, strand diameter, Cu fraction, twist pitch, and plating thickness. As the reference laboratory of the Chinese DA (CNDA), the superconducting strand test lab of the Institute of Plasma Physics, Chinese Academy of Sciences (ASIPP) participated in the benchmarking. The feedback from the IO showed good results and high coherence. The test facility and the test results of the CNDA benchmarking are presented in this paper.

  9. Benchmarking an unstructured grid sediment model in an energetic estuary

    Science.gov (United States)

    Lopez, Jesse E.; Baptista, António M.

    2017-02-01

    A sediment model coupled to the hydrodynamic model SELFE is validated against a benchmark combining a set of idealized tests and an application to a field-data rich energetic estuary. After sensitivity studies, model results for the idealized tests largely agree with previously reported results from other models in addition to analytical, semi-analytical, or laboratory results. Results of suspended sediment in an open channel test with fixed bottom are sensitive to turbulence closure and treatment for hydrodynamic bottom boundary. Results for the migration of a trench are very sensitive to critical stress and erosion rate, but largely insensitive to turbulence closure. The model is able to qualitatively represent sediment dynamics associated with estuarine turbidity maxima in an idealized estuary. Applied to the Columbia River estuary, the model qualitatively captures sediment dynamics observed by fixed stations and shipborne profiles. Representation of the vertical structure of suspended sediment degrades when stratification is underpredicted. Across all tests, skill metrics of suspended sediments lag those of hydrodynamics even when qualitatively representing dynamics. The benchmark is fully documented in an openly available repository to encourage unambiguous comparisons against other models.
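
    One widely used choice for the skill metrics mentioned here is the Willmott index of agreement between modeled and observed series; whether this is the exact metric set used in the paper is not stated here, so the sketch below is only illustrative, on invented data.

    import numpy as np

    o = np.array([0.8, 1.1, 1.4, 1.2, 0.9])   # observed suspended sediment
    m = np.array([0.7, 1.0, 1.6, 1.1, 1.0])   # modeled suspended sediment

    def willmott_skill(m, o):
        ob = o.mean()
        return 1 - np.sum((m - o)**2) / np.sum((np.abs(m - ob) + np.abs(o - ob))**2)

    print(f"skill = {willmott_skill(m, o):.3f}")   # 1 means perfect agreement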

  10. The Benchmark Testing of {sup 9}Be of CENDL-3.0

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The {sup 9}Be data of CENDL-3.0 were updated again recently by Prof. Zhang Jing-shang et al. using a new approach. In order to test the reliability of the {sup 9}Be data of CENDL-3.0, some benchmarks were used. In addition to the values of keff, the leakage spectrum of the Be sphere was calculated. The data processing was carried out using the NJOY nuclear data processing code system. The calculations and analyses of

  11. Benchmarking of methods for genomic taxonomy

    DEFF Research Database (Denmark)

    Larsen, Mette Voldby; Cosentino, Salvatore; Lukjancenko, Oksana;

    2014-01-01

    ...Nevertheless, the method has been found to have a number of shortcomings. In the current study, we trained and benchmarked five methods for whole-genome sequence-based prokaryotic species identification on a common data set of complete genomes: (i) SpeciesFinder, which is based on the complete 16S rRNA gene...

  12. Seven Benchmarks for Information Technology Investment.

    Science.gov (United States)

    Smallen, David; Leach, Karen

    2002-01-01

    Offers benchmarks to help campuses evaluate their efforts in supplying information technology (IT) services. The first three help understand the IT budget, the next three provide insight into staffing levels and emphases, and the seventh relates to the pervasiveness of institutional infrastructure. (EV)

  13. Simple benchmark for complex dose finding studies.

    Science.gov (United States)

    Cheung, Ying Kuen

    2014-06-01

    While a general goal of early phase clinical studies is to identify an acceptable dose for further investigation, modern dose finding studies and designs are highly specific to individual clinical settings. In addition, as outcome-adaptive dose finding methods often involve complex algorithms, it is crucial to have diagnostic tools to evaluate the plausibility of a method's simulated performance and the adequacy of the algorithm. In this article, we propose a simple technique that provides an upper limit, or a benchmark, of accuracy for dose finding methods for a given design objective. The proposed benchmark is nonparametric optimal in the sense of O'Quigley et al. (2002, Biostatistics 3, 51-56), and is demonstrated by examples to be a practical accuracy upper bound for model-based dose finding methods. We illustrate the implementation of the technique in the context of phase I trials that consider multiple toxicities and phase I/II trials where dosing decisions are based on both toxicity and efficacy, and apply the benchmark to several clinical examples considered in the literature. By comparing the operating characteristics of a dose finding method to that of the benchmark, we can form quick initial assessments of whether the method is adequately calibrated and evaluate its sensitivity to the dose-outcome relationships.
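
    The nonparametric optimal benchmark of O'Quigley et al. rests on giving each simulated patient a complete tolerance profile: a latent U ~ Uniform(0,1) such that toxicity occurs at dose d exactly when U <= p_d. With complete profiles, every patient contributes an outcome at every dose, which yields the accuracy upper bound. A minimal sketch with an invented dose-toxicity curve:

    import numpy as np

    rng = np.random.default_rng(1)
    p_true = np.array([0.05, 0.12, 0.25, 0.40, 0.55])   # monotone toxicity curve
    target, n = 0.25, 30

    U = rng.uniform(size=n)                  # complete tolerance profiles
    tox = U[:, None] <= p_true[None, :]      # outcome of each patient at every dose
    p_hat = tox.mean(axis=0)
    print("benchmark selects dose", int(np.argmin(np.abs(p_hat - target))) + 1)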

  14. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    Professionals are often expected to be reluctant with regard to bureaucratic controls because of the assumed conflicting values and goals of the organization vis-à-vis the profession. We suggest, however, that the provision of bureaucratic benchmarking information is positively associated with professional performance. Employed professionals will further be more open to considering bureaucratic benchmarking information provided by the administration if they are aware that their professional performance is low. To test our hypotheses, we rely on a sample of archival public professional performance data for 191 orthopaedics departments of German hospitals matched with survey data on bureaucratic benchmarking information provision to the chief physician of the respective department. Professional performance is publicly disclosed due to regulatory requirements. At the same time, chief physicians typically...

  15. Benchmarking Peer Production Mechanisms, Processes & Practices

    Science.gov (United States)

    Fischer, Thomas; Kretschmer, Thomas

    2008-01-01

    This deliverable identifies key approaches for quality management in peer production by benchmarking peer production practices and processes in other areas. (Contains 29 footnotes, 13 figures and 2 tables.) [This report has been authored with contributions of: Kaisa Honkonen-Ratinen, Matti Auvinen, David Riley, Jose Pinzon, Thomas Fischer, Thomas…

  16. Cleanroom Energy Efficiency: Metrics and Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    International SEMATECH Manufacturing Initiative; Mathew, Paul A.; Tschudi, William; Sartor, Dale; Beasley, James

    2010-07-07

    Cleanrooms are among the most energy-intensive types of facilities. This is primarily due to the cleanliness requirements that result in high airflow rates and system static pressures, as well as process requirements that result in high cooling loads. Various studies have shown that there is a wide range of cleanroom energy efficiencies and that facility managers may not be aware of how energy efficient their cleanroom facility can be relative to other cleanroom facilities with the same cleanliness requirements. Metrics and benchmarks are an effective way to compare one facility to another and to track the performance of a given facility over time. This article presents the key metrics and benchmarks that facility managers can use to assess, track, and manage their cleanroom energy efficiency or to set energy efficiency targets for new construction. These include system-level metrics such as air change rates, air handling W/cfm, and filter pressure drops. Operational data are presented from over 20 different cleanrooms that were benchmarked with these metrics and that are part of the cleanroom benchmark dataset maintained by Lawrence Berkeley National Laboratory (LBNL). Overall production efficiency metrics for cleanrooms in 28 semiconductor manufacturing facilities in the United States and recorded in the Fabs21 database are also presented.
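
    The system-level metrics named above reduce to simple arithmetic; a toy example with invented facility numbers:

    airflow_cfm = 120_000          # recirculated airflow
    room_volume_ft3 = 40_000       # cleanroom volume
    fan_power_w = 90_000           # total recirculation fan power

    ach = airflow_cfm * 60 / room_volume_ft3   # air changes per hour
    w_per_cfm = fan_power_w / airflow_cfm      # air handling efficiency metric
    print(f"ACH = {ach:.0f}, {w_per_cfm:.2f} W/cfm")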

  17. Benchmarking 2010: Trends in Education Philanthropy

    Science.gov (United States)

    Bearman, Jessica

    2010-01-01

    "Benchmarking 2010" offers insights into the current priorities, practices and concerns of education grantmakers. The report is divided into five sections: (1) Mapping the Education Grantmaking Landscape; (2) 2010 Funding Priorities; (3) Strategies for Leveraging Greater Impact; (4) Identifying Significant Trends in Education Funding; and (5)…

  18. Benchmark Experiment for Beryllium Slab Samples

    Institute of Scientific and Technical Information of China (English)

    NIE Yang-bo; BAO Jie; HAN Rui; RUAN Xi-chao; REN Jie; HUANG Han-xiong; ZHOU Zu-ying

    2015-01-01

    In order to validate the evaluated nuclear data on beryllium, a benchmark experiment has been performed at the China Institute of Atomic Energy (CIAE). Neutron leakage spectra from pure beryllium slab samples (10 cm×10 cm×11 cm) were measured at 61° and 121° using time-of-

  19. Benchmarking 2011: Trends in Education Philanthropy

    Science.gov (United States)

    Grantmakers for Education, 2011

    2011-01-01

    The analysis in "Benchmarking 2011" is based on data from an unduplicated sample of 184 education grantmaking organizations--approximately two-thirds of Grantmakers for Education's (GFE's) network of grantmakers--who responded to an online survey consisting of fixed-choice and open-ended questions. Because a different subset of funders elects to…

  20. Operational benchmarking of Japanese and Danish hospitals

    DEFF Research Database (Denmark)

    Traberg, Andreas; Itoh, Kenji; Jacobsen, Peter

    2010-01-01

    This benchmarking model is designed as an integration of three organizational dimensions suited for the healthcare sector. The model incorporates posterior operational indicators and evaluates performance upon aggregation. The model is tested on seven cases from Japan and Denmark. Japanese...

  1. Benchmark Generation and Simulation at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Lagadapati, Mahesh [North Carolina State University (NCSU), Raleigh; Mueller, Frank [North Carolina State University (NCSU), Raleigh; Engelmann, Christian [ORNL

    2016-01-01

    The path to extreme scale high-performance computing (HPC) poses several challenges related to power, performance, resilience, productivity, programmability, data movement, and data management. Investigating the performance of parallel applications at scale on future architectures and the performance impact of different architectural choices is an important component of HPC hardware/software co-design. Simulations using models of future HPC systems and communication traces from applications running on existing HPC systems can offer an insight into the performance of future architectures. This work targets technology developed for scalable application tracing of communication events. It focuses on extreme-scale simulation of HPC applications and their communication behavior via lightweight parallel discrete event simulation for performance estimation and evaluation. Instead of simply replaying a trace within a simulator, this work promotes the generation of a benchmark from traces. This benchmark is subsequently exposed to simulation using models to reflect the performance characteristics of future-generation HPC systems. This technique provides a number of benefits, such as eliminating the data intensive trace replay and enabling simulations at different scales. The presented work features novel software co-design aspects, combining the ScalaTrace tool to generate scalable trace files, the ScalaBenchGen tool to generate the benchmark, and the xSim tool to assess the benchmark characteristics within a simulator.

  2. Thermodynamic benchmark study using Biacore technology

    NARCIS (Netherlands)

    Navratilova, I.; Papalia, G.A.; Rich, R.L.; Bedinger, D.; Brophy, S.; Condon, B.; Deng, T.; Emerick, A.W.; Guan, H.W.; Hayden, T.; Heutmekers, T.; Hoorelbeke, B.; McCroskey, M.C.; Murphy, M.M.; Nakagawa, T.; Parmeggiani, F.; Xiaochun, Q.; Rebe, S.; Nenad, T.; Tsang, T.; Waddell, M.B.; Zhang, F.F.; Leavitt, S.; Myszka, D.G.

    2007-01-01

    A total of 22 individuals participated in this benchmark study to characterize the thermodynamics of small-molecule inhibitor-enzyme interactions using Biacore instruments. Participants were provided with reagents (the enzyme carbonic anhydrase II, which was immobilized onto the sensor surface, and

  3. A Benchmark and Simulator for UAV Tracking

    KAUST Repository

    Mueller, Matthias

    2016-09-16

    In this paper, we propose a new aerial video dataset and benchmark for low altitude UAV target tracking, as well as, a photorealistic UAV simulator that can be coupled with tracking methods. Our benchmark provides the first evaluation of many state-of-the-art and popular trackers on 123 new and fully annotated HD video sequences captured from a low-altitude aerial perspective. Among the compared trackers, we determine which ones are the most suitable for UAV tracking both in terms of tracking accuracy and run-time. The simulator can be used to evaluate tracking algorithms in real-time scenarios before they are deployed on a UAV “in the field”, as well as, generate synthetic but photo-realistic tracking datasets with automatic ground truth annotations to easily extend existing real-world datasets. Both the benchmark and simulator are made publicly available to the vision community on our website to further research in the area of object tracking from UAVs. (https://ivul.kaust.edu.sa/Pages/pub-benchmark-simulator-uav.aspx.). © Springer International Publishing AG 2016.

  4. Alberta K-12 ESL Proficiency Benchmarks

    Science.gov (United States)

    Salmon, Kathy; Ettrich, Mike

    2012-01-01

    The Alberta K-12 ESL Proficiency Benchmarks are organized by division: kindergarten, grades 1-3, grades 4-6, grades 7-9, and grades 10-12. They are descriptors of language proficiency in listening, speaking, reading, and writing. The descriptors are arranged in a continuum of seven language competences across five proficiency levels. Several…

  5. Algorithm and Architecture Independent Benchmarking with SEAK

    Energy Technology Data Exchange (ETDEWEB)

    Tallent, Nathan R.; Manzano Franco, Joseph B.; Gawande, Nitin A.; Kang, Seung-Hwa; Kerbyson, Darren J.; Hoisie, Adolfy; Cross, Joseph

    2016-05-23

    Many applications of high performance embedded computing are limited by performance or power bottlenecks. We have designed the Suite for Embedded Applications & Kernels (SEAK), a new benchmark suite, (a) to capture these bottlenecks in a way that encourages creative solutions; and (b) to facilitate rigorous, objective, end-user evaluation for their solutions. To avoid biasing solutions toward existing algorithms, SEAK benchmarks use a mission-centric (abstracted from a particular algorithm) and goal-oriented (functional) specification. To encourage solutions that are any combination of software or hardware, we use an end-user black-box evaluation that can capture tradeoffs between performance, power, accuracy, size, and weight. The tradeoffs are especially informative for procurement decisions. We call our benchmarks future proof because each mission-centric interface and evaluation remains useful despite shifting algorithmic preferences. It is challenging to create both concise and precise goal-oriented specifications for mission-centric problems. This paper describes the SEAK benchmark suite and presents an evaluation of sample solutions that highlights power and performance tradeoffs.

  6. A human benchmark for language recognition

    NARCIS (Netherlands)

    Orr, R.; Leeuwen, D.A. van

    2009-01-01

    In this study, we explore a human benchmark in language recognition, for the purpose of comparing human performance to machine performance in the context of the NIST LRE 2007. Humans are categorised in terms of language proficiency, and performance is presented per proficiency. The main challenge in

  7. Benchmarking Year Five Students' Reading Abilities

    Science.gov (United States)

    Lim, Chang Kuan; Eng, Lin Siew; Mohamed, Abdul Rashid

    2014-01-01

    Reading and understanding a written text is one of the most important skills in English learning. This study attempts to benchmark Year Five students' reading abilities in fifteen rural schools in a district in Malaysia. The objectives of this study are to develop a set of standardised written reading comprehension and a set of indicators to inform…

  8. Benchmarking European Gas Transmission System Operators

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter; Trinkner, Urs

    This is the final report for the pan-European efficiency benchmarking of gas transmission system operations commissioned by the Netherlands Authority for Consumers and Markets (ACM), Den Haag, on behalf of the Council of European Energy Regulators (CEER) under the supervision of the authors....

  9. Automatic generation of reaction energy databases from highly accurate atomization energy benchmark sets.

    Science.gov (United States)

    Margraf, Johannes T; Ranasinghe, Duminda S; Bartlett, Rodney J

    2017-03-31

    In this contribution, we discuss how reaction energy benchmark sets can automatically be created from arbitrary atomization energy databases. As an example, over 11 000 reaction energies derived from the W4-11 database, as well as some relevant subsets are reported. Importantly, there is only very modest computational overhead involved in computing >11 000 reaction energies compared to 140 atomization energies, since the rate-determining step for either benchmark is performing the same 140 quantum chemical calculations. The performance of commonly used electronic structure methods for the new database is analyzed. This allows investigating the relationship between the performances for atomization and reaction energy benchmarks based on an identical set of molecules. The atomization energy is found to be a weak predictor for the overall usefulness of a method. The performance of density functional approximations in light of the number of empirically optimized parameters used in their design is also discussed.
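
    The bookkeeping that makes this automatic generation cheap: for a balanced reaction the free-atom energies cancel, so the reaction energy follows directly from atomization energies, dE_rxn = sum AE(reactants) - sum AE(products). A minimal sketch with rough illustrative AE values (not taken from W4-11):

    AE = {"H2": 104.2, "O2": 118.0, "H2O": 219.3}   # atomization energies, kcal/mol

    def reaction_energy(reactants, products):
        """reactants/products: dict of species -> stoichiometric coefficient."""
        return (sum(n * AE[s] for s, n in reactants.items())
                - sum(n * AE[s] for s, n in products.items()))

    # 2 H2 + O2 -> 2 H2O: negative means the products are more strongly bound
    print(reaction_energy({"H2": 2, "O2": 1}, {"H2O": 2}), "kcal/mol")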

  10. Electricity consumption in school buildings - benchmark and web tools; Elforbrug i skoler - benchmark og webvaerktoej

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    The aim of this project has been to produce benchmarks for electricity consumption in Danish schools in order to encourage electricity conservation. An internet programme has been developed with the aim of facilitating schools' access to benchmarks and to evaluate energy consumption. The overall purpose is to create increased attention to the electricity consumption of each separate school by publishing benchmarks which take the schools' age and number of pupils as well as after school activities into account. Benchmarks can be used to make green accounts and work as markers in e.g. energy conservation campaigns, energy management and for educational purposes. The internet tool can be found on www.energiguiden.dk. (BA)

  11. ORSPHERE: CRITICAL, BARE, HEU(93.2)-METAL SPHERE

    Energy Technology Data Exchange (ETDEWEB)

    Margaret A. Marshall

    2013-09-01

    In the early 1970s Dr. John T. Mihalczo (team leader), J.J. Lynn, and J.R. Taylor performed experiments at the Oak Ridge Critical Experiments Facility (ORCEF) with highly enriched uranium (HEU) metal (called Oak Ridge Alloy or ORALLOY) in an attempt to recreate GODIVA I results with greater accuracy than those performed at Los Alamos National Laboratory in the 1950s (HEU-MET-FAST-001). The purpose of the Oak Ridge ORALLOY Sphere (ORSphere) experiments was to estimate the unreflected and unmoderated critical mass of an idealized sphere of uranium metal corrected to a density, purity, and enrichment such that it could be compared with the GODIVA I experiments. “The very accurate description of this sphere, as assembled, establishes it as an ideal benchmark for calculational methods and cross-section data files.” (Reference 1) While performing the ORSphere experiments care was taken to accurately document component dimensions (±0.0001 in. for non-spherical parts), masses (±0.01 g), and material data. The experiment was also set up to minimize the amount of structural material in the sphere proximity. A three part sphere was initially assembled with an average radius of 3.4665 in. and was then machined down to an average radius of 3.4420 in. (3.4425 in. nominal). These two spherical configurations were evaluated and judged to be acceptable benchmark experiments; however, the two experiments are highly correlated.

  12. CriticalEd

    DEFF Research Database (Denmark)

    Kjellberg, Caspar Mølholt; Meredith, David

    2014-01-01

    , consisting of a Sibelius plug-in, a cross-platform application, called CriticalEd, and a REST-based solution, which handles data storage/retrieval. A prototype has been tested at the Danish Centre for Music Publication, and the results suggest that the system could greatly improve the efficiency...

  13. Uncertainty Analysis for OECD-NEA-UAM Benchmark Problem of TMI-1 PWR Fuel Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Hyuk; Kim, S. J.; Seo, K.W.; Hwang, D. H. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Quantification of code uncertainty is one of the main questions continually asked by regulatory bodies such as KINS. Utilities and code developers address the issue case by case because a general answer to the question is still open. Under these circumstances, the OECD-NEA has built global consensus on uncertainty quantification through the UAM benchmark program. OECD-NEA benchmark II-2 is a problem on the uncertainty quantification of subchannel codes: the uncertainties of the fuel temperature and the ONB location in the TMI-1 fuel assembly are estimated under transient and steady-state conditions. In this study, the uncertainty quantification of the MATRA code is performed for this problem. A workbench platform was developed to produce the large set of inputs needed to estimate the uncertainty for the benchmark problem, and direct Monte Carlo sampling is used for random sampling from the sample PDFs. The uncertainty analysis of MATRA on the OECD-NEA benchmark II-2 problem was performed with the developed tool to quantify the uncertainty of the code: direct Monte Carlo sampling was used to extract 2000 random parameter sets, and the workbench program generated the input files and post-processed the calculation results. The uncertainty propagated from the input parameters was estimated for the DNBR and for the cladding and coolant temperatures.
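
    A minimal sketch of the direct Monte Carlo sampling workflow described above — draw the input sets from assumed PDFs, run the model once per set, and summarize the output spread. The surrogate model and its parameters are stand-ins, not MATRA:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 2000                                  # sample size, as in the study
    power = rng.normal(1.00, 0.02, n)         # relative power (assumed PDF)
    flow = rng.normal(1.00, 0.03, n)          # relative flow (assumed PDF)

    def model(p, f):                          # toy surrogate for a DNBR-like output
        return 2.0 * f / p

    out = model(power, flow)
    lo, hi = np.percentile(out, [2.5, 97.5])
    print(f"mean {out.mean():.3f}, 95% band [{lo:.3f}, {hi:.3f}]")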

  14. Benchmark 1 - Failure Prediction after Cup Drawing, Reverse Redrawing and Expansion Part A: Benchmark Description

    Science.gov (United States)

    Watson, Martin; Dick, Robert; Huang, Y. Helen; Lockley, Andrew; Cardoso, Rui; Santos, Abel

    2016-08-01

    This Benchmark is designed to predict the fracture of a food can after drawing, reverse redrawing and expansion. The aim is to assess different sheet metal forming difficulties such as plastic anisotropic earing and failure models (strain and stress based Forming Limit Diagrams) under complex nonlinear strain paths. To study these effects, two distinct materials, TH330 steel (unstoved) and AA5352 aluminum alloy are considered in this Benchmark. Problem description, material properties, and simulation reports with experimental data are summarized.

  15. The ACRV Picking Benchmark (APB): A Robotic Shelf Picking Benchmark to Foster Reproducible Research

    OpenAIRE

    Leitner, Jürgen; Tow, Adam W.; Dean, Jake E.; Suenderhauf, Niko; Durham, Joseph W.; Cooper, Matthew; Eich, Markus; Lehnert, Christopher; Mangels, Ruben; McCool, Christopher; Kujala, Peter; Nicholson, Lachlan; Van Pham, Trung; Sergeant, James; Wu, Liao

    2016-01-01

    Robotic challenges like the Amazon Picking Challenge (APC) or the DARPA Challenges are an established and important way to drive scientific progress. They make research comparable on a well-defined benchmark with equal test conditions for all participants. However, such challenge events occur only occasionally, are limited to a small number of contestants, and the test conditions are very difficult to replicate after the main event. We present a new physical benchmark challenge for robotic pi...

  16. Dynamic Modeling and Critical Speed Calculation of Gas Turbine Rod Fastening Rotor

    Institute of Scientific and Technical Information of China (English)

    王少波; 孟成; 苏明

    2013-01-01

    Rod fastening rotors are usually used in heavy-duty gas turbine rotor-support systems, whose critical speed calculation differs from that of an integral rotor. Within the framework of the Riccati transfer matrix method, the central tie-rod fastening rotor structure of a gas turbine was analyzed and discretized. Taking into account the effects of the bearing support and the meshing of the face teeth on the dynamic characteristics of the rotor system, a computational model of the rotor-support system was established. With this model, the critical speeds and the corresponding mode shapes were obtained. The calculated results show good agreement with the test measurements, which implies that the method is accurate and the computational model is reliable. This approach can also be applied to analyze the dynamic characteristics of rotors with similar structures.

  17. Generation IV benchmarking of TRISO fuel performance models under accident conditions: Modeling input data

    Energy Technology Data Exchange (ETDEWEB)

    Collin, Blaise P. [Idaho National Laboratory (INL), Idaho Falls, ID (United States)

    2014-09-01

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release; the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, hereafter named NCC (Numerical Calculation Case), is derived from "Case 5" of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. "Case 5" of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to "effects of the numerical calculation method rather than the physical model" [IAEA 2012]. The NCC is therefore intended to check whether these numerical effects persist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison

  18. Energy saving in WWTP: Daily benchmarking under uncertainty and data availability limitations.

    Science.gov (United States)

    Torregrossa, D; Schutz, G; Cornelissen, A; Hernández-Sancho, F; Hansen, J

    2016-07-01

    Efficient management of Waste Water Treatment Plants (WWTPs) can produce significant environmental and economic benefits. Energy benchmarking can be used to compare WWTPs, identify targets and use these to improve their performance. Different authors have performed benchmark analyses on a monthly or yearly basis, but their approaches suffer from a time lag between an event, its detection, interpretation and potential actions. The availability of on-line measurement data at many WWTPs should theoretically enable the management response time to be reduced by daily benchmarking. Unfortunately this approach is often impossible because of limited data availability. This paper proposes a methodology to perform a daily benchmark analysis under database limitations. The methodology has been applied to the Energy Online System (EOS) developed in the framework of the project "INNERS" (INNovative Energy Recovery Strategies in the urban water cycle). EOS calculates a set of Key Performance Indicators (KPIs) for the evaluation of energy and process performance. In EOS, the energy KPIs take the pollutant load into consideration in order to enable comparison between different plants. For example, EOS does not analyse the energy consumption as such but the energy consumption per pollutant load. This approach enables the comparison of performance for plants with different loads or for a single plant under different load conditions. The energy consumption is measured by on-line sensors, while the pollutant load is measured in the laboratory approximately every 14 days. Consequently, the unavailability of the water quality parameters is the limiting factor in calculating the energy KPIs. In this paper, in order to overcome this limitation, the authors have developed a methodology to estimate the required parameters and manage the uncertainty in the estimation. By coupling the parameter estimation with an interval based benchmark approach, the authors propose an effective, fast and reproducible
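
    The limiting step described above — a daily energy KPI that needs a pollutant load measured only every ~14 days — invites a simple estimator. The sketch below uses plain linear interpolation between lab samples as a stand-in; the project's actual estimator and its uncertainty handling are more elaborate. All data are invented.

    import numpy as np

    days = np.arange(28)
    lab_days = np.array([0, 14, 27])                # COD sampled ~every 14 days
    cod_kg = np.array([5800.0, 6300.0, 6000.0])     # lab-measured daily loads
    energy_kwh = 9000 + 200 * np.sin(days / 4.0)    # on-line metered, daily

    cod_daily = np.interp(days, lab_days, cod_kg)   # estimate the missing loads
    kpi = energy_kwh / cod_daily                    # daily kWh per kg COD
    print(f"KPI range: {kpi.min():.2f}-{kpi.max():.2f} kWh/kgCOD")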

  19. Effects of Exposure Imprecision on Estimation of the Benchmark Dose

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe

    Environmental epidemiology; exposure measurement error; effect of prenatal mercury exposure; exposure standards; benchmark dose
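
    The benchmark dose at issue here is the dose at which a fitted dose-response curve exceeds the background risk by a chosen benchmark response (BMR). A toy logistic model stands in for the real exposure-response curve; the paper's point is that exposure measurement error biases exactly this quantity.

    import numpy as np
    from scipy.optimize import brentq

    def p(dose, b0=-3.0, b1=0.4):            # toy logistic dose-response model
        return 1.0 / (1.0 + np.exp(-(b0 + b1 * dose)))

    bmr = 0.05                               # extra risk over background

    def extra_risk(d):
        return (p(d) - p(0)) / (1 - p(0)) - bmr

    bmd = brentq(extra_risk, 0.0, 50.0)      # dose where extra risk hits the BMR
    print(f"BMD at BMR={bmr:.0%}: {bmd:.2f} dose units")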

  20. Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators

    Directory of Open Access Journals (Sweden)

    Zaharchenko Lolita A.

    2013-12-01

    Full Text Available The article analyses the evolution of the development of benchmarking and the possibilities of its application in the telecommunication sphere. It examines the essence of benchmarking by generalising the approaches of different scholars to the definition of this notion. In order to improve the activity of telecommunication operators, the article identifies the benchmarking technology and the main factors that determine the success of an operator in the modern market economy, as well as the mechanism of benchmarking and the component stages of carrying out benchmarking by a telecommunication operator. It analyses the telecommunication market and identifies the dynamics of its development and tendencies in the changing composition of telecommunication operators and providers. Generalising the existing experience of benchmarking application, the article identifies the main types of benchmarking of telecommunication operators by the following features: by the level of conduct (branch, inter-branch and international benchmarking); by relation to participation in the conduct (competitive and joint); and with respect to the enterprise environment (internal and external).