WorldWideScience

Sample records for fuel lattices benchmark

  1. Non-judgemental Dynamic Fuel Cycle Benchmarking

    CERN Document Server

    Scopatz, Anthony Michael

    2015-01-01

    This paper presents a new fuel cycle benchmarking analysis methodology by coupling Gaussian process regression, a popular technique in Machine Learning, to dynamic time warping, a mechanism widely used in speech recognition. Together they generate figures-of-merit that are applicable to any time series metric that a benchmark may study. The figures-of-merit account for uncertainty in the metric itself, utilize information across the whole time domain, and do not require that the simulators use a common time grid. Here, a distance measure is defined that can be used to compare the performance of each simulator for a given metric. Additionally, a contribution measure is derived from the distance measure that can be used to rank order the importance of fuel cycle metrics. Lastly, this paper warns against using standard signal processing techniques for error reduction. This is because it is found that error reduction is better handled by the Gaussian process regression itself.
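
    As a rough illustration of the coupling described above (a minimal sketch under assumed data, not the paper's implementation), each simulator's metric time series can be smoothed with a Gaussian process regression on its own time grid, and the smoothed curves can then be compared with a dynamic time warping distance, so no common time grid is required:

```python
# Minimal sketch: smooth each simulator's metric with Gaussian process
# regression, then compare the smoothed curves with dynamic time warping.
# Only numpy and scikit-learn are assumed; the time series are made up.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def smooth_metric(t, y, t_eval):
    """Fit a GP to one simulator's metric and evaluate it on a dense grid."""
    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gpr.fit(t.reshape(-1, 1), y)
    return gpr.predict(t_eval.reshape(-1, 1))

def dtw_distance(a, b):
    """Classic O(n*m) dynamic-time-warping distance between two curves."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Two simulators reporting the same metric on different time grids (fake data).
rng = np.random.default_rng(0)
t1, y1 = np.linspace(0, 100, 40), rng.random(40)
t2, y2 = np.linspace(0, 100, 55), rng.random(55)
t_eval = np.linspace(0, 100, 200)
m1 = smooth_metric(t1, y1, t_eval)
m2 = smooth_metric(t2, y2, t_eval)
print("DTW distance between smoothed metrics:", dtw_distance(m1, m2))
```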

  2. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, including pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increased sensitivity of the largest contributing reaction, the neutron capture reaction 238U(n,γ), caused by Doppler broadening. In addition, three types of fuel material compositions (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in the uncertainty of kinf was observed for the MOX fuel: it was nearly twice the corresponding value for UOX fuel. The 238U reactions, mainly inelastic scattering (n,n′), contributed the most to the uncertainties in the MOX fuel, which shifts the neutron spectrum to higher energies compared to the UOX fuel.
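
    As a schematic illustration of the kind of uncertainty propagation performed by sensitivity/uncertainty tools such as TSUNAMI (this is not SCALE code; the sensitivity and covariance arrays below are invented), the relative variance of kinf follows from the first-order "sandwich rule", var(k)/k^2 = S^T C S, where S holds group-wise relative sensitivities and C is the relative covariance matrix of the corresponding cross sections:

```python
# Schematic sketch of the first-order "sandwich rule" used by sensitivity/
# uncertainty tools such as TSUNAMI (not SCALE code): the relative variance
# of k-inf is S^T C S for a group-wise sensitivity vector S and relative
# covariance matrix C. All arrays below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_groups = 44                      # hypothetical group structure

# Hypothetical sensitivity profile of k-inf to 238U(n,gamma), per group.
S = -1e-3 * rng.random(n_groups)

# Hypothetical relative covariance matrix for that reaction (symmetric PSD).
A = 0.02 * rng.random((n_groups, n_groups))
C = A @ A.T

rel_var = S @ C @ S                # sandwich rule: var(k)/k^2 = S^T C S
print(f"relative uncertainty in k-inf: {np.sqrt(rel_var) * 100:.3f} %")
```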

  3. Criticality benchmark guide for light-water-reactor fuel in transportation and storage packages

    Energy Technology Data Exchange (ETDEWEB)

    Lichtenwalter, J.J.; Bowman, S.M.; DeHart, M.D.; Hopper, C.M.

    1997-03-01

    This report is designed as a guide for performing criticality benchmark calculations for light-water-reactor (LWR) fuel applications. The guide provides documentation of 180 criticality experiments with geometries, materials, and neutron interaction characteristics representative of transportation packages containing LWR fuel or uranium oxide pellets or powder. These experiments should benefit the U.S. Nuclear Regulatory Commission (NRC) staff and licensees in the validation of computational methods used for LWR fuel storage and transportation applications. The experiments are classified by key parameters such as enrichment, water/fuel volume, hydrogen-to-fissile ratio (H/X), and lattice pitch. Groups of experiments with common features such as separator plates, shielding walls, and soluble boron are also identified. In addition, a sample validation using these experiments and a statistical analysis of the results are provided. Recommendations for selecting suitable experiments and for determining calculational bias and uncertainty are presented as part of this benchmark guide.
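
    A minimal sketch of the kind of validation statistics such a guide describes is given below (the workflow is an assumed, simplified one with made-up k-eff results, not the report's statistical method):

```python
# Illustrative sketch (assumed workflow, not the report's method): estimate a
# calculational bias and an uncertainty from a set of benchmark criticality
# calculations, where each experiment is a critical configuration (k_exp = 1).
import numpy as np

# Hypothetical calculated k-eff values for a suite of benchmark experiments.
k_calc = np.array([0.9962, 0.9987, 1.0011, 0.9978, 0.9943, 0.9990])
k_exp = np.ones_like(k_calc)       # benchmarks are critical configurations

bias = np.mean(k_calc - k_exp)     # negative bias means the code underpredicts
bias_uncertainty = np.std(k_calc - k_exp, ddof=1)

# A simple (non-statistical-tolerance) upper subcritical limit, for illustration:
admin_margin = 0.05
usl = 1.0 + min(bias, 0.0) - bias_uncertainty - admin_margin
print(f"bias = {bias:+.4f}, sigma = {bias_uncertainty:.4f}, USL = {usl:.4f}")
```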

  4. Fuel lattice design using heuristics and new strategies

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz S, J. J.; Castillo M, J. A.; Torres V, M.; Perusquia del Cueto, R. [ININ, Carretera Mexico-Toluca s/n, Ocoyoacac 52750, Estado de Mexico (Mexico); Pelta, D. A. [ETS Ingenieria Informatica y Telecomunicaciones, Universidad de Granada, Daniel Saucedo Aranda s/n, 18071 Granada (Spain); Campos S, Y., E-mail: juanjose.ortiz@inin.gob.m [IPN, Escuela Superior de Fisica y Matematicas, Unidad Profesional Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico)

    2010-10-15

    This work shows some results of fuel lattice design for BWRs when certain pin allocation rules are not taken into account. Heuristic techniques such as Path Relinking and greedy search were used to design the fuel lattices. The scope of this work is to examine how the classical fuel lattice design rules affect the results of the heuristic techniques and the quality of the resulting fuel lattices. Lattice quality is measured by the power peaking factor and the infinite multiplication factor at the beginning of the fuel lattice life; the CASMO-4 code was used to calculate these parameters. The analyzed rules are the following: pin rods with the lowest uranium enrichment are allocated only in the fuel lattice corners, and pin rods with gadolinium cannot be allocated on the fuel lattice edge. Fuel lattices with and without gadolinium on the main diagonal were studied. Some fuel lattices were simulated in an equilibrium-cycle fuel reload using SIMULATE-3 to verify their performance, so that the effective multiplication factor and thermal limits could be checked. The results show good performance for some of the designed fuel lattices even though the known rules were not enforced. An analysis of fuel lattice performance and design characteristics was carried out. The tests were run on a Dell workstation under Linux. (Author)
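
    A toy illustration of a greedy placement heuristic of this kind is sketched below (illustrative only: the paper couples Path Relinking and greedy search with CASMO-4 evaluations, whereas here a crude importance map stands in for the lattice physics, and all numbers are assumed):

```python
# Toy sketch of a greedy pin-placement heuristic (illustrative only). Pins with
# higher enrichment are placed where an assumed importance map is low, as a
# crude proxy for flattening the local power peaking factor.
import numpy as np

rng = np.random.default_rng(1)
N = 10                                         # 10x10 BWR lattice
importance = rng.random((N, N))                # stand-in for pin power weighting
enrichments = np.sort(rng.choice([1.6, 2.4, 3.2, 3.95, 4.4], size=N * N))

lattice = np.empty((N, N))
# Greedy rule: highest enrichment goes to the position with the lowest importance.
order = np.argsort(importance, axis=None)      # positions from low to high weight
for pos, enr in zip(order, enrichments[::-1]):
    lattice[np.unravel_index(pos, (N, N))] = enr

peaking_proxy = (importance * lattice).max() / (importance * lattice).mean()
print(f"proxy power peaking factor: {peaking_proxy:.3f}")
```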

  5. Reactor based plutonium disposition - physics and fuel behaviour benchmark studies of an OECD/NEA experts group

    Energy Technology Data Exchange (ETDEWEB)

    D' Hondt, P. [SCK.CEN, Mol (Belgium); Gehin, J. [ORNL, Oak Ridge, TN (United States); Na, B.C.; Sartori, E. [Organisation for Economic Co-Operation and Development, Nuclear Energy Agency, 92 - Issy les Moulineaux (France); Wiesenack, W. [Organisation for Economic Co-Operation and Development/HRP, Halden (Norway)

    2001-07-01

    One of the options envisaged for disposing of weapons-grade plutonium, declared surplus for national defence in the Russian Federation and the USA, is to burn it in nuclear power reactors. The scientific/technical know-how accumulated in the use of MOX as a fuel for electricity generation is of great relevance for the plutonium disposition programmes. An Expert Group of the OECD/NEA is carrying out a series of benchmarks with the aim of facilitating the use of this know-how for meeting this objective. This paper describes the background that led to establishing the Expert Group, and the present status of results from these benchmarks. The benchmark studies cover a theoretical reactor physics benchmark on a VVER-1000 core loaded with MOX, two experimental benchmarks on MOX lattices and a benchmark concerned with MOX fuel behaviour for both solid and hollow pellets. First conclusions are outlined as well as future work. (author)

  6. Melcor benchmarking against integral severe fuel damage tests

    Energy Technology Data Exchange (ETDEWEB)

    Madni, I.K. [Brookhaven National Lab., Upton, NY (United States)

    1995-09-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the U.S. Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC to provide independent assessment of MELCOR, and a very important part of this program is to benchmark MELCOR against experimental data from integral severe fuel damage tests and against predictions of those data from more mechanistic codes such as SCDAP or SCDAP/RELAP5. Benchmarking analyses with MELCOR have been carried out at BNL for five integral severe fuel damage tests, including PBF SFD 1-1, SFD 1-4, and NRU FLHT-2. This paper summarizes these analyses and their role in identifying areas of modeling strengths and weaknesses in MELCOR.

  7. Benchmarking Data for the Proposed Signature of Used Fuel Casks

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, Eric Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-23

    A set of benchmarking measurements to test facets of the proposed extended storage signature was conducted on May 17, 2016. The measurements were designed to test the overall concept of how the proposed signature can be used to identify a used fuel cask based only on the distribution of neutron sources within the cask. To simulate the distribution, four Cf-252 sources were chosen and arranged on a 3x3 grid in 3 different patterns, and raw total neutron counts were taken at 6 locations around the grid. This is a very simplified test of the typical geometry previously studied in simulations with simulated used nuclear fuel.

  8. Benchmarking criticality analysis of TRIGA fuel storage racks.

    Science.gov (United States)

    Robinson, Matthew Loren; DeBey, Timothy M; Higginbotham, Jack F

    2017-01-01

    A criticality analysis was benchmarked against sub-criticality measurements of the hexagonal fuel storage racks at the United States Geological Survey TRIGA MARK I reactor in Denver. These racks, which hold up to 19 fuel elements each, are arranged at 0.61 m (2 ft) spacings around the outer edge of the reactor. A three-dimensional model of the racks was created in MCNP5, and the model was verified experimentally by comparison with measured subcritical multiplication data collected during an approach-to-critical loading of two of the racks. The validated model was then used to show that, in the extreme condition where the entire circumference of the pool is lined with racks loaded with used fuel, the storage array remains subcritical with a k value of about 0.71, well below the regulatory limit of 0.8. A model was also constructed of the rectangular 2×10 fuel storage array used in many other TRIGA reactors to validate the technique against the original TRIGA licensing sub-critical analysis performed in 1966. The fuel used in this study was standard 20% enriched (LEU) aluminum- or stainless-steel-clad TRIGA fuel.
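
    For context, subcritical multiplication data of this kind are typically reduced with a 1/M (inverse multiplication) plot; a small sketch with hypothetical count data (not the USGS measurements) is:

```python
# Illustrative sketch of an approach-to-critical (1/M) extrapolation, the kind
# of subcritical multiplication data used to validate the rack model. The
# numbers below are hypothetical, not the USGS TRIGA measurements.
import numpy as np

fuel_elements = np.array([0, 4, 8, 12, 16, 19])           # elements loaded
counts = np.array([120., 150., 200., 290., 520., 900.])   # detector counts

inverse_m = counts[0] / counts                 # 1/M relative to the source level
# Linear extrapolation of 1/M to zero predicts the critical loading.
slope, intercept = np.polyfit(fuel_elements[1:], inverse_m[1:], 1)
predicted_critical = -intercept / slope
print(f"extrapolated critical loading: {predicted_critical:.1f} elements")
```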

  9. Description and results of a two-dimensional lattice physics code benchmark for the Canadian Pressure Tube Supercritical Water-cooled Reactor (PT-SCWR)

    Energy Technology Data Exchange (ETDEWEB)

    Hummel, D.W.; Langton, S.E.; Ball, M.R.; Novog, D.R.; Buijs, A., E-mail: hummeld@mcmaster.ca [McMaster Univ., Hamilton, Ontario (Canada)

    2013-07-01

    Discrepancies have been observed among a number of recent reactor physics studies in support of the PT-SCWR pre-conceptual design, including differences in lattice-level predictions of infinite neutron multiplication factor, coolant void reactivity, and radial power profile. As a first step to resolving these discrepancies, a lattice-level benchmark problem was designed based on the 78-element plutonium-thorium PT-SCWR fuel design under a set of prescribed local conditions. This benchmark problem was modeled with a suite of both deterministic and Monte Carlo neutron transport codes. The results of these models are presented here as the basis of a code-to-code comparison. (author)

  10. Reactor Physics and Criticality Benchmark Evaluations for Advanced Nuclear Fuel - Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    William Anderson; James Tulenko; Bradley Rearden; Gary Harms

    2008-09-11

    The nuclear industry's interest in advanced fuel and reactor designs often drives toward fuels with uranium enrichments greater than 5 wt% 235U. Unfortunately, little data exists, in the form of reactor physics and criticality benchmarks, for uranium enrichments ranging between 5 and 10 wt% 235U. The primary purpose of this project is to provide benchmarks for fuel similar to what may be required for advanced light water reactors (LWRs). These experiments will ultimately provide additional information for application to the criticality-safety bases for commercial fuel facilities handling fuel enriched to greater than 5 wt% 235U.

  11. HTR Spherical Super Lattice Model for Equilibrium Fuel Cycle Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gray S. Chang

    2005-09-01

    Advanced High Temperature gas-cooled Reactors (HTRs) currently being developed (GFR, VHTR - Very High Temperature gas-cooled Reactor, PBMR, and GT-MHR) are able to achieve a simplification of safety through reliance on innovative features and passive systems. One of these innovative features is reliance on ceramic-coated fuel particles to retain the fission products even under extreme accident conditions. The effect of the random fuel kernel distribution in the fuel pebble / block is addressed through the use of the Dancoff correction factor in the resonance treatment. The Dancoff correction factor is a function of burnup and fuel kernel packing factor, which requires that it be updated during Equilibrium Fuel Cycle (EqFC) analysis. Although HTR fuel is rather homogeneously dispersed in the fuel graphite matrix, the heterogeneity effects between fuel kernels and pebbles cannot be ignored. The double-heterogeneous lattice model recently developed at the Idaho National Engineering and Environmental Laboratory (INEEL) contains tens of thousands of cubic fuel kernel cells, which makes it very difficult to deplete the fuel kernel by kernel (KbK) for the EqFC analysis. In addition, it is not possible to preserve the cubic cell size and packing factor in a spherical fuel pebble. To avoid these difficulties, a new HTR pebble-bed kernel-by-kernel spherical (KbK-sph) model has been developed and verified in this study. The objective of this research is to introduce the KbK-sph model and the whole-pebble super lattice model (PLM). The verified double-heterogeneous KbK-sph model and the pebble homogeneous lattice model (HLM) are used for the fuel burnup characteristics analysis and the validation of important safety parameters. This study summarizes and compares the KbK-sph and HLM burnup results. Finally, we discuss the Monte Carlo coupling with the fuel depletion and buildup code ORIGEN-2 as a fuel burnup analysis tool.

  12. Benchmark Evaluation of Fuel Effect and Material Worth Measurements for a Beryllium-Reflected Space Reactor Mockup

    Energy Technology Data Exchange (ETDEWEB)

    Marshall, Margaret A. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Center for Space Nuclear Research; Bess, John D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-02-01

    The critical configurations of the small, compact critical assembly (SCCA) experiments performed at the Oak Ridge Critical Experiments Facility (ORCEF) in 1962-1965 have been evaluated as acceptable benchmark experiments for inclusion in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. The initial intent of these experiments was to support the design of the Medium Power Reactor Experiment (MPRE) program, whose purpose was to study “power plants for the production of electrical power in space vehicles.” The third configuration in this series of experiments was a beryllium-reflected assembly of stainless-steel-clad, highly enriched uranium (HEU)-O2 fuel, a mockup of a potassium-cooled space power reactor. Reactivity measurements, cadmium-ratio spectral measurements, and fission rate measurements were performed through the core and top reflector. Fuel effect worth measurements and neutron moderating and absorbing material worths were also measured in the assembly fuel region. The cadmium ratios, fission rates, and worth measurements were evaluated for inclusion in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. The fuel tube effect and the neutron moderating and absorbing material worth measurements are the focus of this paper. Additionally, a measurement of the worth of potassium filling the core region was performed but has not yet been evaluated. Pellets of 93.15 wt.% enriched uranium dioxide (UO2) were stacked in 30.48 cm tall stainless steel fuel tubes (with 0.3 cm tall end caps). Each fuel tube held 26 pellets with a total mass of 295.8 g UO2 per tube. The 253 tubes were arranged in a 1.506-cm-pitch triangular lattice. An additional 7-tube cluster critical configuration was also measured but not used for any physics measurements. The core was surrounded on all sides by a beryllium reflector. The fuel effect worths were measured by removing fuel tubes at various radii. An accident scenario

  13. Lattice Wess-Zumino model with Ginsparg-Wilson fermions: One-loop results and GPU benchmarks

    CERN Document Server

    Chen, Chen; Giedt, Joel

    2010-01-01

    We numerically evaluate the one-loop counterterms for the four-dimensional Wess-Zumino model formulated on the lattice using Ginsparg-Wilson fermions of the overlap (Neuberger) variety, such that a lattice version of U(1)_R symmetry is exactly preserved in the limit of vanishing bare mass. We confirm previous findings by other authors that at one loop there is no renormalization of the superpotential in the lattice theory. We discuss aspects of the simulation of this model that is planned for a follow-up work, and outline a strategy for nonperturbative improvement of the lattice supercurrent through measurements of supersymmetry Ward identities. Related to this, some benchmarks for our graphics processing unit code are provided. An initial simulation finds a nearly vanishing vacuum expectation value for the auxiliary field, consistent with approximate supersymmetry.

  14. Calculational Benchmark Problems for VVER-1000 Mixed Oxide Fuel Cycle

    Energy Technology Data Exchange (ETDEWEB)

    Emmett, M.B.

    2000-03-17

    Standard problems were created to test the ability of American and Russian computational methods and data regarding the analysis of the storage and handling of Russian pressurized water reactor (VVER) mixed oxide fuel. Criticality safety and radiation shielding problems were analyzed. American and Russian multiplication factors for fresh fuel storage for low-enriched uranium (UOX), weapons-grade (MOX-W), and reactor-grade (MOX-R) MOX fuel differ by less than 2% for all variations of water density. For shielding calculations for fresh fuel, the ORNL results for the neutron source differ from the Russian results by less than 1% for UOX and MOX-R and by approximately 3% for MOX-W. For shielding calculations for fresh fuel assemblies, neutron dose rates at the surface of the assemblies differ from the Russian results by 5% to 9%; the level of agreement for gamma dose varies depending on the type of fuel, with UOX differing by the largest amount. The use of different gamma group structures and of instantaneous versus asymptotic decay assumptions also complicates the comparison. For the calculation of dose rates from spent fuel in a shipping cask, the neutron source for UOX after 3-year cooling is within 1% and for MOX-W within 5% of one of the Russian results, while the MOX-R difference is the largest at over 10%. These studies are a portion of the documentation required by the Russian nuclear regulatory authority, GAN, in order to certify Russian programs and data as being acceptably accurate for the analysis of mixed oxide fuels.

  15. Benchmarking Pt-based electrocatalysts for low temperature fuel cell reactions with the rotating disk electrode

    DEFF Research Database (Denmark)

    Pedersen, Christoffer Mølleskov; Escribano, Maria Escudero; Velazquez-Palenzuela, Amado Andres

    2015-01-01

    We present up-to-date benchmarking methods for testing electrocatalysts for proton exchange membrane fuel cells (PEMFC), using the rotating disk electrode (RDE) method. We focus on the oxygen reduction reaction (ORR) and the hydrogen oxidation reaction (HOR) in the presence of CO. We have chosen...

  16. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

    Benchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns.

  17. Benchmarking DFT and semiempirical methods on structures and lattice energies for ten ice polymorphs

    Science.gov (United States)

    Brandenburg, Jan Gerit; Maas, Tilo; Grimme, Stefan

    2015-03-01

    Water in different phases under various external conditions is very important in bio-chemical systems and for material science at surfaces. Density functional theory methods and approximations thereof have to be tested system specifically to benchmark their accuracy regarding computed structures and interaction energies. In this study, we present and test a set of ten ice polymorphs in comparison to experimental data with mass densities ranging from 0.9 to 1.5 g/cm3 and including explicit corrections for zero-point vibrational and thermal effects. London dispersion inclusive density functionals at the generalized gradient approximation (GGA), meta-GGA, and hybrid level as well as alternative low-cost molecular orbital methods are considered. The widely used functional of Perdew, Burke and Ernzerhof (PBE) systematically overbinds and overall provides inconsistent results. All other tested methods yield reasonable to very good accuracy. BLYP-D3atm gives excellent results with mean absolute errors for the lattice energy below 1 kcal/mol (7% relative deviation). The corresponding optimized structures are very accurate with mean absolute relative deviations (MARDs) from the reference unit cell volume below 1%. The impact of Axilrod-Teller-Muto (atm) type three-body dispersion and of non-local Fock exchange is small but on average their inclusion improves the results. While the density functional tight-binding model DFTB3-D3 performs well for low density phases, it does not yield good high density structures. As low-cost alternative for structure related problems, we recommend the recently introduced minimal basis Hartree-Fock method HF-3c with a MARD of about 3%.
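
    For clarity, the error measures quoted above can be computed as in the small sketch below (the numbers are invented, not the paper's ice data):

```python
# Small numerical illustration (made-up numbers, not the paper's data) of the
# error measures quoted above: mean absolute error (MAE) of lattice energies
# and mean absolute relative deviation (MARD) of unit-cell volumes.
import numpy as np

# Hypothetical lattice energies (kcal/mol) for a few ice polymorphs.
e_ref = np.array([-14.1, -14.3, -13.8, -13.5])     # reference values
e_dft = np.array([-13.6, -14.9, -13.2, -13.9])     # a dispersion-corrected DFT

mae = np.mean(np.abs(e_dft - e_ref))
mard_energy = np.mean(np.abs((e_dft - e_ref) / e_ref)) * 100

# Hypothetical unit-cell volumes (A^3) from the same methods.
v_ref = np.array([128.0, 305.0, 223.0, 141.0])
v_dft = np.array([126.7, 309.1, 221.0, 142.5])
mard_volume = np.mean(np.abs((v_dft - v_ref) / v_ref)) * 100

print(f"MAE(E_latt) = {mae:.2f} kcal/mol, MARD(E) = {mard_energy:.1f} %, "
      f"MARD(V) = {mard_volume:.1f} %")
```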

  18. Uncertainty Analysis for OECD-NEA-UAM Benchmark Problem of TMI-1 PWR Fuel Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Hyuk; Kim, S. J.; Seo, K.W.; Hwang, D. H. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Quantification of code uncertainty is one of the main questions continually raised by regulatory bodies such as KINS. Utilities and code developers address the issue case by case, because a general answer to the question is still open. Under these circumstances, the OECD-NEA has built global consensus on uncertainty quantification through the UAM benchmark program. OECD-NEA benchmark II-2 addresses the uncertainty quantification of subchannel codes: the uncertainties of the fuel temperature and of the ONB location in the TMI-1 fuel assembly are to be estimated under steady-state and transient conditions. In this study, the uncertainty quantification of the MATRA code was performed for this problem. A workbench platform was developed to produce the large set of inputs needed for the uncertainty quantification and to post-process the calculation results. Direct Monte Carlo sampling was used to draw 2000 random parameter sets from the input PDFs. The resulting uncertainty in the DNBR and in the cladding and coolant temperatures due to the input parameters was then estimated with MATRA.
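
    A minimal sketch of this direct Monte Carlo propagation is shown below (illustrative only: MATRA itself is replaced by a stand-in response function, and the input PDFs and their parameters are assumed):

```python
# Minimal sketch of direct Monte Carlo uncertainty propagation (illustrative
# only; the study wraps the MATRA subchannel code, replaced here by a simple
# stand-in response function). Inputs are sampled from assumed PDFs and the
# spread of the output is examined.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 2000

# Hypothetical uncertain inputs: power (kW), inlet temperature (C), flow (kg/s).
power = rng.normal(loc=100.0, scale=2.0, size=n_samples)
t_inlet = rng.normal(loc=290.0, scale=1.5, size=n_samples)
flow = rng.normal(loc=0.30, scale=0.01, size=n_samples)

def cladding_temperature(p, t_in, w):
    """Stand-in for a subchannel-code run: a crude energy-balance proxy."""
    return t_in + 0.8 * p / (4.2 * w)

t_clad = cladding_temperature(power, t_inlet, flow)
print(f"cladding temperature: mean {t_clad.mean():.1f} C, "
      f"2-sigma {2 * t_clad.std(ddof=1):.1f} C")
```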

  19. OECD/NEA Sandia Fuel Project phase I: Benchmark of the ignition testing

    Energy Technology Data Exchange (ETDEWEB)

    Adorni, Martina, E-mail: martina_adorni@hotmail.it [UNIPI (Italy); Herranz, Luis E. [CIEMAT (Spain); Hollands, Thorsten [GRS (Germany); Ahn, Kwang-II [KAERI (Korea, Republic of); Bals, Christine [GRS (Germany); D' Auria, Francesco [UNIPI (Italy); Horvath, Gabor L. [NUBIKI (Hungary); Jaeckel, Bernd S. [PSI (Switzerland); Kim, Han-Chul; Lee, Jung-Jae [KINS (Korea, Republic of); Ogino, Masao [JNES (Japan); Techy, Zsolt [NUBIKI (Hungary); Velazquez-Lozad, Alexander; Zigh, Abdelghani [USNRC (United States); Rehacek, Radomir [OECD/NEA (France)

    2016-10-15

    Highlights: • A unique PWR spent fuel pool experimental project is analytically investigated. • Predictability of fuel clad ignition in case of a complete loss of coolant in SFPs is assessed. • Computer codes reasonably estimate peak cladding temperature and time of ignition. - Abstract: The OECD/NEA Sandia Fuel Project provided unique thermal-hydraulic experimental data associated with Spent Fuel Pool (SFP) complete drain down. The study conducted at Sandia National Laboratories (SNL) was successfully completed (July 2009 to February 2013). The accident conditions of interest for the SFP were simulated in a full scale prototypic fashion (electrically heated, prototypic assemblies in a prototypic SFP rack) so that the experimental results closely represent actual fuel assembly responses. A major impetus for this work was to facilitate severe accident code validation and to reduce modeling uncertainties within the codes. Phase I focused on axial heating and burn propagation in a single PWR 17 × 17 assembly (i.e. “hot neighbors” configuration). Phase II addressed axial and radial heating and zirconium fire propagation including effects of fuel rod ballooning in a 1 × 4 assembly configuration (i.e. single, hot center assembly and four, “cooler neighbors”). This paper summarizes the comparative analysis regarding the final destructive ignition test of the phase I of the project. The objective of the benchmark is to evaluate and compare the predictive capabilities of computer codes concerning the ignition testing of PWR fuel assemblies. Nine institutions from eight different countries were involved in the benchmark calculations. The time to ignition and the maximum temperature are adequately captured by the calculations. It is believed that the benchmark constitutes an enlargement of the validation range for the codes to the conditions tested, thus enhancing the code applicability to other fuel assembly designs and configurations. The comparison of

  1. Benchmark experiment for physics parameters of metallic-fueled LMFBR at FCA

    Energy Technology Data Exchange (ETDEWEB)

    Iijima, S.; Oigawa, H.; Sakurai, T.; Nemoto, T.; Okajima, S. [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1996-09-01

    The calculational prediction of reactor physics parameters in a metallic-fueled LMFBR was tested using benchmark experiments performed at FCA. Reactivity feedback parameters such as the sodium void worth, the Doppler reactivity worth, and the 238U-capture-to-239Pu-fission ratio were measured. The fuel expansion reactivity was also measured. A direct comparison with the results from a similar oxide fuel assembly was made. The analysis was done with the JENDL-2 and JENDL-3.2 cross section libraries. The prediction of reactor physics parameters with JENDL-3.2 in the metallic-fueled core agreed reasonably well with the measured values and showed trends similar to the results in the oxide fuel core. (author)

  2. Fuel lattice design in a boiling water reactor using a knowledge-based automation system

    Energy Technology Data Exchange (ETDEWEB)

    Tung, Wu-Hsiung, E-mail: wstong@iner.gov.tw; Lee, Tien-Tso; Kuo, Weng-Sheng; Yaur, Shung-Jung

    2015-11-15

    Highlights: • An automation system was developed for the fuel lattice radial design of BWRs. • An enrichment group peaking equalizing method is applied to optimize the design. • Several heuristic rules and restrictions are incorporated to facilitate the design. • The CPU time for the system to design a 10 × 10 lattice was less than 1.2 h. • The beginning-of-life LPF was improved from 1.319 to 1.272 for one of the cases. - Abstract: A knowledge-based fuel lattice design automation system for BWRs is developed and applied to the design of 10 × 10 fuel lattices. The knowledge implemented in this automation system includes the determination of gadolinium fuel pin locations, of fuel pin enrichments, and of the enrichment distribution. The optimization process starts by determining the gadolinium distribution based on the pin power distribution of a flat-enrichment lattice and some heuristic rules. Next, a pin power distribution flattening and an enrichment grouping process are introduced to determine the enrichment of each fuel pin type and the initial enrichment distribution of a fuel lattice design. Finally, enrichment group peaking equalizing processes are performed to achieve lower lattice peaking. Several fuel lattice design constraints are also incorporated in the automation system such that the system can accomplish a design which meets the requirements of practical use. Depending on the axial position of the lattice, a different method is applied in the design of the fuel lattice. Two typical fuel lattices with U-235 enrichments of 4.471% and 4.386% were taken as references. Application of the method demonstrates that improved lattice designs can be achieved through the enrichment grouping and the enrichment group peaking equalizing method. It takes about 11 min and 1 h 11 min of CPU time for the automation system to accomplish two design cases on an HP-8000 workstation, including the execution of CASMO-4

  3. Generation IV benchmarking of TRISO fuel performance models under accident conditions: Modeling input data

    Energy Technology Data Exchange (ETDEWEB)

    Collin, Blaise P. [Idaho National Laboratory (INL), Idaho Falls, ID (United States)

    2014-09-01

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: the modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release; the modeling of the AGR-1 and HFR-EU1bis safety testing experiments; and the comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, thereafter named NCC (Numerical Calculation Case), is derived from “Case 5” of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. “Case 5” of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to “effects of the numerical calculation method rather than the physical model” [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other.

  4. GEN-IV BENCHMARKING OF TRISO FUEL PERFORMANCE MODELS UNDER ACCIDENT CONDITIONS MODELING INPUT DATA

    Energy Technology Data Exchange (ETDEWEB)

    Collin, Blaise Paul [Idaho National Laboratory

    2016-09-01

    This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation experiment (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: • The modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release. • The modeling of the AGR-1 and HFR-EU1bis safety testing experiments. • The comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, thereafter named NCC (Numerical Calculation Case), is derived from “Case 5” of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. “Case 5” of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to “effects of the numerical calculation method rather than the physical model” [IAEA 2012]. The NCC is therefore intended to check if these numerical effects subsist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. The participants should read

  5. Verify Super Double-Heterogeneous Spherical Lattice Model for Equilibrium Fuel Cycle Analysis AND HTR Spherical Super Lattice Model for Equilibrium Fuel Cycle Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gray S. Chang

    2005-11-01

    The advanced High Temperature gas-cooled Reactors (HTRs) currently being developed are able to achieve a simplification of safety through reliance on innovative features and passive systems. One of these innovative features is reliance on ceramic-coated fuel particles to retain the fission products even under extreme accident conditions. Traditionally, the effect of the random fuel kernel distribution in the fuel pebble / block is addressed through the use of the Dancoff correction factor in the resonance treatment. However, the Dancoff correction factor is a function of burnup and fuel kernel packing factor, which requires that it be updated during Equilibrium Fuel Cycle (EqFC) analysis. An advanced KbK-sph model and a whole-pebble super lattice model (PSLM) have been developed, which can address and update the burnup-dependent Dancoff effect during the EqFC analysis. The pebble homogeneous lattice model (HLM) is verified by comparing its burnup characteristics with the double-heterogeneous KbK-sph lattice model results. This study summarizes and compares the KbK-sph lattice model and HLM burnup results. Finally, we discuss the Monte Carlo coupling with the fuel depletion and buildup code ORIGEN-2 as a fuel burnup analysis tool and its PSLM calculated results for the HTR EqFC burnup analysis.

  6. Lattice Wess-Zumino model with Ginsparg-Wilson fermions: One-loop results and GPU benchmarks

    Science.gov (United States)

    Chen, Chen; Dzienkowski, Eric; Giedt, Joel

    2010-10-01

    We numerically evaluate the one-loop counterterms for the four-dimensional Wess-Zumino model formulated on the lattice using Ginsparg-Wilson fermions of the overlap (Neuberger) variety, together with an auxiliary fermion (plus superpartners), such that a lattice version of U(1)R symmetry is exactly preserved in the limit of vanishing bare mass. We confirm previous findings by other authors that at one loop there is no renormalization of the superpotential in the lattice theory, but that there is a mismatch in the wave-function renormalization of the auxiliary field. We study the range of the Dirac operator that results when the auxiliary fermion is integrated out, and show that localization does occur, but that it is less pronounced than the exponential localization of the overlap operator. We also present preliminary simulation results for this model, and outline a strategy for nonperturbative improvement of the lattice supercurrent through measurements of supersymmetry Ward identities. Related to this, some benchmarks for our graphics processing unit code are provided. Our simulation results find a nearly vanishing vacuum expectation value for the auxiliary field, consistent with approximate supersymmetry at weak coupling.

  7. Neutronic evaluation of two proposed fuel lattice pitches for ET-RR-1 reactor

    Energy Technology Data Exchange (ETDEWEB)

    Ashoub, N.; Saleh, H.G

    2000-04-01

    The present fuel element of the ET-RR-1 research reactor has a 1.75 cm lattice pitch. Neutronic studies have shown that this lattice pitch is over-moderated and is not the most suitable one from the fuel economy point of view. Two fuel lattice pitches are proposed: one with a 1.4 cm lattice pitch and 10% U-235 enrichment, and the other with a 1.75 cm lattice pitch and 15% U-235 enrichment. A comparative neutronic study of the two proposed lattice pitches against the present one was carried out for two cases: the complete core configuration of the ET-RR-1, which includes 52 fuel elements, and one of the actual core configuration loads, which contains 47 fuel elements. The study includes the calculation of different neutronic parameters such as the infinite and effective multiplication factors, the multi-group neutron flux along the reactor core, and the power peaking factor. These parameters were calculated using the WIMSD4 code for the lattice cell calculations and the DIXY2 code for the diffusion calculations. The results are presented in tables and figures.

  8. Environmental benchmarking of the largest fossil-fueled electricity generating plants in the U.S

    Science.gov (United States)

    Sarkis, Joseph

    2004-02-01

    Environmental management, to be effective, requires performance evaluation and process improvement. This is especially the case in fossil-fueled electricity generating plants. Although eco-efficient management of these types of organizations is critical to local, national, and global environmental issues, few studies have focused on performance measurement and eco-efficiency improvements in this industry. This study evaluates the eco-efficiencies of the top 100 major U.S. fossil-fueled electricity generating plants from 1998 data. Using a multi-criteria non-parametric productivity model (data envelopment analysis), efficiency scores are determined. These efficiency scores are then treated with a clustering method to identify benchmarks for improving poorly performing plants. Efficiency measures are based on three resource input measures (boiler generating capacity, total fuel heat used, and total generator capacity) and four output measures (actual energy generated and SO2, NOx, and CO2 emissions). The purpose of this paper is two-fold: to introduce the methodology's application to eco-efficiency performance measurement and to show some characteristics of the benchmarked plants and groups.
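
    A rough sketch of an input-oriented CCR data envelopment analysis model of this kind is given below (the plant data are hypothetical, not the 1998 dataset, and undesirable emissions are simply inverted into a "higher is better" proxy output; scipy's linear-programming solver is assumed to be available):

```python
# Rough sketch of an input-oriented CCR data envelopment analysis (DEA) model.
# Columns = plants (DMUs); all numbers are hypothetical. Requires numpy, scipy.
import numpy as np
from scipy.optimize import linprog

# Inputs: boiler capacity, fuel heat used, generator capacity (arbitrary units).
X = np.array([[10.0, 12.0,  9.0, 14.0],
              [50.0, 65.0, 45.0, 80.0],
              [11.0, 13.0, 10.0, 15.0]])
# Outputs: energy generated and an inverted-emissions proxy (higher is better).
Y = np.array([[40.0, 45.0, 38.0, 50.0],
              [ 8.0,  6.0,  9.0,  5.0]])

def ccr_efficiency(o):
    """Efficiency of DMU `o`: minimize theta subject to envelopment constraints."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # minimize theta
    A_in = np.c_[-X[:, [o]], X]                  # X @ lam <= theta * x_o
    A_out = np.c_[np.zeros((s, 1)), -Y]          # Y @ lam >= y_o
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n    # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for o in range(X.shape[1]):
    print(f"plant {o}: CCR efficiency = {ccr_efficiency(o):.3f}")
```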

  9. Sensitivity analysis on various parameters for lattice analysis of DUPIC fuel with WIMS-AECL code

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Gyu Hong; Choi, Hang Bok; Park, Jee Won [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    The WIMS-AECL code has been used for the lattice analysis of DUPIC fuel. The lattice parameters calculated by the code are sensitive to the choice of a number of parameters, such as the number of tracking lines, the number of condensed groups, the mesh spacing in the moderator region, and other parameters vital to the calculation of probabilities and to the burnup analysis. We have studied the sensitivity with respect to these parameters and recommend proper values for carrying out the lattice analysis of DUPIC fuel.

  10. Proposal and analysis of the benchmark problem suite for reactor physics study of LWR next generation fuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-10-01

    In order to investigate the calculation accuracy of the nuclear characteristics of LWR next generation fuels, the Research Committee on Reactor Physics organized by JAERI has established the Working Party on Reactor Physics for LWR Next Generation Fuels. Next generation fuels are those aiming at further extended burnup, such as 70 GWd/t, beyond the current design. The Working Party has proposed six benchmark problems, which consist of pin-cell, PWR fuel assembly, and BWR fuel assembly geometries loaded with uranium and MOX fuels, respectively. The specifications of the benchmark problems neglect some of the current limitations, such as the 5 wt% 235U enrichment limit, in order to achieve the above-mentioned target. Eleven organizations in the Working Party have carried out analyses of the benchmark problems. As a result, the status of the accuracy achievable with current data and methods, and some problems to be solved in the future, were clarified. In this report, the details of the benchmark problems, the results from each organization, and their comparisons are presented. (author)

  11. Benchmark of SCALE (SAS2H) isotopic predictions of depletion analyses for San Onofre PWR MOX fuel

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, O.W.

    2000-02-01

    The isotopic composition of mixed-oxide (MOX) fuel, fabricated with both uranium and plutonium, after discharge from reactors is of significant interest to the Fissile Materials Disposition Program. The validation of the SCALE (SAS2H) depletion code for use in the prediction of isotopic compositions of MOX fuel, similar to previous validation studies on uranium-only fueled reactors, has corresponding significance. The EEI-Westinghouse Plutonium Recycle Demonstration Program examined the use of MOX fuel in the San Onofre PWR, Unit 1, during cycles 2 and 3. Isotopic analyses of the MOX spent fuel were conducted on 13 actinides and 148Nd by either mass or alpha spectrometry. Six fuel pellet samples were taken from four different fuel pins of an irradiated MOX assembly. The measured actinide inventories from those samples have been used to benchmark SAS2H for MOX fuel applications. The average percentage differences between the code results and the measurements were -0.9% for 235U and 5.2% for 239Pu. The differences for most of the isotopes were significantly larger than in the cases of uranium-only fueled reactors. In general, comparisons of code results with alpha spectrometer data showed extreme differences, although the differences with respect to the mass spectrometer analyses were not much larger than those for uranium-only fueled reactors. This benchmark study should be useful in estimating uncertainties of inventory, criticality and dose calculations of MOX spent fuel.

  12. Proton Exchange Membrane Fuel Cell Engineering Model Powerplant. Test Report: Benchmark Tests in Three Spatial Orientations

    Science.gov (United States)

    Loyselle, Patricia; Prokopius, Kevin

    2011-01-01

    Proton exchange membrane (PEM) fuel cell technology is the leading candidate to replace the aging alkaline fuel cell technology, currently used on the Shuttle, for future space missions. This test effort marks the final phase of a 5-yr development program that began under the Second Generation Reusable Launch Vehicle (RLV) Program, transitioned into the Next Generation Launch Technologies (NGLT) Program, and continued under Constellation Systems in the Exploration Technology Development Program. Initially, the engineering model (EM) powerplant was evaluated with respect to its performance as compared to acceptance tests carried out at the manufacturer. This was to determine the sensitivity of the powerplant performance to changes in test environment. In addition, a series of tests were performed with the powerplant in the original standard orientation. This report details the continuing EM benchmark test results in three spatial orientations as well as extended duration testing in the mission profile test. The results from these tests verify the applicability of PEM fuel cells for future NASA missions. The specifics of these different tests are described in the following sections.

  13. EVALUATION OF U10MO FUEL PLATE IRRADIATION BEHAVIOR VIA NUMERICAL AND EXPERIMENTAL BENCHMARKING

    Energy Technology Data Exchange (ETDEWEB)

    Samuel J. Miller; Hakan Ozaltun

    2012-11-01

    This article analyzes dimensional changes due to irradiation of monolithic plate-type nuclear fuel and compares the results with finite element analysis of the plates during fabrication and irradiation. Monolithic fuel plates tested in the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) are being used to benchmark proposed fuel performance for several high power research reactors. Post-irradiation metallographic images of plates sectioned at the midpoint were analyzed to determine dimensional changes of the fuel and the cladding response. A constitutive model of the fabrication process and irradiation behavior of the tested plates was developed using the general purpose commercial finite element analysis package Abaqus. Using calculated burn-up profiles of irradiated plates to model the power distribution, and including irradiation behaviors such as swelling and irradiation-enhanced creep, the model simulations allow analysis of plate parameters that are either impossible or infeasible to obtain in an experimental setting. The development and progression of fabrication-induced stress concentrations at the plate edges was of primary interest, as these locations have a unique stress profile during irradiation. Additionally, a comparison between 2D and 3D models was performed to optimize the analysis methodology, in particular the ability of 2D and 3D models to account for the out-of-plane stresses that give rise to 3-dimensional creep behavior. Results show that the assumptions made in 2D models for the out-of-plane stresses and strains cannot capture the 3-dimensional physics accurately, and thus 2D approximations are not computationally accurate. Stress-strain fields are dependent on plate geometry and irradiation conditions; thus, if stress-based criteria are used to predict plate behavior (as opposed to material impurities, fine micro-structural defects, or sharp power gradients), a unique 3D finite element formulation for each plate is required.

  14. Calculation of the Thermal Radiation Benchmark Problems for a CANDU Fuel Channel Analysis Using the CFX-10 Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyoung Tae; Park, Joo Hwan; Rhee, Bo Wook

    2006-07-15

    To justify the use of a commercial Computational Fluid Dynamics (CFD) code for a CANDU fuel channel analysis, especially under radiation-heat-transfer-dominant conditions, the CFX-10 code is tested against three benchmark problems which were used for the validation of the radiation heat transfer in the CANDU analysis code CATHENA. These three benchmark problems are representative of CANDU fuel channel configurations, ranging from a simple geometry to the whole fuel channel geometry. With the assumptions of a non-participating medium completely enclosed by diffuse, gray, and opaque surfaces, the solutions of the benchmark problems are obtained using the concept of surface resistance to radiation, accounting for the view factors and the emissivities. The view factors are calculated with the program MATRIX version 1.0, avoiding the difficulty of hand calculation for the complex geometries. For the solutions of the benchmark problems, either the temperature or the net radiation heat flux boundary condition is prescribed for each radiating surface, to determine the radiation heat transfer rate or the surface temperature, respectively, by the network method. The Discrete Transfer Model (DTM) is used as the CFX-10 radiation model and its calculation results are compared with the solutions of the benchmark problems. The CFX-10 results for the three benchmark problems are in close agreement with these solutions, so it is concluded that CFX-10 with a DTM radiation model can be applied to the CANDU fuel channel analysis where surface radiation heat transfer is a dominant mode of the heat transfer.
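
    As a worked example of the surface-resistance network mentioned above (with assumed numbers, not the benchmark geometry), the net exchange between two gray, diffuse, opaque surfaces forming an enclosure can be computed as follows:

```python
# Worked example of the two-surface radiation network (assumed numbers, not the
# benchmark's geometry): q12 = sigma*(T1^4 - T2^4) /
#   [(1-e1)/(e1*A1) + 1/(A1*F12) + (1-e2)/(e2*A2)].
SIGMA = 5.670e-8                   # Stefan-Boltzmann constant, W/m^2-K^4

# Hypothetical inner (fuel-sheath-like) and outer (pressure-tube-like) surfaces.
T1, e1, A1 = 900.0, 0.8, 0.5       # K, emissivity, m^2
T2, e2, A2 = 600.0, 0.7, 0.9       # K, emissivity, m^2
F12 = 1.0                          # inner surface fully enclosed by the outer one

resistance = (1 - e1) / (e1 * A1) + 1.0 / (A1 * F12) + (1 - e2) / (e2 * A2)
q12 = SIGMA * (T1**4 - T2**4) / resistance
print(f"net radiative heat transfer rate: {q12:.1f} W")
```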

  15. Validation of CENDL and JEFF evaluated nuclear data files for TRIGA calculations through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors

    Energy Technology Data Exchange (ETDEWEB)

    Uddin, M.N. [Department of Physics, Jahangirnagar University, Dhaka (Bangladesh); Sarker, M.M. [Reactor Physics and Engineering Division, Institute of Nuclear Science and Technology, Atomic Energy Research Establishment, Savar, GPO Box 3787, Dhaka 1000 (Bangladesh); Khan, M.J.H. [Reactor Physics and Engineering Division, Institute of Nuclear Science and Technology, Atomic Energy Research Establishment, Savar, GPO Box 3787, Dhaka 1000 (Bangladesh)], E-mail: jahirulkhan@yahoo.com; Islam, S.M.A. [Department of Physics, Jahangirnagar University, Dhaka (Bangladesh)

    2009-10-15

    The aim of this paper is to present the validation of the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through the analysis of the integral parameters of the TRX and BAPL benchmark lattices of thermal reactors, in support of the neutronics analysis of the TRIGA Mark-II Research Reactor at AERE, Bangladesh. In this process, the 69-group cross-section library for the lattice code WIMS was generated from the basic evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 with the help of the nuclear data processing code NJOY99.0. Integral measurements on the thermal reactor lattices TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 serve as standard benchmarks for testing nuclear data files and were selected for this analysis. The integral parameters of these lattices were calculated using the lattice transport code WIMSD-5B based on the generated 69-group cross-section library. The calculated integral parameters were compared to the measured values as well as to the results of the Monte Carlo code MCNP. It was found that in most cases the calculated integral parameters show good agreement with the experiment and with the MCNP results. In addition, the group constants in WIMS format for the isotopes U-235 and U-238 from the two data files were compared using the WIMS library utility code WILLIE, and they were found to be nearly identical, with insignificant differences. This analysis therefore validates the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through benchmarking of the integral parameters of the TRX and BAPL lattices, and supports further neutronics analysis of the TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh.

  16. Fuel lattice design in a boiling water reactor using an ant-colony-based system

    Energy Technology Data Exchange (ETDEWEB)

    Montes, Jose Luis, E-mail: joseluis.montes@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carr. Mexico-Toluca S/N, La Marquesa, Ocoyoacac, Estado de Mexico, CP 52750 (Mexico); Facultad de Ciencias, Universidad Autonoma del Estado de Mexico (Mexico); Francois, Juan-Luis, E-mail: juan.luis.francois@gmail.com [Departamento de Sistemas Energeticos, Facultad de Ingenieria, Universidad Nacional Autonoma de Mexico, Paseo Cuauhnahuac 8532, Jiutepec, Mor., CP 62550 (Mexico); Ortiz, Juan Jose, E-mail: juanjose.ortiz@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carr. Mexico-Toluca S/N, La Marquesa, Ocoyoacac, Estado de Mexico, CP 52750 (Mexico); Martin-del-Campo, Cecilia, E-mail: cecilia.martin.del.campo@gmail.com [Departamento de Sistemas Energeticos, Facultad de Ingenieria, Universidad Nacional Autonoma de Mexico, Paseo Cuauhnahuac 8532, Jiutepec, Mor., CP 62550 (Mexico); Perusquia, Raul, E-mail: raul.perusquia@inin.gob.mx [Instituto Nacional de Investigaciones Nucleares, Carr. Mexico-Toluca S/N, La Marquesa, Ocoyoacac, Estado de Mexico, CP 52750 (Mexico)

    2011-06-15

    Research highlights: • We present an ant-colony-based system for BWR fuel lattice design and optimization. • Assessment of candidate solutions at 0.0 MWd/kg 235U seems to have a limited scope. • Suitable heuristic rules enable more realistic fuel lattice designs. • The choice of the objective function has a large impact on CPU time. • The ant-colony system enables an important decrease of the initial average U-235 enrichment. - Abstract: This paper presents a new approach to the boiling water reactor radial fuel lattice design. The goal is to optimize the distribution of both the fissionable material and the reactivity-control poison material inside the fuel lattice at the beginning of its life. An ant-colony-based system was used to search for either the optimum locations of the poisoned pins inside the lattice, or the U-235 enrichments and Gd2O3 concentrations. In the optimization process, the neutronic simulator CASMO-4 transport code was used to obtain the parameters of the candidate solutions. A typical 10 × 10 BWR fuel lattice with an initial average U-235 enrichment of 4.1%, used in the current operation of the Laguna Verde Nuclear Power Plant, was taken as a reference. With respect to that reference lattice, it was possible to decrease the average U-235 enrichment to 3.949%, a reduction of 3.84% with respect to the reference enrichment, while the k-infinity stayed within a ±100 pcm range and the local power peaking factor differed by 0.94% from the lattice reference value. Particular emphasis was placed on defining the objective function used for the assessment of candidate solutions. On a typical desktop personal computer, about four hours of CPU time were necessary for the algorithm to fulfill the goals of the optimization process. The results obtained with the implemented system showed that the proposed approach represents a
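
    A toy sketch of the ant-colony construction/update loop is given below (illustrative only: the paper evaluates candidate lattices with CASMO-4, which is replaced here by a crude objective, and all parameters are assumed):

```python
# Toy ant-colony sketch for picking a per-pin enrichment (illustrative only;
# a crude objective rewarding low average enrichment and flat power stands in
# for the CASMO-4 evaluation used in the paper).
import numpy as np

rng = np.random.default_rng(7)
n_pins, levels = 100, np.array([2.0, 3.2, 3.95, 4.4, 4.9])
tau = np.ones((n_pins, len(levels)))          # pheromone trails
importance = rng.random(n_pins)               # stand-in pin power weighting

def objective(solution):
    enr = levels[solution]
    peaking = (importance * enr).max() / (importance * enr).mean()
    return enr.mean() + 2.0 * peaking          # lower is better

best, best_score = None, np.inf
for iteration in range(50):
    for ant in range(20):
        p = tau / tau.sum(axis=1, keepdims=True)
        sol = np.array([rng.choice(len(levels), p=p[i]) for i in range(n_pins)])
        score = objective(sol)
        if score < best_score:
            best, best_score = sol, score
    tau *= 0.9                                 # pheromone evaporation
    tau[np.arange(n_pins), best] += 1.0        # reinforce the best solution

print(f"best average enrichment: {levels[best].mean():.3f} wt%")
```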

  18. Development and Experimental Benchmark of Simulations to Predict Used Nuclear Fuel Cladding Temperatures during Drying and Transfer Operations

    Energy Technology Data Exchange (ETDEWEB)

    Greiner, Miles [Univ. of Nevada, Reno, NV (United States)

    2017-03-31

    Radial hydride formation in high-burnup used fuel cladding has the potential to radically reduce its ductility and suitability for long-term storage and eventual transport. To avoid this formation, the maximum post-reactor temperature must remain sufficiently low to limit the cladding hoop stress and to ensure that hydrogen from the existing circumferential hydrides will not dissolve and become available to re-precipitate into radial hydrides under the slow cooling conditions during drying, transfer and early dry-cask storage. The objective of this research is to develop and experimentally benchmark computational fluid dynamics simulations of heat transfer in post-pool-storage drying operations, when high-burnup fuel cladding is likely to experience its highest temperature. These benchmarked tools can play a key role in evaluating dry cask storage systems for extended storage of high-burnup fuels and post-storage transportation, including fuel retrievability. The benchmarked tools will be used to aid the design of efficient drying processes, as well as to estimate variations of surface temperatures as a means of inferring helium integrity inside the canister or cask. This work will be conducted effectively because the principal investigator has experience developing these types of simulations and has constructed a test facility that can be used to benchmark them.

  19. Evaluation of anode (electro)catalytic materials for the direct borohydride fuel cell: Methods and benchmarks

    Science.gov (United States)

    Olu, Pierre-Yves; Job, Nathalie; Chatenet, Marian

    2016-09-01

    In this paper, different methods are discussed for evaluating the potential of a given catalyst for application as a direct borohydride fuel cell (DBFC) anode material. Characterization results in the DBFC configuration are analyzed in the light of important experimental variables that influence DBFC performance. In many practical DBFC-oriented studies, however, these experimental variables prevent one from isolating the influence of the anode catalyst on cell performance. Thus, the electrochemical three-electrode cell is a widely employed and useful tool to isolate the DBFC anode catalyst and to investigate its electrocatalytic activity towards the borohydride oxidation reaction (BOR) in the absence of other limitations. This article reviews selected results for different types of catalysts in electrochemical cells containing an alkaline sodium borohydride electrolyte. In particular, common experimental conditions and benchmarks are proposed for the practical evaluation of electrocatalytic activity towards the BOR in the three-electrode cell configuration. The major issue of gaseous hydrogen generation and escape during DBFC operation is also addressed through a comprehensive review of results for various anode compositions. Finally, preliminary concerns are raised about the stability of potential anode catalysts during DBFC operation.

  20. CFD Validation Benchmark Dataset for Natural Convection in Nuclear Fuel Rod Bundles

    Science.gov (United States)

    Smith, Barton; Jones, Kyle

    2016-11-01

    The present study provides CFD validation benchmark data for coupled fluid flow/convection heat transfer on the exterior of heated rods arranged in a 2 × 2 array. The rod model incorporates grids with swirling vanes to resemble a nuclear fuel bundle. The four heated aluminum rods are suspended in an open-circuit wind tunnel. Boundary conditions (BCs) are measured and uncertainties calculated to provide all quantities necessary to successfully conduct a CFD validation exercise. System response quantities (SRQs) are measured for comparing the simulation output to the experiment. Stereoscopic particle image velocimetry (SPIV) is used to non-intrusively measure three-component velocity fields. A through-plane measurement is used for the inflow, while laser sheet planes aligned with the flow direction at several downstream locations are used for the system response quantities. Two constant heat flux rod surface conditions are presented (400 W/m{sup 2} and 700 W/m{sup 2}), achieving a peak Rayleigh number of 10{sup 10}. Uncertainty for all measured variables is reported. The boundary conditions, system response quantities, and all material properties are now available online for download. The U.S. Department of Energy Nuclear Engineering University Program provided the funding for these experiments under Grant 00128493.
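
    For orientation only, a flux-based (modified) Rayleigh number of the kind quoted above can be estimated from the rod heat flux and air properties. In the sketch below every property value and the characteristic length are assumed, so the result is merely an order-of-magnitude illustration; the actual conditions and geometry are given in the benchmark dataset itself.

```python
# Rough order-of-magnitude check of the flux-based (modified) Rayleigh number,
#   Ra* = g * beta * q'' * L**4 / (k * nu * alpha).
# The air properties and the characteristic length below are assumed values;
# the benchmark dataset provides the actual conditions and geometry.

g, beta = 9.81, 3.0e-3                 # m/s^2, 1/K (assumed air expansion coeff.)
k, nu, alpha = 0.028, 1.8e-5, 2.5e-5   # W/(m K), m^2/s, m^2/s (assumed)
L = 0.5                                # m, hypothetical characteristic length

for q in (400.0, 700.0):               # W/m^2, the two heat-flux cases
    Ra = g * beta * q * L**4 / (k * nu * alpha)
    print(f"q'' = {q:5.0f} W/m^2  ->  Ra* ~ {Ra:.1e}")
```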

  1. Reactor physics analysis for the design of nuclear fuel lattices with burnable poisons

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa-Paredes, G. [Area de Ingenieria en Recursos Energeticos, Universidad Autonoma Metropolitana-Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, 09340 Mexico, D.F. (Mexico); Guzman, Juan R., E-mail: maestro_juan_rafael@hotmail.com [Departamento de Fisica y Matematicas, Instituto Politecnico Nacional, Adolfo Lopez Mateos, San Pedro Zacatenco, 07738 Mexico, D.F. (Mexico)

    2011-12-15

    Highlights: > A fuel rod optimization for the coupled bundle-core design in a BWR is developed. > An algorithm to minimize the rod power peaking factor is used. > The fissile content is divided into two factors. > A reactor physics analysis of these factors is performed. > The algorithm is applied to a typical BWR fuel lattice. - Abstract: The main goals in nuclear fuel lattice design are: (1) minimizing the rod power peaking factor (PPF) so that the power distribution is as uniform as possible; (2) obtaining a prescribed target value for the multiplication factor (k) at the end of the irradiation so that the fuel lattice reaches the desired reactivity; and (3) obtaining a prescribed target value for k at the beginning of the irradiation so that the reactivity excess is neither too high (to ease the maneuvering of the control systems) nor too low (to avoid the cost penalty of a high burnable poison content). In this work a simple algorithm to design burnable-poison-bearing nuclear fuel lattices is presented. The algorithm is based on a reactor physics analysis and is focused on finding the radial distribution of fuel rods having different fissile and burnable poison contents in order to obtain: (1) an adequate minimum PPF; (2) a prescribed target value of k at the end of the irradiation; and (3) a prescribed target value of k at the beginning of the irradiation. The algorithm is based on the factorization of the fissile and burnable poison contents of each fuel rod and on the application of first-order perturbation theory. The performance of the algorithm is demonstrated with the design of a fuel lattice composed of uranium dioxide (UO{sub 2}) and gadolinium oxide (Gd{sub 2}O{sub 3}) for boiling water reactors (BWR). This algorithm has been accomplished
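
    The paper's algorithm relies on a factorization of each rod's fissile and poison content together with first-order perturbation theory; those details are not reproduced here. The sketch below only illustrates the underlying flattening idea, shifting fissile content away from rods whose relative power exceeds the lattice average while preserving the total loading, with a made-up linear power model standing in for a lattice physics calculation.

```python
import random

# Toy illustration of power flattening by redistributing fissile content.
# The linear "power model" below is a stand-in for a lattice physics
# calculation; the actual algorithm uses a factorization of the fissile and
# burnable-poison content of each rod and first-order perturbation theory.

n_rods = 20
enrich = [4.0] * n_rods                                     # uniform start, wt%
weight = [random.uniform(0.8, 1.2) for _ in range(n_rods)]  # fake importances

def relative_powers(e):
    p = [ei * wi for ei, wi in zip(e, weight)]
    mean = sum(p) / len(p)
    return [pi / mean for pi in p]

for _ in range(100):
    p = relative_powers(enrich)
    total = sum(enrich)
    # shift enrichment away from rods whose relative power exceeds the mean
    enrich = [max(0.5, e - 0.05 * (pi - 1.0)) for e, pi in zip(enrich, p)]
    # rescale to preserve the total fissile loading
    s = sum(enrich)
    enrich = [e * total / s for e in enrich]

print("final rod power peaking factor:", max(relative_powers(enrich)))
```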

  2. Simulations and measurements of adiabatic annular flows in triangular, tight lattice nuclear fuel bundle model

    Energy Technology Data Exchange (ETDEWEB)

    Saxena, Abhishek, E-mail: asaxena@lke.mavt.ethz.ch [ETH Zurich, Laboratory for Nuclear Energy Systems, Department of Mechanical and Process Engineering, Sonneggstrasse 3, 8092 Zürich (Switzerland); Zboray, Robert [Laboratory for Thermal-hydraulics, Nuclear Energy and Safety Department, Paul Scherrer Institute, 5232 Villigen PSI (Switzerland); Prasser, Horst-Michael [ETH Zurich, Laboratory for Nuclear Energy Systems, Department of Mechanical and Process Engineering, Sonneggstrasse 3, 8092 Zürich (Switzerland); Laboratory for Thermal-hydraulics, Nuclear Energy and Safety Department, Paul Scherrer Institute, 5232 Villigen PSI (Switzerland)

    2016-04-01

    High conversion light water reactors (HCLWR) having triangular, tight-lattice fuel bundles could enable improved fuel utilization compared to present-day LWRs. However, the efficient cooling of a tight-lattice bundle has still to be proven. A major concern is the avoidance of the high-quality boiling crisis (film dry-out) through the use of efficient functional spacers. For this reason, we have carried out experiments on adiabatic, air-water annular two-phase flows in a tight-lattice, triangular fuel bundle model using generic spacers. A high-spatial-resolution, non-intrusive measurement technology, cold neutron tomography, has been utilized to resolve the distribution of the liquid film thickness on the virtual fuel pin surfaces. Unsteady CFD simulations have also been performed to replicate and compare with the experiments using the commercial code STAR-CCM+. Large eddies have been resolved on the grid level to capture the dominant unsteady flow features expected to drive the liquid film thickness distribution downstream of a spacer, while the subgrid scales have been modeled using the Wall Adapting Local Eddy (WALE) subgrid model. A Volume of Fluid (VOF) method, which directly tracks the interface and does away with closure relationships for the interfacial exchange terms, has also been employed. The present paper shows a first comparison of the measurements with the simulation results.

  3. Square lattice honeycomb tri-carbide fuels for 50 to 250 KN variable thrust NTP design

    Science.gov (United States)

    Anghaie, Samim; Knight, Travis; Gouw, Reza; Furman, Eric

    2001-02-01

    Ultrahigh-temperature solid solutions of tri-carbide fuels are used to design an ultracompact nuclear thermal rocket generating 950 seconds of specific impulse with a scalable thrust level in the range of 50 to 250 kilonewtons. Solid-solution tri-carbide nuclear fuels, such as uranium-zirconium-niobium carbide (UZrNbC), are processed to contain a certain mixing ratio between uranium carbide and the two stabilizing carbides. Zirconium or niobium in the tri-carbide could be replaced by tantalum or hafnium to provide higher chemical stability in a hot hydrogen environment or to provide different nuclear design characteristics. Recent studies have demonstrated the chemical compatibility of tri-carbide fuels with hydrogen propellant for a few to tens of hours of operation at temperatures ranging from 2800 K to 3300 K, respectively. Fuel elements are fabricated from thin tri-carbide wafers that are grooved and locked into a square-lattice honeycomb (SLHC) shape. The hockey-puck-shaped SLHC fuel elements are stacked up in a grooved graphite tube to form an SLHC fuel assembly. A total of 18 fuel assemblies are arranged circumferentially to form two concentric rings of fuel assemblies, with zirconium hydride filling the space between assemblies. For 50 to 250 kilonewton thrust operation, the reactor diameter and length including reflectors are 57 cm and 60 cm, respectively. Results of the nuclear design and thermal fluid analyses of the SLHC nuclear thermal propulsion system are presented.

  4. Neutronics Benchmarks for the Utilization of Mixed-Oxide Fuel: Joint US/Russian Progress Report for Fiscal 1997. Volume 3 - Calculations Performed in the Russian Federation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    This volume of the progress report provides documentation of reactor physics and criticality safety studies conducted in the Russian Federation during fiscal year 1997 and sponsored by the Fissile Materials Disposition Program of the US Department of Energy. Descriptions of computational and experimental benchmarks for the verification and validation of computer programs for neutron physics analyses are included. All benchmarks include either plutonium, uranium, or mixed uranium and plutonium fuels. Calculated physics parameters are reported for all of the benchmarks that the United States and Russia mutually agreed in November 1996 were applicable to mixed-oxide fuel cycles for light-water reactors.

  5. Comparison of Homogeneous and Heterogeneous CFD Fuel Models for Phase I of the IAEA CRP on HTR Uncertainties Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard Strydom; Su-Jong Yoon

    2014-04-01

    Computational Fluid Dynamics (CFD) evaluation of homogeneous and heterogeneous fuel models was performed as part of the Phase I calculations of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on High Temperature Reactor (HTR) Uncertainties in Modeling (UAM). This study focused on the nominal localized stand-alone fuel thermal response, as defined in Ex. I-3 and I-4 of the HTR UAM. The aim of the stand-alone thermal unit-cell simulation is to isolate the effect of material and boundary input uncertainties on a very simplified problem before these uncertainties are propagated in the subsequent coupled neutronics/thermal-fluids phases of the benchmark. In many previous studies for high temperature gas-cooled reactors, a volume-averaged homogeneous mixture model of a single fuel compact has been applied. In the homogeneous model, the Tristructural Isotropic (TRISO) fuel particles in the fuel compact are not modeled directly and an effective thermal conductivity is employed for the thermo-physical properties of the fuel compact. In contrast, in the heterogeneous model, the uranium carbide (UCO), inner and outer pyrolytic carbon (IPyC/OPyC) and silicon carbide (SiC) layers of the TRISO fuel particles are explicitly modeled, and the fuel compact is represented as a heterogeneous mixture of TRISO fuel kernels embedded in H-451 matrix graphite. In this study, steady-state and transient CFD simulations were performed with both the homogeneous and the heterogeneous model to compare their thermal characteristics. The nominal values of the input parameters were used for this CFD analysis. In a future study, the effects of input uncertainties in the material properties and boundary parameters will be investigated and reported.
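
    The homogeneous model replaces the particle-laden compact by an effective thermal conductivity. One common way to estimate such a value, not necessarily the correlation used in this benchmark, is the Maxwell-Eucken mixture rule for spherical inclusions dispersed in a continuous matrix; the sketch below evaluates it with assumed conductivities and packing fraction.

```python
# Maxwell-Eucken mixture rule for spherical particles dispersed in a matrix,
# one common way to estimate the effective conductivity used by a homogeneous
# compact model. The conductivities and packing fraction below are assumed,
# illustrative values, not the benchmark specification.

def maxwell_eucken(k_matrix, k_particle, vol_frac):
    kd, kc, v = k_particle, k_matrix, vol_frac
    return kc * (kd + 2.0*kc + 2.0*v*(kd - kc)) / (kd + 2.0*kc - v*(kd - kc))

k_graphite_matrix = 30.0     # W/(m K), assumed
k_triso_particle = 5.0       # W/(m K), lumped particle value, assumed
packing_fraction = 0.35      # assumed

k_eff = maxwell_eucken(k_graphite_matrix, k_triso_particle, packing_fraction)
print(f"effective compact conductivity ~ {k_eff:.1f} W/(m K)")
```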

  6. Creation of a Full-Core HTR Benchmark with the Fort St. Vrain Initial Core and Assessment of Uncertainties in the FSV Fuel Composition and Geometry

    Energy Technology Data Exchange (ETDEWEB)

    Martin, William R.; Lee, John C.; Baxter, Alan; Wemple, Chuck

    2012-03-31

    Information and measured data from the initial Fort St. Vrain (FSV) high temperature gas reactor core are used to develop a benchmark configuration to validate computational methods for analysis of a full-core, commercial HTR configuration. Large uncertainties in the geometry and composition data for the FSV fuel and core are identified, including: (1) the relative numbers of fuel particles for the four particle types, (2) the distribution of fuel kernel diameters for the four particle types, (3) the Th:U ratio in the initial FSV core, and (4) the buffer thickness for the fissile and fertile particles. Sensitivity studies were performed to assess each of these uncertainties. A number of methods were developed to assist in these studies, including: (1) the automation of MCNP5 input files for FSV using Python scripts, (2) a simple method to verify isotopic loadings in MCNP5 input files, (3) an automated procedure to conduct a coupled MCNP5-RELAP5 analysis for a full-core FSV configuration with thermal-hydraulic feedback, and (4) a methodology for sampling kernel diameters from arbitrary power-law and Gaussian PDFs that preserves fuel loading and packing factor constraints. A reference FSV fuel configuration was developed based on a single kernel diameter for each of the four particle types, preserving the known uranium and thorium loadings and packing factor (58%). Three fuel models were developed, representing the fuel as a mixture of kernels with two diameters, four diameters, or a continuous range of diameters. The fuel particles were put into a fuel compact using either a lattice-based approach or a stochastic packing methodology from RPI, and simulated with MCNP5. The results of the sensitivity studies indicated that the uncertainties in the relative numbers and sizes of fissile and fertile kernels were not important, nor were the distributions of kernel diameters within their diameter ranges. The uncertainty in the Th:U ratio in the initial FSV core was
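
    One of the methods mentioned, sampling kernel diameters from a prescribed PDF while preserving the fuel loading and packing constraints, can be illustrated generically as below. The Gaussian parameters, truncation bounds and rescaling strategy are assumptions for illustration; the constraint handling in the actual study may differ.

```python
import math
import random

# Sketch: sample TRISO kernel diameters from a truncated Gaussian, then
# rescale the population so that the total kernel volume (i.e. the fuel
# loading, and hence the packing fraction) matches a prescribed target.
# All parameters are illustrative, not FSV specification values.

mean_d, sigma = 350e-6, 30e-6          # metres (assumed)
d_min, d_max = 250e-6, 450e-6          # truncation bounds (assumed)
n_kernels = 10_000
target_volume = n_kernels * (math.pi / 6.0) * mean_d**3   # loading to preserve

def sample_diameter():
    while True:
        d = random.gauss(mean_d, sigma)
        if d_min <= d <= d_max:
            return d

diameters = [sample_diameter() for _ in range(n_kernels)]
volume = sum((math.pi / 6.0) * d**3 for d in diameters)
scale = (target_volume / volume) ** (1.0 / 3.0)           # uniform rescaling
diameters = [d * scale for d in diameters]

print("mean kernel diameter (um):", 1e6 * sum(diameters) / n_kernels)
```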

  7. Numerical simulation of direct methanol fuel cells using lattice Boltzmann method

    Energy Technology Data Exchange (ETDEWEB)

    Delavar, Mojtaba Aghajani; Farhadi, Mousa; Sedighi, Kurosh [Faculty of Mechanical Engineering, Babol University of Technology, Babol, P.O. Box 484 (Iran)

    2010-09-15

    In this study the lattice Boltzmann method (LBM) is used as an alternative to conventional computational fluid dynamics methods to simulate a direct methanol fuel cell (DMFC). A two-dimensional lattice Boltzmann model with nine velocities (D2Q9) is used to solve the problem. The computational domain includes all seven parts of the DMFC: the anode channel, catalyst and diffusion layers, the membrane, and the cathode channel, catalyst and diffusion layers. The model has been used to predict the flow pattern and the concentration fields of the different species in both clear and porous channels in order to investigate cell performance. The results compare well with results in the literature for flow in porous and clear channels and with cell polarization curves of the DMFC at different flow speeds and feed methanol concentrations. (author)
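
    For readers unfamiliar with the D2Q9 model mentioned above, the sketch below shows a minimal single-phase BGK lattice Boltzmann step (collision plus streaming on a periodic grid with a simple velocity-shift body force). It omits the porous layers, species transport and electrochemistry of the DMFC model, and all parameter values are arbitrary.

```python
import numpy as np

# Minimal single-phase D2Q9 BGK lattice Boltzmann sketch: collision plus
# streaming on a periodic grid, with a simple velocity-shift body force in x.
# The DMFC model described above adds porous layers, species transport and
# electrochemistry on top of this basic scheme; all values here are arbitrary.

nx, ny, tau, force = 64, 32, 0.8, 1.0e-5
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
f = np.ones((9, nx, ny)) * w[:, None, None]      # start at rest with rho = 1

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1.0 + 3.0*cu + 4.5*cu**2 - 1.5*usq)

for step in range(500):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho + tau * force / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += -(f - equilibrium(rho, ux, uy)) / tau   # BGK collision
    for i in range(9):                           # streaming step
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

print("mean x-velocity after 500 steps:", ux.mean())
```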

  8. Two phase flow simulation in a channel of a polymer electrolyte membrane fuel cell using the lattice Boltzmann method

    OpenAIRE

    Ben Salah, Yasser; Tabe, Yutaka; Chikahisa, Takemi

    2012-01-01

    Water management in polymer electrolyte (PEM) fuel cells is important for fuel cell performance and durability. Numerical simulations using the lattice Boltzmann method (LBM) are developed to elucidate the dynamic behavior of condensed water and gas flows in a polymer electrolyte membrane (PEM) fuel cell gas channel. A scheme for two-phase flow with large density differences was applied to establish the optimum gas channel design for different gas channel heights, droplet positions, and gas c...

  9. Analytic Methods for Benchmarking Hydrogen and Fuel Cell Technologies; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc; Saur, Genevieve; Ramsden, Todd; Eichman, Joshua

    2015-05-28

    This presentation summarizes NREL's hydrogen and fuel cell analysis work in three areas: resource potential, greenhouse gas emissions and cost of delivered energy, and influence of auxiliary revenue streams. NREL's hydrogen and fuel cell analysis projects focus on low-carbon and economic transportation and stationary fuel cell applications. Analysis tools developed by the lab provide insight into the degree to which bridging markets can strengthen the business case for fuel cell applications.

  10. OECD/NEA burnup credit criticality benchmarks phase IIIA: Criticality calculations of BWR spent fuel assemblies in storage and transport

    Energy Technology Data Exchange (ETDEWEB)

    Okuno, Hiroshi; Naito, Yoshitaka [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ando, Yoshihira [Toshiba Corp., Kawasaki, Kanagawa (Japan)

    2000-09-01

    The report describes the final results of the Phase IIIA benchmarks conducted by the Burnup Credit Criticality Calculation Working Group under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD/NEA). The benchmarks are intended to confirm the predictive capability of current computer code and data library combinations for the neutron multiplication factor (k{sub eff}) of a layered array model of irradiated BWR fuel assemblies. In total, 22 benchmark problems are proposed for calculations of k{sub eff}. The effects of the following parameters are investigated: cooling time, inclusion/exclusion of FP nuclides and axial burnup profile, and inclusion of an axial void fraction profile or constant void fractions during burnup. Axial profiles of fractional fission rates are further requested for five cases out of the 22 problems. Twenty-one sets of results are presented, contributed by 17 institutes from 9 countries. The relative dispersion of the k{sub eff} values calculated by the participants about the mean value is almost entirely within the band of {+-}1% {delta}k/k. The deviations from the averaged calculated fission rate profiles are found to be within {+-}5% for most cases. (author)
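
    The ±1% Δk/k dispersion quoted above is simply the spread of the participants' k{sub eff} results about their mean; the snippet below shows that arithmetic with made-up values.

```python
# Relative dispersion of the participants' k_eff results about their mean,
# expressed in percent dk/k. The list of values is made up for illustration.

k_eff = [0.9132, 0.9157, 0.9121, 0.9168, 0.9140, 0.9149]
mean = sum(k_eff) / len(k_eff)
for k in k_eff:
    print(f"{k:.4f}  {100.0 * (k - mean) / mean:+.3f} % dk/k")
```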

  11. Lattice-strain control of the activity in dealloyed core-shell fuel cell catalysts.

    Science.gov (United States)

    Strasser, Peter; Koh, Shirlaine; Anniyev, Toyli; Greeley, Jeff; More, Karren; Yu, Chengfei; Liu, Zengcai; Kaya, Sarp; Nordlund, Dennis; Ogasawara, Hirohito; Toney, Michael F; Nilsson, Anders

    2010-06-01

    Electrocatalysis will play a key role in future energy conversion and storage technologies, such as water electrolysers, fuel cells and metal-air batteries. Molecular interactions between chemical reactants and the catalytic surface control the activity and efficiency, and hence need to be optimized; however, generalized experimental strategies to do so are scarce. Here we show how lattice strain can be used experimentally to tune the catalytic activity of dealloyed bimetallic nanoparticles for the oxygen-reduction reaction, a key barrier to the application of fuel cells and metal-air batteries. We demonstrate the core-shell structure of the catalyst and clarify the mechanistic origin of its activity. The platinum-rich shell exhibits compressive strain, which results in a shift of the electronic band structure of platinum and weakening chemisorption of oxygenated species. We combine synthesis, measurements and an understanding of strain from theory to generate a reactivity-strain relationship that provides guidelines for tuning electrocatalytic activity.

  12. Lattice Boltzmann modeling of transport phenomena in fuel cells and flow batteries

    Science.gov (United States)

    Xu, Ao; Shyy, Wei; Zhao, Tianshou

    2017-06-01

    Fuel cells and flow batteries are promising technologies to address climate change and air pollution problems. An understanding of the complex multiscale and multiphysics transport phenomena occurring in these electrochemical systems requires powerful numerical tools. Over the past decades, the lattice Boltzmann (LB) method has attracted broad interest in the computational fluid dynamics and the numerical heat transfer communities, primarily due to its kinetic nature making it appropriate for modeling complex multiphase transport phenomena. More importantly, the LB method fits well with parallel computing due to its locality feature, which is required for large-scale engineering applications. In this article, we review the LB method for gas-liquid two-phase flows, coupled fluid flow and mass transport in porous media, and particulate flows. Examples of applications are provided in fuel cells and flow batteries. Further developments of the LB method are also outlined.

  13. Simulation of sound waves using the Lattice Boltzmann Method for fluid flow: Benchmark cases for outdoor sound propagation

    NARCIS (Netherlands)

    Salomons, E.M.; Lohman, W.J.A.; Zhou, H.

    2016-01-01

    Propagation of sound waves in air can be considered as a special case of fluid dynamics. Consequently, the lattice Boltzmann method (LBM) for fluid flow can be used for simulating sound propagation. In this article application of the LBM to sound propagation is illustrated for various cases: free-fi

  14. A proposal of benchmark calculation on reactor physics for metallic fueled and MOX fueled LMFBR based upon mock-up experiment at FCA

    Energy Technology Data Exchange (ETDEWEB)

    Oigawa, Hiroyuki; Iijima, Susumu; Sakurai, Takeshi; Okajima, Shigeaki; Andoh, Masaki; Nemoto, Tatsuo; Kato, Yuichi; Osugi, Toshitaka [Dept. of Nuclear Energy System, Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan)

    2000-02-01

    In order to assess the validity of the cross-section library for fast reactor physics, a set of benchmark calculations is proposed. The benchmark calculations are based upon mock-up experiments in three FCA cores with various compositions of the central test region: two were mock-ups of metallic-fueled LMFBRs and the other was a mock-up of a mixed-oxide-fueled LMFBR. One of the metallic cores included enriched uranium in the test region, while the others did not. Physics parameters to be calculated are criticality, reaction rate ratios, plutonium and B{sub 4}C sample worths, sodium void reactivity worth, and the Doppler reactivity worth of {sup 238}U. Homogenized atomic number densities and various correction factors are given so that anyone can easily perform diffusion calculations in a two-dimensional RZ model and compare the results with the experiments. The validity of the correction factors is demonstrated by changing the calculation method and the nuclear data file used. (author)

  15. Delayed Fission Product Gamma-Ray Transmission Through Low Enriched UO2 Fuel Pin Lattices in Air

    Energy Technology Data Exchange (ETDEWEB)

    Trumbull, TH [Rensselaer Polytechnic Inst., Troy, NY (United States)

    2004-10-18

    The transmission of delayed fission-product gamma rays through various arrangements of low-enriched UO2 fuel pin lattices in an air medium was studied. Experimental measurements, point-kernel and Monte Carlo photon transport calculations were performed to demonstrate the shielding effect of ordered lattices of fuel pins on the resulting gamma-ray dose to a detector outside the lattice. The variation of the gamma-ray dose outside the lattice as a function of radial position, the so-called “channeling” effect, was analyzed. Techniques for performing experimental measurements and data reduction at Rensselaer Polytechnic Institute’s Reactor Critical Facility (RCF) were developed. An experimental apparatus was constructed to hold the arrangements of fuel pins for the measurements. A gamma-ray spectroscopy system consisting of a sodium-iodide scintillation detector was used to collect data. Measurements were made with and without a collimator installed. A point-kernel transport code was developed to map the radial dependence of the gamma-ray flux. Input files for the Monte Carlo code MCNP were also developed to accurately model the experimental measurements. The results of the calculations were compared to the experimental measurements. In order to determine the delayed fission-product gamma-ray source for the calculations, a technique was developed using a previously written code, DELBG, and the reactor state-point data obtained during the experimental measurements. Calculations were performed demonstrating the effects of material homogenization on the gamma-ray transmission through the fuel pin lattice. Homogeneous and heterogeneous calculations were performed for all RCF fuel pin lattices as well as for a typical commercial pressurized water reactor fuel bundle. The results of the study demonstrated the effectiveness of the experimental measurements in isolating the channeling effect of delayed fission-product gamma rays through lattices of RCF fuel pins
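
    A point-kernel calculation of the kind described sums, over discretized source points, the uncollided contribution exp(-mu*r)/(4*pi*r^2) weighted by the source strength (a buildup factor would normally multiply each term). A bare-bones version with placeholder attenuation data and geometry is sketched below.

```python
import math

# Bare-bones point-kernel estimate of the uncollided gamma flux at a detector
# from a line of discretized source points (e.g. one fuel pin sampled axially).
# The attenuation coefficient, source strength and geometry are placeholders;
# a buildup factor would normally multiply each term.

mu = 8.0                 # 1/m, effective linear attenuation coefficient (assumed)
S = 1.0e9                # photons/s emitted per source point (assumed)
detector = (0.30, 0.0)   # m, detector position (x, z)

source_points = [(0.0, 0.01 * i) for i in range(-20, 21)]   # 41 axial points

flux = 0.0
for x, z in source_points:
    r = math.hypot(detector[0] - x, detector[1] - z)
    flux += S * math.exp(-mu * r) / (4.0 * math.pi * r**2)

print(f"uncollided flux at the detector ~ {flux:.3e} photons/m^2/s")
```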

  16. Lattice Boltzmann Simulation of Multiphase Transport in Nanostructured PEM Fuel Cells

    Science.gov (United States)

    Stiles, Christopher D.

    As the fossil fuel crisis becomes more critical, it is imperative to develop renewable sources of power generation. Polymer electrolyte membrane (PEM) fuel cells are considered a viable option. However, the cost of the platinum catalyst has hindered their commercialization. PEM fuel cells with platinum loadings of >0.4 mg/cm{sup 2} are common. Efforts towards further reducing this loading are currently underway utilizing nanostructured electrodes. A consequence of increased platinum utilization per unit area and thinner nanostructured electrodes is flooding, which is detrimental to fuel cell performance. Flooding has a two-fold impact on cell performance: a drop in cell voltage and a rise in the parasitic pumping power needed to overcome the increased pressure drop, which together result in a significant reduction in system efficiency. Proper water management is therefore crucial for optimum performance of the fuel cell and also for enhancing membrane durability. The goal of this thesis is to simulate the multiphase transport of H{sub 2}O in air in the nanostructured PEMFC with realistic density ratios. In pursuit of this goal, the ability of the pseudopotential-based multiphase lattice Boltzmann method to realistically model the coexistence of the gas and liquid phases of H{sub 2}O at low temperatures is explored, and the method is expanded to include a gas mixture of O{sub 2} and N{sub 2} in the multiphase H{sub 2}O system. The study begins with an examination of the phase transition region described by the current implementation of the multiphase pseudopotential lattice Boltzmann model. Following this, a modified form of the pressure term, with the use of a scalar multiplier kappa for the Peng-Robinson equation of state, is thoroughly investigated. This method proves to be very effective at enabling numerically stable simulations at low temperatures with large density ratios. It is found that for decreasing values of kappa, this model leads to an increase in multiphase interface thickness and a
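
    For reference, the Peng-Robinson pressure that enters such pseudopotential models has the standard form p = RT/(v - b) - a*alpha(T)/(v^2 + 2bv - b^2). The sketch below evaluates it with lattice-unit constants commonly used in the LB literature; treating the scalar multiplier kappa as an overall factor on the pressure is an assumption made here for illustration, since the thesis's exact formulation is not given in the abstract.

```python
# Peng-Robinson equation of state as used in pseudopotential LB models:
#   p = R*T / (v - b) - a * alpha(T) / (v**2 + 2*b*v - b**2)
# The lattice-unit constants a, b, R below are values commonly used in the LB
# literature; applying the scalar multiplier kappa as an overall factor on the
# pressure is an assumption made here purely for illustration.

def peng_robinson_pressure(rho, T, kappa=1.0,
                           a=2.0 / 49.0, b=2.0 / 21.0, R=1.0, omega=0.344):
    m = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    Tc = 0.0778 / 0.45724 * a / (b * R)          # critical temperature from a, b
    alpha = (1.0 + m * (1.0 - (T / Tc) ** 0.5)) ** 2
    v = 1.0 / rho
    return kappa * (R * T / (v - b) - a * alpha / (v * v + 2.0 * b * v - b * b))

# Illustrative call at a reduced temperature of 0.9 (Tc ~ 0.0729 in these units)
print(peng_robinson_pressure(rho=2.0, T=0.9 * 0.0729))
```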

  17. Simulation of Sound Waves Using the Lattice Boltzmann Method for Fluid Flow: Benchmark Cases for Outdoor Sound Propagation

    OpenAIRE

    Erik M. Salomons; Lohman, Walter J. A.; Han Zhou

    2016-01-01

    Propagation of sound waves in air can be considered as a special case of fluid dynamics. Consequently, the lattice Boltzmann method (LBM) for fluid flow can be used for simulating sound propagation. In this article application of the LBM to sound propagation is illustrated for various cases: free-field propagation, propagation over porous and non-porous ground, propagation over a noise barrier, and propagation in an atmosphere with wind. LBM results are compared with solutions of the equation...

  18. Best Practices and Testing Protocols for Benchmarking ORR Activities of Fuel Cell Electrocatalysts Using Rotating Disk Electrode

    Energy Technology Data Exchange (ETDEWEB)

    Kocha, Shyam S; Shinozaki, Kazuma; Zack, Jason W; Myers, Deborah J.; Kariuki, Nancy N.; Nowicki, Tammi L.; Stamenkovic, Vojislav; Kang, Yijin; Li, Dongguo; Papageorgopoulos, Dimitrios

    2017-07-01

    Thin-film rotating disk electrodes (TF-RDEs) are the half-cell electrochemical system of choice for rapid screening of the oxygen reduction reaction (ORR) activity of novel Pt-on-carbon-black (Pt/C) electrocatalysts. It has been shown that the magnitude of the measured ORR activity and its reproducibility are highly dependent on system cleanliness, evaluation protocols, and operating conditions as well as on ink formulation, composition, film drying, and the resultant film thickness and uniformity. Accurate benchmarks of baseline Pt/C catalysts evaluated using standardized protocols and best practices are necessary to expedite the ultra-low platinum-group-metal (PGM) catalyst development that is crucial for the imminent commercialization of fuel cell vehicles. We report results of evaluations in three independent laboratories of Pt/C electrocatalysts provided by commercial fuel cell catalyst manufacturers (Johnson Matthey, Umicore, and Tanaka Kikinzoku Kogyo, TKK). The studies were conducted using identical evaluation protocols, ink formulations and film fabrication, albeit employing unique electrochemical cell designs specific to each laboratory. The ORR activities reported in this work provide a baseline and criteria for the selection and scale-up of novel high-activity ORR electrocatalysts for implementation in proton exchange membrane fuel cells (PEMFCs).
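
    ORR activity benchmarking from RDE data typically extracts the kinetic current through the Koutecky-Levich mass-transport correction, i_k = i*i_lim/(i_lim - i), and normalizes it by the electrode's Pt loading. A minimal version of that arithmetic, with placeholder numbers rather than values from this study, is shown below; the full protocols and corrections are documented in the paper itself.

```python
# Kinetic current and Pt mass activity from RDE data via the Koutecky-Levich
# mass-transport correction:  i_k = i * i_lim / (i_lim - i).
# All numbers below are placeholders, not measurements from this study.

i = -3.2e-3        # A per cm^2 (geometric), measured at 0.9 V vs RHE
i_lim = -5.8e-3    # A per cm^2 (geometric), diffusion-limited current density
loading = 17.8e-6  # g_Pt per cm^2 (geometric), electrode Pt loading

i_k = i * i_lim / (i_lim - i)          # mass-transport-corrected current
mass_activity = abs(i_k) / loading     # A per g_Pt at 0.9 V

print(f"kinetic current density: {abs(i_k) * 1e3:.2f} mA/cm^2")
print(f"mass activity:           {mass_activity:.0f} A/g_Pt")
```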

  19. Fuel Cell Development for NASA's Human Exploration Program: Benchmarking with "The Hydrogen Economy"

    Science.gov (United States)

    Scott, John H.

    2007-01-01

    The theoretically high efficiency and low-temperature operation of hydrogen-oxygen fuel cells have motivated much study since their invention in the 19th century, but their relatively high life cycle costs kept them a "solution in search of a problem" for many years. The first problem for which fuel cells presented a truly cost-effective solution was that of providing a power source for NASA's human spaceflight vehicles in the 1960s. NASA thus invested, and continues to invest, in the development of fuel cell power plants for this application. This development program continues to place its highest priorities on requirements for minimum system mass and maximum durability and reliability. These priorities drive fuel cell power plant design decisions at all levels, even that of the catalyst support. However, since the mid-1990s, prospective environmental regulations have driven increased governmental and industrial interest in "green power" and the "Hydrogen Economy." This has in turn stimulated greatly increased investment in fuel cell development for a variety of commercial applications. This investment is bringing about notable advances in fuel cell technology, but, as these development efforts place their highest priority on requirements for minimum life cycle cost and field safety, the advances are yielding design solutions quite different at almost every level from those needed for spacecraft applications. This environment thus presents both opportunities and challenges for NASA's Human Exploration Program.

  20. Development of a scatter search optimization algorithm for BWR fuel lattice design

    Energy Technology Data Exchange (ETDEWEB)

    Francois, J.L.; Martin-del-Campo, C. [Mexico Univ. Nacional Autonoma, Facultad de Ingenieria (Mexico); Morales, L.B.; Palomera, M.A. [Mexico Univ. Nacional Autonoma, Instituto de Investigaciones en Matematicas Aplicadas y Sistemas, D.F. (Mexico)

    2005-07-01

    A basic Scatter Search (SS) method, applied to the optimization of radial enrichment and gadolinia distributions for BWR fuel lattices, is presented in this paper. Scatter search is considered an evolutionary algorithm that constructs solutions by combining others. The goal of this methodology is to enable the implementation of solution procedures that can derive new solutions from combined elements. The main mechanism for combining solutions is that a new solution is created from the strategic combination of two other solutions in order to explore the solution space. Results show that the Scatter Search method is an efficient optimization algorithm for the BWR lattice design and optimization problem. Its main features are the use of heuristic rules from the beginning of the process, which helps direct the optimization toward the solution, and the use of a diversity mechanism in the combination operator, which allows the search space to be covered efficiently. (authors)
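
    The essential scatter-search mechanics, keeping a small reference set and combining pairs of its members into new trial solutions, can be sketched generically as below. The lattice size, fuel types and evaluation function are invented stand-ins for the enrichment/gadolinia assessment performed in the paper.

```python
import random

# Generic scatter-search skeleton: keep a small reference set of solutions and
# create new candidates by combining pairs from it. The lattice size, fuel
# types and scoring function are invented stand-ins for the radial
# enrichment/gadolinia assessment performed in the paper.

N_PINS = 25
FUEL_TYPES = [3.2, 3.6, 3.95, 4.4]

def evaluate(solution):
    """Placeholder objective (lower is better): hit a target average."""
    return abs(sum(solution) / len(solution) - 3.9)

def combine(a, b):
    """Strategic combination: each pin inherits its value from one parent."""
    return [random.choice(pair) for pair in zip(a, b)]

ref_set = [[random.choice(FUEL_TYPES) for _ in range(N_PINS)] for _ in range(10)]
for _ in range(200):
    parent_a, parent_b = random.sample(ref_set, 2)
    child = combine(parent_a, parent_b)
    worst = max(ref_set, key=evaluate)
    if evaluate(child) < evaluate(worst):      # replace the worst member
        ref_set[ref_set.index(worst)] = child  # if the child improves on it

best = min(ref_set, key=evaluate)
print("best average enrichment:", sum(best) / N_PINS)
```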

  1. Benchmark Numerical Simulations of Viscoelastic Fluid Flows with an Efficient Integrated Lattice Boltzmann and Finite Volume Scheme

    Directory of Open Access Journals (Sweden)

    Shun Zou

    2015-02-01

    An efficient IBLF-dts scheme is proposed that integrates a bounce-back LBM and an FVM scheme to solve the Navier-Stokes equations and the constitutive equation, respectively, for the simulation of viscoelastic fluid flows. In order to improve the efficiency, the bounce-back boundary treatment for the LBM is introduced to improve the grid mapping between the LBM and the FVM, and the two processes are decoupled on different time scales according to the polymer relaxation time and the time scale of the solvent Newtonian effect. Critical numerical simulations have been carried out with open-source CFD toolkits to validate the integrated scheme in various benchmark flows at vanishingly low Reynolds number. The results show that the numerical solution with the IBLF-dts scheme agrees well with the exact solution and with the numerical solution of the FVM PISO scheme, and that the efficiency and scalability are remarkably improved under equivalent configurations.

  2. Development of a fuzzy logic method to build objective functions in optimization problems: application to BWR fuel lattice design

    Energy Technology Data Exchange (ETDEWEB)

    Martin-del-Campo, C.; Francois, J.L.; Barragan, A.M. [Universidad Nacional Autonoma de Mexico - Facultad de Ingenieria (Mexico); Palomera, M.A. [Universidad Nacional Autonoma de Mexico - Instituto de Investigaciones en Matematicas Aplicadas y Sistema, Mexico, D. F. (Mexico)

    2005-07-01

    In this paper we develop a methodology based on the Fuzzy Logic technique to build multi-objective functions to be used in optimization processes applied to in-core nuclear fuel management. As an example, we selected the problem of determining optimal radial fuel enrichment and gadolinia distributions in a typical boiling water reactor (BWR) fuel lattice. The methodology is based on the mathematical capability of Fuzzy Logic to model nonlinear functions of arbitrary complexity. The utility of Fuzzy Logic is to map an input space into an output space, and the primary mechanism for doing this is a list of if-then statements called rules. The rules refer to variables and to adjectives that describe those variables; the Fuzzy Logic technique interprets the values in the input vectors and, based on the set of rules, assigns values to the output vector. The methodology was developed for the radial optimization of a BWR lattice, where the optimization algorithm employed is Tabu Search. The global objective is to find the optimal distribution of enrichments and burnable poison concentrations in a 10 x 10 BWR lattice. To this end, a fuzzy control inference system was developed using the Fuzzy Logic Toolbox of Matlab and linked to the Tabu Search optimization process. Results show that Tabu Search combined with Fuzzy Logic performs very well, obtaining lattices with optimal fuel utilization. (authors)
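
    A reduced example of the fuzzy-aggregation idea, mapping lattice parameters through membership functions and combining them into a single figure of merit that a search method such as Tabu Search can rank, is sketched below. The membership shapes, break points and aggregation rule are illustrative assumptions, not the Matlab Fuzzy Logic Toolbox controller described in the paper.

```python
# Toy fuzzy aggregation of two lattice parameters into a single figure of
# merit that a search method (e.g. Tabu Search) could rank. The triangular
# membership functions, their break points and the min-aggregation are
# illustrative assumptions, not the controller built in the paper.

def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set rising from a, peaking at b,
    and falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def figure_of_merit(k_inf, ppf):
    adequate_k = triangular(k_inf, 1.05, 1.10, 1.15)   # "k-infinity is adequate"
    low_peaking = triangular(ppf, 0.95, 1.10, 1.40)    # "peaking is acceptable"
    return min(adequate_k, low_peaking)                # simple AND aggregation

print(figure_of_merit(k_inf=1.09, ppf=1.22))
```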

  3. Simulation of Sound Waves Using the Lattice Boltzmann Method for Fluid Flow: Benchmark Cases for Outdoor Sound Propagation.

    Directory of Open Access Journals (Sweden)

    Erik M Salomons

    Propagation of sound waves in air can be considered as a special case of fluid dynamics. Consequently, the lattice Boltzmann method (LBM) for fluid flow can be used for simulating sound propagation. In this article application of the LBM to sound propagation is illustrated for various cases: free-field propagation, propagation over porous and non-porous ground, propagation over a noise barrier, and propagation in an atmosphere with wind. LBM results are compared with solutions of the equations of acoustics. It is found that the LBM works well for sound waves, but dissipation of sound waves with the LBM is generally much larger than real dissipation of sound waves in air. To circumvent this problem it is proposed here to use the LBM for assessing the excess sound level, i.e. the difference between the sound level and the free-field sound level. The effect of dissipation on the excess sound level is much smaller than the effect on the sound level, so the LBM can be used to estimate the excess sound level for a non-dissipative atmosphere, which is a useful quantity in atmospheric acoustics. To reduce dissipation in an LBM simulation two approaches are considered: (i) reduction of the kinematic viscosity and (ii) reduction of the lattice spacing.

  4. Simulation of Sound Waves Using the Lattice Boltzmann Method for Fluid Flow: Benchmark Cases for Outdoor Sound Propagation.

    Science.gov (United States)

    Salomons, Erik M; Lohman, Walter J A; Zhou, Han

    2016-01-01

    Propagation of sound waves in air can be considered as a special case of fluid dynamics. Consequently, the lattice Boltzmann method (LBM) for fluid flow can be used for simulating sound propagation. In this article application of the LBM to sound propagation is illustrated for various cases: free-field propagation, propagation over porous and non-porous ground, propagation over a noise barrier, and propagation in an atmosphere with wind. LBM results are compared with solutions of the equations of acoustics. It is found that the LBM works well for sound waves, but dissipation of sound waves with the LBM is generally much larger than real dissipation of sound waves in air. To circumvent this problem it is proposed here to use the LBM for assessing the excess sound level, i.e. the difference between the sound level and the free-field sound level. The effect of dissipation on the excess sound level is much smaller than the effect on the sound level, so the LBM can be used to estimate the excess sound level for a non-dissipative atmosphere, which is a useful quantity in atmospheric acoustics. To reduce dissipation in an LBM simulation two approaches are considered: i) reduction of the kinematic viscosity and ii) reduction of the lattice spacing.

  5. Neutronics Benchmarks for the Utilization of Mixed-Oxide Fuel: Joint U.S./Russian Progress Report for Fiscal Year 1997 Volume 2-Calculations Performed in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Primm III, RT

    2002-05-29

    This volume of the progress report provides documentation of reactor physics and criticality safety studies conducted in the US during fiscal year 1997 and sponsored by the Fissile Materials Disposition Program of the US Department of Energy. Descriptions of computational and experimental benchmarks for the verification and validation of computer programs for neutron physics analyses are included. All benchmarks include either plutonium, uranium, or mixed uranium and plutonium fuels. Calculated physics parameters are reported for all of the computational benchmarks and for those experimental benchmarks that the US and Russia mutually agreed in November 1996 were applicable to mixed-oxide fuel cycles for light-water reactors.

  6. Positron annihilation spectroscopy study of lattice defects in non-irradiated doped and un-doped fuels

    Directory of Open Access Journals (Sweden)

    Chollet Mélanie

    2017-01-01

    Fission gas behavior within the fuel structure plays a major role in the safety of nuclear fuels during operation in a nuclear power plant. Fission gas distribution and retention are determined by both the micro-structure and the lattice structure of the fuel matrix. The ADOPT (Advanced Doped Pellet Technology) fuel, containing chromium and aluminum additives, shows larger grain sizes than standard (undoped) UO{sub 2} fuel, enhancing the fission gas retention properties of the matrix. However, the addition of such trivalent cations is also expected to induce defects in the lattice. In this study, we investigated the microstructure of such doped fuels as well as a reference standard UO{sub 2} by positron annihilation spectroscopy (PAS). Although this technique is particularly sensitive to lattice point defects in materials, a wider application in UO{sub 2} research is still missing. The PAS lifetime components were measured in the hotlab facility of PSI using a {sup 22}Na source sandwiched between two 500-μm-thin sample discs. The lifetime values at the center and the rim of both samples, examined to check the radial homogeneity of the pellets, are not significantly different. The mean lifetimes were found to be longer in the ADOPT material, 220 ps, than in standard UO{sub 2}, 190 ps, which indicates a larger presence of additional defects, presumably generated by the dopants. While a two-component decomposition (bulk + one defect component) could be performed for the standard material, only one lifetime component was found in the doped material. The absence of the bulk component in the ADOPT sample indicates saturated positron trapping (i.e., all positrons are trapped at defects). In order to associate a type of lattice defect with each PAS component, the PAS experimental observations were interpreted with respect to existing experimental and modeling studies. This work has shown the efficiency of PAS in detecting lattice point defects in UO{sub 2} produced by Cr and Al oxides

  7. Lattice-Strain Control of Exceptional Activity in Dealloyed Core-Shell Fuel Cell Catalysts

    Energy Technology Data Exchange (ETDEWEB)

    Strasser, Peter

    2011-08-19

    We present a combined experimental and theoretical approach to demonstrate how lattice strain can be used to continuously tune the catalytic activity of the oxygen reduction reaction (ORR) on bimetallic nanoparticles that have been dealloyed. The sluggish kinetics of the ORR is a key barrier to the adaptation of fuel cells and currently limits their widespread use. Dealloyed Pt-Cu bimetallic nanoparticles, however, have been shown to exhibit uniquely high reactivity for this reaction. We first present evidence for the formation of a core-shell structure during dealloying, which involves removal of Cu from the surface and subsurface of the precursor nanoparticles. We then show that the resulting Pt-rich surface shell exhibits compressive strain that depends on the composition of the precursor alloy. We next demonstrate the existence of a downward shift of the Pt d-band, resulting in weakening of the bond strength of intermediate oxygenated species due to strain. Finally, we combine synthesis, strain, and catalytic reactivity in an experimental/theoretical reactivity-strain relationship which provides guidelines for the rational design of strained oxygen reduction electrocatalysts. The stoichiometry of the precursor, together with the dealloying conditions, provides experimental control over the resulting surface strain and thereby allows continuous tuning of the surface electrocatalytic reactivity - a concept that can be generalized to other catalytic reactions.

  8. OECD/NRC PSBT Benchmark: Investigating the CATHARE2 Capability to Predict Void Fraction in PWR Fuel Bundle

    Directory of Open Access Journals (Sweden)

    A. Del Nevo

    2012-01-01

    Accurate prediction of the steam volume fraction and of the occurrence of the boiling crisis (either DNB or dryout) is a key safety-relevant issue. Decades of experience have been built up both in experimental investigation and in code development and qualification; however, there is still a large margin to improve and refine the modelling approaches. The qualification of the traditional methods (system codes) can be further enhanced by validation against high-quality experimental data (e.g., including measurements of local parameters). One of these databases, related to void fraction measurements, is the pressurized water reactor subchannel and bundle tests (PSBT) conducted by the Nuclear Power Engineering Corporation (NUPEC) in Japan. Selected experiments belonging to this database are used for the OECD/NRC PSBT benchmark. The activity presented in the paper is connected with the improvement of current approaches by comparing system code predictions with measured data on void production in PWR-type fuel bundles. It is aimed at contributing to the validation of the numerical models of the CATHARE 2 code, particularly for the prediction of the void fraction distribution at both the subchannel and the bundle scale, for different test bundle configurations and thermal-hydraulic conditions, in both steady-state and transient conditions.

  9. Multi-phase micro-scale flow simulation in the electrodes of a PEM fuel cell by lattice Boltzmann method

    Science.gov (United States)

    Park, J.; Li, X.

    The gas diffusion layer of a polymer electrolyte membrane (PEM) fuel cell is a porous medium generally made of carbon cloth or paper. In the PEM fuel cell literature, the gas diffusion layer has conventionally been modeled as a homogeneous porous medium with a constant permeability. In fact, however, the permeability of such a fibrous porous medium is strongly affected by the fiber orientation, making the permeability non-isotropic. In this work, the lattice Boltzmann (LB) method is applied to the multi-phase flow phenomena in the inhomogeneous gas diffusion layer of a PEM fuel cell. The inhomogeneous porous structure of the carbon cloth and carbon paper has been modeled as void space and porous regions using a Stokes/Brinkman formulation, with the void space and impermeable fiber distributions obtained from various microscopic images. The permeability of the porous medium is calculated and compared to experimental measurements in the literature, showing good agreement. Simulation results for various fiber distributions indicate that the permeability of the medium is strongly influenced by the fiber orientation. The present lattice Boltzmann flow models are extended to multi-phase flow simulations by incorporating a multi-component LB model with inter-particle interaction forces. The model successfully simulates the complicated unsteady behavior of liquid droplet motion in the porous medium, providing a useful tool to investigate the mechanisms of liquid water accumulation and removal in the gas diffusion layer of a PEM fuel cell.
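
    As a rough point of comparison for such permeability results, analytical estimates like the Kozeny-Carman relation for fibrous media are sometimes used as a sanity check; the sketch below evaluates it with assumed porosity, fiber diameter and Kozeny constant, whereas the study above obtains the permeability directly from the simulated flow field.

```python
# Kozeny-Carman estimate of the permeability of a fibrous medium,
#   K = eps**3 * d_f**2 / (16 * k_K * (1 - eps)**2),
# sometimes used as a sanity check for GDL permeabilities. The porosity,
# fiber diameter and Kozeny constant are assumed values; the study above
# extracts the permeability directly from the lattice Boltzmann flow field.

eps = 0.78       # porosity (assumed)
d_f = 8.0e-6     # m, carbon fiber diameter (assumed)
k_K = 4.5        # Kozeny constant for flow across fibers (assumed)

K = eps**3 * d_f**2 / (16.0 * k_K * (1.0 - eps)**2)
print(f"estimated permeability K ~ {K:.2e} m^2")
```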

  10. Benchmarking ENDF/B-VII.0

    Science.gov (United States)

    van der Marck, Steven C.

    2006-12-01

    The new major release VII.0 of the ENDF/B nuclear data library has been tested extensively using benchmark calculations. These were based upon MCNP-4C3 continuous-energy Monte Carlo neutronics simulations, together with nuclear data processed using the code NJOY. Three types of benchmarks were used, viz., criticality safety benchmarks, (fusion) shielding benchmarks, and reference systems for which the effective delayed neutron fraction is reported. For criticality safety, more than 700 benchmarks from the International Handbook of Criticality Safety Benchmark Experiments were used. Benchmarks from all categories were used, ranging from low-enriched uranium, compound fuel, thermal spectrum ones (LEU-COMP-THERM), to mixed uranium-plutonium, metallic fuel, fast spectrum ones (MIX-MET-FAST). For fusion shielding many benchmarks were based on IAEA specifications for the Oktavian experiments (for Al, Co, Cr, Cu, LiF, Mn, Mo, Si, Ti, W, Zr), the Fusion Neutronics Source in Japan (for Be, C, N, O, Fe, Pb), and Pulsed Sphere experiments at Lawrence Livermore National Laboratory (for {sup 6}Li, {sup 7}Li, Be, C, N, O, Mg, Al, Ti, Fe, Pb, D{sub 2}O, H{sub 2}O, concrete, polyethylene and teflon). For testing delayed neutron data more than thirty measurements in widely varying systems were used. Among these were measurements in the Tank Critical Assembly (TCA in Japan) and IPEN/MB-01 (Brazil), both with a thermal spectrum, and two cores in Masurca (France) and three cores in the Fast Critical Assembly (FCA, Japan), all with fast spectra. In criticality safety, many benchmarks were chosen from the category with a thermal spectrum, low-enriched uranium, compound fuel (LEU-COMP-THERM), because this is typical of most current-day reactors, and because these benchmarks were previously underpredicted by as much as 0.5% by most nuclear data libraries (such as ENDF/B-VI.8, JEFF-3.0). The calculated results presented here show that this underprediction is no longer there for ENDF/B-VII.0. The average over 257

  11. A pore-scale model for the cathode electrode of a proton exchange membrane fuel cell by lattice Boltzmann method

    Energy Technology Data Exchange (ETDEWEB)

    Molaeimanesh, Gholam Reza; Akbari, Mohammad Hadi [Shiraz University, Shiraz (Iran, Islamic Republic of)

    2015-03-15

    A pore-scale model based on the lattice Boltzmann method (LBM) is proposed for the cathode electrode of a PEM fuel cell with a heterogeneous and anisotropic porous gas diffusion layer (GDL) and an interdigitated flow field. An active approach is implemented to model multi-component transport in the GDL, which leads to enhanced accuracy, especially at higher activation over-potentials. The core of the paper is the implementation, for the first time, of an electrochemical reaction with an active approach in a multi-component lattice Boltzmann model. After model validation, the capability of the presented model is demonstrated through a parametric study. The effects of activation over-potential, pressure differential between the inlet and outlet gas channels, land-width-to-channel-width ratio, and channel width are investigated. The results show the significant influence of the GDL microstructure on the oxygen distribution and the current density profile.

  12. Solution of a benchmark set problems for BWR and PWR reactors with UO{sub 2} and MOX fuels using CASMO-4; Solucion de un Conjunto de Problemas Benchmark para Reactores BWR y PWR con Combustible UO{sub 2} y MOX Usando CASMO-4

    Energy Technology Data Exchange (ETDEWEB)

    Martinez F, M.A.; Valle G, E. del; Alonso V, G. [IPN, ESFM, 07738 Mexico D.F. (Mexico)]. e-mail: mike_ipn_esfm@hotmail. com

    2007-07-01

    In this work some results are presented for a set of benchmark problems for light water reactors that allow the physics of the fuels of these reactors to be studied. These benchmark problems were proposed by Akio Yamamoto and collaborators in 2002 and include two fuel types: uranium dioxide (UO{sub 2}) and mixed oxide (MOX). The problems cover three different configurations: a unit cell for a fuel rod, a PWR fuel assembly, and a BWR fuel assembly, which allows an in-depth analysis of the issues related to the performance of new-generation, high-burnup fuels in light water reactors. These benchmark problems also help in understanding in-core fuel management for both BWRs and PWRs. The calculations were carried out with CMS (Core Management Software), in particular with CASMO-4, a code designed to perform burnup analyses of fuel rod cells as well as fuel assemblies for both PWRs and BWRs, and which is itself part of the CMS package. (Author)

  13. The effect of coupled mass transport and internal reforming on modeling of solid oxide fuel cells part II: Benchmarking transient response and dynamic model fidelity assessment

    Science.gov (United States)

    Albrecht, Kevin J.; Braun, Robert J.

    2016-02-01

    One- and 'quasi' two-dimensional (2-D) dynamic, interface charge transport models of a solid oxide fuel cell (SOFC), developed previously in a companion paper, are benchmarked against other models and simulated to evaluate the effects of coupled transport and chemistry. Because the reforming reaction can distort the concentration profiles of the species within the anode, a 'quasi' 2-D model that captures porous-media mass transport and electrochemistry is required. The impact of a change in concentration at the triple-phase boundary is twofold: the local Nernst potential and the anode exchange current density are both influenced, thereby altering the current density and temperature distributions of the cell. Thus, the dynamic responses of the cell models are compared and benchmarked against previous channel-level models to gauge the relative importance of capturing in-situ reforming phenomena for cell performance. Simulation results indicate differences in the transient electrochemical response for a step in current density, where the 'quasi' 2-D model predicts a slower rise and fall in cell potential due to the additional volume of the porous media and the mass transport dynamics. Delays in fuel flow rate are shown to increase the difference observed in the electrochemical response of the cells.
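
    The local Nernst potential referred to above depends on the gas composition at the triple-phase boundary through E = E0 + (RT/2F)·ln(p_H2·sqrt(p_O2)/p_H2O). The small helper below evaluates that expression; the standard potential E0 and the partial pressures are assumed illustrative values, not quantities from the model in the paper.

```python
import math

# Local Nernst potential of an H2/H2O-fueled SOFC:
#   E = E0 + (R*T / (2*F)) * ln( p_H2 * sqrt(p_O2) / p_H2O )
# E0 and the partial pressures below are assumed, illustrative values only.

R, F = 8.314, 96485.0          # J/(mol K), C/mol

def nernst_potential(T, p_h2, p_h2o, p_o2, E0=0.99):
    """E0 is a placeholder standard potential (V) at the operating temperature."""
    return E0 + R * T / (2.0 * F) * math.log(p_h2 * math.sqrt(p_o2) / p_h2o)

# A composition shift at the triple-phase boundary (e.g. from reforming and
# porous-media diffusion) changes E directly:
print(nernst_potential(T=1073.0, p_h2=0.6, p_h2o=0.4, p_o2=0.21))
print(nernst_potential(T=1073.0, p_h2=0.4, p_h2o=0.6, p_o2=0.21))
```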

  14. Benchmark physics experiment of metallic-fueled LMFBR at FCA. 2; Experiments of FCA assembly XVI-1 and their analyses

    Energy Technology Data Exchange (ETDEWEB)

    Iijima, Susumu; Oigawa, Hiroyuki; Ohno, Akio; Sakurai, Takeshi; Nemoto, Tatsuo; Osugi, Toshitaka; Satoh, Kunio; Hayasaka, Katsuhisa [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Bando, Masaru

    1993-10-01

    The availability of data and methods for the design of metallic-fueled LMFBRs is examined using the experimental results of FCA assembly XVI-1. The experiments included criticality and reactivity coefficients such as Doppler, sodium void, fuel shifting and fuel expansion. Reaction rate ratios, sample worths and control rod worths were also measured. The analysis was made using three-dimensional diffusion calculations and JENDL-2 cross sections. Predictions of the assembly XVI-1 reactor physics parameters agree reasonably well with the measured values, but for some reactivity coefficients, such as Doppler, large-zone sodium void and fuel shifting, further improvement of the calculation method is needed. (author).

  15. Foreign travel report: Visits to UK, Belgium, Germany, and France to benchmark European spent fuel and waste management technology

    Energy Technology Data Exchange (ETDEWEB)

    Ermold, L.F.; Knecht, D.A.

    1993-08-01

    The ICPP WINCO Spent Fuel and Waste Management Development Program was recently funded by DOE-EM to develop new technologies for immobilizing ICPP spent fuels, sodium-bearing liquid waste, and calcine to a form suitable for disposal. European organizations are heavily involved, in some cases on an industrial scale, in areas of waste management, including spent fuel disposal and HLW vitrification. The purpose of this trip was to acquire first-hand knowledge of European efforts in the handling of spent reactor fuel and nuclear waste management, including their processing and technical capabilities as well as their future planning. Even though some differences exist between European and U.S. DOE waste compositions and regulations, many aspects of the European technologies may be applicable to the U.S. efforts, and several areas offer potential for technical collaboration.

  16. Investigation of GDL compression effects on the performance of a PEM fuel cell cathode by lattice Boltzmann method

    Science.gov (United States)

    Molaeimanesh, G. R.; Nazemian, M.

    2017-08-01

    Proton exchange membrane (PEM) fuel cells, with great potential for application in vehicle propulsion systems, have a promising future. However, further fundamental research is needed to overcome the existing challenges to their wider commercialization. The effects of gas diffusion layer (GDL) compression on the performance of a PEM fuel cell are not well understood, especially via pore-scale simulation techniques that capture the fibrous microstructure of the GDL. In the current investigation, a stochastic microstructure reconstruction method is proposed which can capture GDL microstructure changes under compression. A lattice Boltzmann pore-scale simulation technique is then adopted to simulate the reactive gas flow through 10 different cathode electrodes with dissimilar carbon paper GDLs produced from five different compression levels and two different carbon fiber diameters. The distributions of oxygen mole fraction, water vapor mole fraction and current density for the simulated cases are presented and analyzed. The simulations demonstrate that when the fiber diameter is 9 μm, adding compression leads to lower average current density, while when the fiber diameter is 7 μm the compression effect is not monotonic.
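
    As a rough illustration of what a stochastic GDL reconstruction can look like, the sketch below drops straight, randomly oriented in-plane fibers into a voxel grid and mimics compression by merging through-plane slices. It is a toy under stated assumptions (rectangular fiber cross-sections, slice merging as a compression proxy), not the reconstruction method of the paper.

      import numpy as np

      def reconstruct_gdl(nx=120, ny=120, nz=40, n_fibers=80, fiber_radius=3, seed=0):
          # Boolean voxel grid; True marks carbon-fiber solid.
          rng = np.random.default_rng(seed)
          solid = np.zeros((nx, ny, nz), dtype=bool)
          X, Y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
          for _ in range(n_fibers):
              theta = rng.uniform(0.0, np.pi)                   # in-plane fiber orientation
              x0, y0 = rng.uniform(0, nx), rng.uniform(0, ny)   # a point on the fiber axis
              z0 = rng.integers(fiber_radius, nz - fiber_radius)
              dist = np.abs(-np.sin(theta) * (X - x0) + np.cos(theta) * (Y - y0))
              hit = dist <= fiber_radius                        # (x, y) columns crossed by the fiber
              solid[:, :, z0 - fiber_radius:z0 + fiber_radius + 1] |= hit[:, :, None]
          return solid

      def compress(solid, thickness_ratio):
          # Merge neighbouring z-slices so the layer gets thinner and the solid fraction rises.
          nz = solid.shape[2]
          edges = np.linspace(0, nz, max(1, int(round(nz * thickness_ratio))) + 1).astype(int)
          return np.stack([solid[:, :, a:b].any(axis=2) for a, b in zip(edges[:-1], edges[1:])], axis=2)

      gdl = reconstruct_gdl()
      print("porosity, uncompressed:   ", 1.0 - gdl.mean())
      print("porosity at 75% thickness:", 1.0 - compress(gdl, 0.75).mean())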

  17. Preliminary study of the tight lattice pressured heavy water reactor loaded with Pu/U and Th/U mixed fuels

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    To improve nuclear fuel utilization efficiency and prolong fuel cycle burnup, a tight-pitch lattice pressurized heavy water reactor was investigated as an alternative for the next generation of power reactors. It is shown that achieving a high conversion ratio together with a negative coolant void reactivity coefficient is a challenge in the reactor core physics design. Various techniques were proposed to solve these problems. In this work, a pressurized heavy water reactor concept with a tight-pitch lattice and mixed fuel assemblies was investigated. Using numerical simulation, it is demonstrated that a reactor core mixed with Pu/U and Th/U assemblies can achieve a high conversion ratio (0.98), long burnup (60 GWd/t) and negative void reactivity coefficients.
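
    For reference, the conversion ratio quoted above is the ratio of fissile material produced (by capture in fertile nuclides such as Th-232 and U-238) to fissile material destroyed. A minimal sketch of that bookkeeping, with made-up reaction rates rather than values from the paper:

      def conversion_ratio(fertile_capture_rate, fissile_absorption_rate):
          # CR = fissile production rate / fissile destruction rate.
          return fertile_capture_rate / fissile_absorption_rate

      # Hypothetical core-averaged reaction rates (arbitrary units) giving a CR near the reported 0.98:
      print(conversion_ratio(fertile_capture_rate=0.98, fissile_absorption_rate=1.00))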

  18. Comparison/Validation Study of Lattice Boltzmann and Navier Stokes for Various Benchmark Applications: Report 1 in Discrete Nano-Scale Mechanics and Simulations Series

    Science.gov (United States)

    2014-09-15

    The Lattice Boltzmann Method (LBM) has become increasingly popular as an alternative approach to traditional NS-based techniques for modeling various... Abbreviations used in the report include: • CAVS: Center for Advanced Vehicular Systems • CFD: computational fluid dynamics • DEM: discrete element method • FDM: finite difference method • ...: Mach number • MRT: multiple relaxation time • NS: Navier-Stokes method • PISO: pressure implicit with splitting operator • Re: Reynolds number

  19. Library Benchmarking

    Directory of Open Access Journals (Sweden)

    Wiji Suwarno

    2017-02-01

    Full Text Available The term benchmarking is encountered in the implementation of total quality management (TQM, in Indonesian also termed holistic quality management) because benchmarking is a tool for finding ideas and learning from other libraries. Benchmarking is a systematic and continuous process of measuring and comparing an organization's business processes in order to obtain information that can help the organization improve its performance.

  20. Financial Benchmarking

    OpenAIRE

    2012-01-01

    This bachelor's thesis focuses on the financial benchmarking of TULIPA PRAHA s.r.o. The aim of this work is to evaluate the financial situation of the company, identify its strengths and weaknesses, and find out how efficient the company's performance is in comparison with top companies in the same field, using the INFA benchmarking diagnostic system of financial indicators. The theoretical part includes the characteristics of financial analysis, which financial benchmarking is based on a...

  1. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production-theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...

  2. Application of lattice Boltzmann method to a micro-scale flow simulation in the porous electrode of a PEM fuel cell

    Science.gov (United States)

    Park, J.; Matsubara, M.; Li, X.

    The electrode of a PEM fuel cell is a porous medium generally made of carbon cloth or paper. Such a porous electrode has been widely modeled as a homogeneous porous medium with constant permeability in the PEM fuel cell literature. In fact, most gas diffusion media are not homogeneous and have anisotropic permeability. In the case of carbon cloth, the porous structure consists of carbon fiber tows (the bundles of carbon fibers) and the void spaces among the tows. The combined effect of the void space and the tow permeability results in the effective permeability of the porous electrode. In this work, the lattice Boltzmann method is applied to the simulation of the flow in the electrode of a PEM fuel cell. The electrode is modeled as void space plus a porous region of given permeability, and the Stokes and Brinkman equations are solved in the flow field using the lattice Boltzmann model. The effective permeability of the porous medium is calculated and compared to an analytical calculation, showing good agreement. It is shown in three-dimensional simulations that the permeability of the porous medium is strongly dependent on the fiber tow orientation. The lattice Boltzmann method is an efficient and effective numerical scheme for analyzing the flow in a complicated geometry such as a porous medium.
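
    A common way to extract the effective permeability from such a pore-scale simulation is Darcy's law applied to the domain-averaged (superficial) velocity. The post-processing sketch below assumes the velocity field, viscosity and imposed pressure gradient are available from the flow solver; the numbers are illustrative lattice-unit values, not results from the paper.

      import numpy as np

      def effective_permeability(u_x, mu, dp_dx):
          # Darcy's law: <u_x> = -(k / mu) * dp/dx  =>  k = -mu * <u_x> / (dp/dx).
          # u_x is the x-velocity on the whole grid (zero inside the solid), so its mean
          # is the superficial velocity.
          return -mu * float(np.mean(u_x)) / dp_dx

      # Illustrative values in lattice units:
      u_x = np.full((64, 64, 64), 1.0e-4)
      print(effective_permeability(u_x, mu=1.0 / 6.0, dp_dx=-1.0e-5))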

  3. Participation in benchmark MATIS-H of NEA/OCDE: uses CFD codes applied to nuclear safety. Study of the spacer grids in the fuel elements; Participacion en el Benchmark Matis-H de la NEA/OCDE: usos de codigos CFD aplicados a seguridad nuclear. Estudio de las rejillas espaciadoras en los elementos combustibles

    Energy Technology Data Exchange (ETDEWEB)

    Pena-Monferrer, C.; Chiva, S.; Munoz-cobo, J. L.; Vela, E.

    2012-07-01

    This paper describes participation in the MATIS-H benchmark, promoted by the OECD/NEA and KAERI, involving the study of turbulent flow in a rod bundle with spacer grids in an experimental facility. Its aim is the analysis of the hydraulic behavior of turbulent flow in the subchannels of the fuel elements, which is essential for improving safety margins in normal and transient operation and for maximizing the use of nuclear energy through an optimal design of the spacer grids.

  4. A three-dimensional pore-scale model of the cathode electrode in polymer-electrolyte membrane fuel cell by lattice Boltzmann method

    Science.gov (United States)

    Molaeimanesh, G. R.; Akbari, M. H.

    2014-07-01

    High power density, low operating temperature, high efficiency and low emissions have given proton exchange membrane (PEM) fuel cells the most promising future among all types of fuel cells. The porous electrodes of PEM fuel cells have a complicated, non-homogeneous, anisotropic microstructure. Therefore, pore-scale modeling techniques such as the lattice Boltzmann method, which can capture non-homogeneous and anisotropic microstructures, have recently gained great attention. In the present study, a three-dimensional lattice Boltzmann model of a PEM fuel cell cathode electrode is proposed in which the electrochemical reaction on the catalyst layer and the microstructure of the GDL are taken into account. The model enables us to simulate single-phase, multi-species reactive flow in a heterogeneous, anisotropic gas diffusion layer through an active approach. To show the capability of the proposed model, reactive flow in three reconstructed GDLs with different anisotropic characteristics is simulated to investigate the effects of the GDL microstructure on the species and current density distributions. The results demonstrate that when the carbon fibers are predominantly oriented normal to the catalyst layer, the species density distribution is thicker and more disturbed. The current density also shows a larger variation over the catalyst layer in such a case.

  5. Lattice parameter changes associated with the rim-structure formation in high burn-up UO2 fuels by micro X-ray diffraction

    Science.gov (United States)

    Spino, J.; Papaioannou, D.

    2000-10-01

    Radial variations of the lattice parameter and peak width of two high burn-up UO2 fuels (67 and 80 GWd/tM) were measured by a specially developed micro X-ray diffraction technique, allowing spectra acquisition with 30 μm spatial resolution. The results showed a significant but constant peak broadening, and a lattice parameter that increased towards the pellet edge and decreased again within the rim zone. This lattice contraction coincided with other property changes in the rim region, i.e., porosity increase, hardness decrease and Xe depletion. In terms of local burn-ups, the lattice contraction followed the rate of the matrix Xe depletion measured by EMPA, greatly exceeding the contraction rate due to dissolved fission products. The observed behaviour can be explained equally by a saturation of single interstitials with subsequent recombination with excess vacancies, and by the saturation and enlargement of dislocation loops. The concentrations and sizes of the defects involved and their possible relation to the rim structure formation are discussed.
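
    For a cubic fluorite lattice such as UO2, the lattice parameter follows directly from a diffraction peak position via Bragg's law. The sketch below shows that conversion; the peak position and the Cu K-alpha wavelength are illustrative assumptions, not the conditions of the measurement described above.

      import math

      def cubic_lattice_parameter(two_theta_deg, hkl, wavelength_nm):
          # Bragg's law with d = a / sqrt(h^2 + k^2 + l^2) for a cubic lattice:
          # a = wavelength * sqrt(h^2 + k^2 + l^2) / (2 sin(theta)).
          h, k, l = hkl
          theta = math.radians(two_theta_deg / 2.0)
          return wavelength_nm * math.sqrt(h * h + k * k + l * l) / (2.0 * math.sin(theta))

      # An assumed (111) reflection near 28.3 degrees with Cu K-alpha (0.15406 nm) recovers
      # roughly the 0.547 nm lattice parameter of unirradiated UO2.
      print(cubic_lattice_parameter(28.3, (1, 1, 1), 0.15406))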

  6. Interactive benchmarking

    DEFF Research Database (Denmark)

    Lawson, Lartey; Nielsen, Kurt

    2005-01-01

    We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional distance functions. The frontier is given by an explicit quantile, e.g. "the best 90%". Using the explanatory model of the inefficiency, the user can adjust the frontiers by submitting state variables that influence the inefficiency. An efficiency study of Danish dairy farms is implemented in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence the technical efficiency.

  7. Comparison of MCNP4B and WIMS-AECL calculations of coolant-void-reactivity effects for uniform lattices of CANDU fuel

    Energy Technology Data Exchange (ETDEWEB)

    Kozier, K.S

    1999-05-01

    This paper compares the results of coolant-void reactivity (CVR) reactor-physics calculations performed using the Monte Carlo N-particle transport code, MCNP version 4B, with those obtained using Atomic Energy of Canada Limited's (AECL's) latest version of the Winfrith improved multigroup scheme (WIMS) code, WIMS-AECL version 2-5c. Cross sections derived from the evaluated nuclear data file version B-VI (ENDF/B-VI) are used for both the WIMS-AECL and MCNP4B calculations. The comparison is made for uniform lattices at room temperature containing either fresh natural uranium or mixed oxide (MOX) 37-element CANDU fuel. The MOX fuel composition corresponds roughly to that of irradiated CANDU fuel at a burnup of about 4500 MWd/tU. The level of agreement between the CVR predictions of WIMS-AECL and MCNP4B is studied as a function of lattice buckling (a measure of the curvature of the neutron-flux distribution) over the range from 0.0 to 4.1 m{sup -2} . For the cases studied, it is found that the absolute k values calculated by WIMS-AECL are higher than those of MCNP4B by several mk (1 mk is a change of 0.001 in k), amounts that depend on the fuel type being modelled and the particular cross-section data used. However, the agreement between WIMS-AECL and MCNP4B is much better for the CVR (i.e., the {delta}k on coolant voiding), and is relatively insensitive to the fuel type. (author)
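
    As a small aid to the units used above, the sketch below expresses the coolant-void reactivity as the change in the multiplication factor on voiding, in mk (1 mk = 0.001 in k), which is how the comparison is quoted; the input k values are illustrative only.

      def cvr_mk(k_cooled, k_voided):
          # Coolant-void reactivity as the delta-k on voiding, expressed in mk.
          return (k_voided - k_cooled) * 1000.0

      # Made-up multiplication factors, not results from the paper:
      print(cvr_mk(k_cooled=1.120, k_voided=1.136))   # +16 mk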

  8. Comparison of MCNP4B and WIMS-AECL calculations of coolant-void-reactivity effects for uniform lattices of CANDU fuel

    Energy Technology Data Exchange (ETDEWEB)

    Kozier, K.S. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    1999-07-01

    This paper compares the results of coolant-void reactivity (CVR) reactor-physics calculations performed using the Monte Carlo N-particle transport code, MCNP version 4B, with those obtained using Atomic Energy of Canada Limited's (AECL's) latest version of the Winfrith improved multigroup scheme (WIMS) code, WIMS-AECL version 2-5c. Cross sections derived from the evaluated nuclear data file version B-VI (ENDF/B-VI) are used for both the WIMS-AECL and MCNP4B calculations. The comparison is made for uniform lattices at room temperature containing either fresh natural uranium or mixed oxide (MOX) 37-element CANDU fuel. The MOX fuel composition corresponds roughly to that of irradiated CANDU fuel at a burnup of about 4500 MWd/tU. The level of agreement between the CVR predictions of WIMS-AECL and MCNP4B is studied as a function of lattice buckling (a measure of the curvature of the neutron-flux distribution) over the range from 0.0 to 4.1 m{sup -2}. For the cases studied, it is found that the absolute keff values calculated by WIMS-AECL are higher than those of MCNP4B by several mk (1 mk is a change of 0.001 in keff), amounts that depend on the fuel type being modelled and the particular cross-section data used. However, the agreement between WIMS-AECL and MCNP4B is much better for the CVR (i.e., the {delta}keff on coolant voiding), and is relatively insensitive to the fuel type. (author)

  9. Technical benchmarking of fossil energy sources to regenerative substitute fuels for medium speed for-stroke diesel engines; Technisches Benchmark fossiler Energietraeger zu regenerativen Substitutbrennstoffen fuer mittelschnelllaufende 4-Takt Dieselgeneratoren

    Energy Technology Data Exchange (ETDEWEB)

    Schillings, Hubert

    2010-07-01

    Diesel engines were originally designed for fossil fuels. For this reason, operation with substitute fuels poses a special challenge. For example, native oils/fats comprise a range of more than 5000 chemically different substances, each with individual characteristics that have to be considered during operation. Hence, an adapted mode of operation is necessary. Typical damages include: 1. Cavitation pitting in the fuel injection system. 2. Precipitation of combustion residues in the inlet and outlet. 3. Engine damage caused by lube oil dilution. 4. Engine damage caused by agglutination of the lube oil. 5. Plugging of fuel lines and filter systems caused by polymerization reactions. Practice has shown that the operational behaviour of engines is not generally reproducible: engines of the same type and manufacturer show different operating performance. This is due to catalytic effects which can be traced back to the composition of the material grades. Traditionally, the material grades of these engines are designed for conventional fuels. In contrast, most of the substitute fuels (oils/fats) have distinctly acidic characteristics. The question of the extent to which catalytically active surfaces promote or prevent polymerization is the subject of current research. (orig.)

  10. Teledyne Energy Systems, Inc., Proton Exchange Membrane (PEM) Fuel Cell Engineering Model Powerplant. Test Report: Initial Benchmark Tests in the Original Orientation

    Science.gov (United States)

    Loyselle, Patricia; Prokopius, Kevin

    2011-01-01

    Proton Exchange Membrane (PEM) fuel cell technology is the leading candidate to replace the alkaline fuel cell technology, currently used on the Shuttle, for future space missions. During a 5-yr development program, a PEM fuel cell powerplant was developed. This report details the initial performance evaluation test results of the powerplant.

  11. Extended calculations of OECD/NEA phase II-C burnup credit criticality benchmark problem for PWR spent fuel transport cask by using MCNP-4B2 code and JENDL-3.2 library

    Energy Technology Data Exchange (ETDEWEB)

    Kuroishi, Takeshi; Hoang, Anh Tuan; Nomura, Yasushi; Okuno, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    The reactivity effect of the asymmetry of the axial burnup profile in burnup credit criticality safety is studied for a realistic PWR spent fuel transport cask proposed in the current OECD/NEA Phase II-C benchmark problem. The axial burnup profiles are simulated in 21 material zones based on in-core flux measurements, varying from strong asymmetry to more or less no asymmetry. Criticality calculations in a 3-D model have been performed using the continuous energy Monte Carlo code MCNP-4B2 and the nuclear data library JENDL-3.2. Calculation conditions are determined with consideration of the axial fission source convergence. Calculations are carried out not only for the cases proposed in the benchmark but also for additional cases assuming a symmetric burnup profile. The actinide-only approach assumed for the first domestic introduction of burnup credit into criticality evaluation is also considered, in addition to the actinide plus fission product approach adopted in the benchmark. The calculated results show that k{sub eff} and the end effect increase almost linearly with increasing burnup axial offset, which is defined as one of the typical parameters characterizing the intensity of the axial burnup asymmetry. The end effect is more sensitive to the asymmetry of the burnup profile at higher burnup. For an axially distributed burnup, the axial fission source distribution becomes strongly asymmetric as its peak shifts toward the top end of the fuel's active zone, where the local burnup is less than that at the bottom end. The peak of the fission source distribution becomes higher with the increase of either the asymmetry of the burnup profile or the assembly-averaged burnup. The conservatism of the assumption of uniform axial burnup based on the actinide-only approach is estimated quantitatively in comparison with the k{sub eff} result calculated with the experiment-based strongest asymmetric axial burnup profile with the actinide plus fission product approach. (author)
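
    A minimal sketch of the two quantities discussed above, the burnup axial offset and the end effect. The axial-offset formula below (top-half minus bottom-half burnup over their sum) is a commonly used convention assumed here, not necessarily the benchmark's exact definition, and the k values are made up.

      import numpy as np

      def burnup_axial_offset(zone_burnups_bottom_to_top):
          # (top-half burnup - bottom-half burnup) / their sum, as a measure of axial asymmetry.
          b = np.asarray(zone_burnups_bottom_to_top, dtype=float)
          half = len(b) // 2
          bottom, top = b[:half].sum(), b[-half:].sum()
          return (top - bottom) / (top + bottom)

      def end_effect_mk(k_distributed, k_uniform):
          # Reactivity difference between the distributed and uniform axial burnup models, in mk.
          return (k_distributed - k_uniform) * 1000.0

      profile = np.linspace(1.2, 0.7, 21)       # 21 zones, bottom to top, lower burnup at the top
      print(burnup_axial_offset(profile))        # negative: burnup skewed toward the bottom
      print(end_effect_mk(0.942, 0.930))         # +12 mk for these made-up k values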

  13. Nanoscale study of reactive transport in catalyst layer of proton exchange membrane fuel cells with precious and non-precious catalysts using lattice Boltzmann method

    CERN Document Server

    Chen, Li; Kang, Qinjun; Holby, Edward F; Tao, Wen-Quan

    2014-01-01

    High-resolution, multicomponent porous structures of the catalyst layer (CL) in proton exchange membrane fuel cells are reconstructed using a reconstruction method called the quartet structure generation set. Characterization analyses of the nanoscale structures are performed, including pore size distribution, specific area and phase connectivity. Pore-scale simulation methods based on the lattice Boltzmann method are developed and used to predict macroscopic transport properties, including the effective diffusivity and proton conductivity. Nonuniform distributions of ionomer in the CL generate more tortuous pathways for reactant transport and greatly reduce the effective diffusivity. The tortuosity of the CL is much higher than that assumed by the conventional Bruggeman equation. Knudsen diffusion plays a significant role in oxygen diffusion and significantly reduces the effective diffusivity. Reactive transport inside the CL is also investigated. Although the reactive surface area of the non-precious metal catalyst (NPMC) CL is much higher t...
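
    To make the Knudsen and Bruggeman effects mentioned above concrete, the sketch below combines bulk and Knudsen diffusivities (Bosanquet approximation) and applies either a simulated tortuosity or the Bruggeman porosity^1.5 correction. The pore size, temperature and diffusivities are illustrative values, not numbers from the study.

      import math

      R = 8.314  # gas constant, J/(mol K)

      def knudsen_diffusivity(pore_diameter_m, T, molar_mass_kg_per_mol):
          # D_K = (d_p / 3) * sqrt(8 R T / (pi M))
          return (pore_diameter_m / 3.0) * math.sqrt(8.0 * R * T / (math.pi * molar_mass_kg_per_mol))

      def effective_diffusivity(d_bulk, d_knudsen, porosity, tortuosity=None):
          # Bosanquet combination of bulk and Knudsen resistances, then a structural correction:
          # porosity / tortuosity when a simulated tortuosity is available, Bruggeman otherwise.
          d_pore = 1.0 / (1.0 / d_bulk + 1.0 / d_knudsen)
          if tortuosity is None:
              return d_pore * porosity ** 1.5
          return d_pore * porosity / tortuosity

      d_k = knudsen_diffusivity(50e-9, 353.0, 0.032)   # O2 in ~50 nm pores at 80 C, about 8e-6 m^2/s
      print(effective_diffusivity(2.0e-5, d_k, porosity=0.4))                  # Bruggeman correction
      print(effective_diffusivity(2.0e-5, d_k, porosity=0.4, tortuosity=4.0))  # higher simulated tortuosity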

  14. Kvantitativ benchmark - Produktionsvirksomheder

    DEFF Research Database (Denmark)

    Sørensen, Ole H.; Andersen, Vibeke

    Report with the results of the quantitative benchmark of the production companies in the VIPS project.

  15. Benchmarking in Student Affairs.

    Science.gov (United States)

    Mosier, Robert E.; Schwarzmueller, Gary J.

    2002-01-01

    Discusses the use of benchmarking in student affairs, focusing on issues related to student housing. Provides examples of how benchmarking has influenced administrative practice at many institutions. (EV)

  16. Benchmarking biofuels; Biobrandstoffen benchmarken

    Energy Technology Data Exchange (ETDEWEB)

    Croezen, H.; Kampman, B.; Bergsma, G.

    2012-03-15

    A sustainability benchmark for transport biofuels has been developed and used to evaluate the various biofuels currently on the market. For comparison, electric vehicles, hydrogen vehicles and petrol/diesel vehicles were also included. A range of studies as well as growing insight are making it ever clearer that biomass-based transport fuels may have just as big a carbon footprint as fossil fuels like petrol or diesel, or even bigger. At the request of Greenpeace Netherlands, CE Delft has brought together current understanding on the sustainability of fossil fuels, biofuels and electric vehicles, with particular focus on the performance of the respective energy carriers on three sustainability criteria, with the first weighted most heavily: (1) Greenhouse gas emissions; (2) Land use; and (3) Nutrient consumption.

  17. Validation of VHTRC calculation benchmark of critical experiment using the MCB code

    Directory of Open Access Journals (Sweden)

    Stanisz Przemysław

    2016-01-01

    Full Text Available The calculation benchmark problem of the Very High Temperature Reactor Critical assembly (VHTRC), a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest versions of the nuclear data libraries based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which makes it possible to improve the accuracy of neutron transport calculations and may help in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of the neutron transport calculation results, which in turn depend on the accuracy of the nuclear data libraries. Thus, evaluation of the applicability of the libraries to VHTR modelling is an important subject. We compared the numerical results with experimental measurements using two versions of the available nuclear data (ENDF-B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of the nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The calculated keff values show good agreement with each other and with the experimental data within the 1σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we propose appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new nuclear data libraries.

  18. Benchmarking v ICT

    OpenAIRE

    Blecher, Jan

    2009-01-01

    The aim of this paper is to describe the benefits of benchmarking IT in a wider context and the scope of benchmarking in general. I specify benchmarking as a process and mention basic rules and guidelines. Further, I define IT benchmarking domains and describe the possibilities for their use. The best known type of IT benchmark is the cost benchmark, which represents only a subset of benchmarking opportunities. In this paper, the cost benchmark is rather a notional first step towards benchmarking's contribution to the company. IT benchmark...

  19. DSP Platform Benchmarking

    OpenAIRE

    Xinyuan, Luo

    2009-01-01

    Benchmarking of DSP kernel algorithms was conducted in this thesis on a DSP processor used for teaching in the course TESA26 in the Department of Electrical Engineering. It includes benchmarking of cycle count and memory usage. The goal of the thesis is to evaluate the quality of a single-MAC DSP instruction set and to provide suggestions for further improvements to the instruction set architecture accordingly. The scope of the thesis is limited to benchmarking the processor based only on assembly coding. The...

  20. Validation of WIMS-CANDU using Pin-Cell Lattices

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won Young; Min, Byung Joo; Park, Joo Hwan [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2006-07-01

    WIMS-CANDU is a lattice code with depletion capability for the analysis of reactor physics problems related to design and safety. The WIMS-CANDU code has been developed from WIMSD5B, a version of the WIMS code released from the OECD/NEA data bank in 1998. The lattice code POWDERPUFS-V (PPV) has been used for the physics design and analysis of natural uranium fuel for the CANDU reactor. However, since the application of PPV is limited to fresh fuel due to its empirical correlations, the WIMS-AECL code was developed by AECL to replace PPV. The WIMS-CANDU code is also being developed to perform the physics analysis of the presently operating CANDU reactors as a replacement for PPV. As part of the development of WIMS-CANDU, the U{sup 238} absorption cross-section in the nuclear data library of WIMS-CANDU was updated, and WIMS-CANDU was validated using the benchmark problems for pin-cell lattices TRX-1, TRX-2, Bapl-1, Bapl-2 and Bapl-3. The results from WIMS-CANDU and WIMS-AECL were compared with the experimental data.

  1. Large scale simulation of liquid water transport in a gas diffusion layer of polymer electrolyte membrane fuel cells using the lattice Boltzmann method

    Science.gov (United States)

    Sakaida, Satoshi; Tabe, Yutaka; Chikahisa, Takemi

    2017-09-01

    A method for large-scale simulation with the lattice Boltzmann method (LBM) is proposed for liquid water movement in a gas diffusion layer (GDL) of polymer electrolyte membrane fuel cells. The LBM is able to analyze two-phase flows in complex structures; however, the simulation domain is limited due to heavy computational loads. This study investigates a variety of means to reduce the computational load and increase the simulation area. One is applying an LBM that treats the two phases as having the same density, while keeping the computation numerically stable at large time steps. The applicability of this approach is confirmed by comparing the results with rigorous simulations using the actual densities. The second is establishing the maximum limit of the capillary number that maintains flow patterns similar to the precise simulation; this is attempted because the computational load is inversely proportional to the capillary number. The results show that the capillary number can be increased to 3.0 × 10^-3, whereas actual operation corresponds to Ca = 10^-5 to 10^-8. The limit is also investigated experimentally using an enlarged-scale model satisfying similarity conditions for the flow. Finally, a demonstration is made of the effects of pore uniformity in the GDL as an example of a large-scale simulation covering a channel.
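
    A small sketch of the capillary-number bookkeeping behind the statement above: Ca compares viscous to capillary forces, and since the computational load scales roughly as 1/Ca, running at an artificially high Ca that still reproduces the flow pattern buys a proportional speedup. The fluid properties below are generic water/air-like values, not the paper's.

      def capillary_number(mu, velocity, surface_tension):
          # Ca = mu * u / sigma for the invading liquid phase.
          return mu * velocity / surface_tension

      def approx_speedup(ca_simulated, ca_actual):
          # With computational load ~ 1/Ca, simulating at a larger Ca gives roughly this factor.
          return ca_simulated / ca_actual

      ca_real = capillary_number(mu=1.0e-3, velocity=1.0e-3, surface_tension=0.0625)  # ~1.6e-5
      print(ca_real)
      print(approx_speedup(ca_simulated=3.0e-3, ca_actual=ca_real))  # roughly 190 for these inputs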

  2. Modeling of mass and charge transport in a solid oxide fuel cell anode structure by a 3D lattice Boltzmann approach

    Science.gov (United States)

    Paradis, Hedvig; Andersson, Martin; Sundén, Bengt

    2016-08-01

    A 3D model at microscale by the lattice Boltzmann method (LBM) is proposed for part of an anode of a solid oxide fuel cell (SOFC) to analyze the interaction between the transport and reaction processes and structural parameters. The equations of charge, momentum, heat and mass transport are simulated in the model. The modeling geometry is created with randomly placed spheres to resemble the part of the anode structure close to the electrolyte. The electrochemical reaction processes are captured at specific sites where spheres representing Ni and YSZ materials are present with void space. This work focuses on analyzing the effect of structural parameters such as porosity, and percentage of active reaction sites on the ionic current density and concentration of H2 using LBM. It is shown that LBM can be used to simulate an SOFC anode at microscale and evaluate the effect of structural parameters on the transport processes to improve the performance of the SOFC anode. It was found that increasing the porosity from 30 to 50 % decreased the ionic current density due to a reduction in the number of reaction sites. Also the consumption of H2 decreased with increasing porosity. When the percentage of active reaction sites was increased while the porosity was kept constant, the ionic current density increased. However, the H2 concentration was slightly reduced when the percentage of active reaction sites was increased. The gas flow tortuosity decreased with increasing porosity.

  3. Numerical simulation of liquid water and gas flow in a channel and a simplified gas diffusion layer model of polymer electrolyte membrane fuel cells using the lattice Boltzmann method

    OpenAIRE

    Tabe, Yutaka; Lee, Yongju; Chikahisa, Takemi; Kozakai, Masaya

    2009-01-01

    Numerical simulations using the lattice Boltzmann method (LBM) are developed to elucidate the dynamic behavior of condensed water and gas flow in a polymer electrolyte membrane (PEM) fuel cell. Here, the calculation process of the LBM simulation is improved to extend the simulation to a porous medium like a gas diffusion layer (GDL), and a stable and reliable simulation of two-phase flow with large density differences in the porous medium is established. It is shown that dynamic capillary fin...

  4. Aquatic Life Benchmarks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...

  5. Benchmarking a DSP processor

    OpenAIRE

    Lennartsson, Per; Nordlander, Lars

    2002-01-01

    This Master thesis describes the benchmarking of a DSP processor. Benchmarking means measuring the performance in some way. In this report, we have focused on the number of instruction cycles needed to execute certain algorithms. The algorithms we have used in the benchmark are all very common in signal processing today. The results we have reached in this thesis have been compared to benchmarks for other processors, performed by Berkeley Design Technology, Inc. The algorithms were programm...

  6. Reevaluation of JACS code system benchmark analyses of the heterogeneous system. Fuel rods in U+Pu nitric acid solution system

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Tomoyuki; Miyoshi, Yoshinori; Katakura, Jun-ichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    In order to evaluate the accuracy of criticality calculations performed with the combination of the multi-group constant library MGCL and the 3-dimensional Monte Carlo code KENO-IV within the criticality safety evaluation code system JACS, benchmark calculations were carried out from 1980 to 1982. Some cases were seen in which the calculated neutron multiplication factor for a heterogeneous system was less than 0.95. In this report, the heterogeneous systems of U+Pu nitric acid solution containing neutron poison described in JAERI-M 9859 were re-calculated to identify the cause. The present study has shown that the k{sub eff} value less than 0.95 given in JAERI-M 9859 is caused by the fact that the water reflector below the cylindrical container was not taken into consideration in the KENO-IV calculation model. By taking the water reflector into account, the KENO-IV calculation gives a k{sub eff} value greater than 0.95 and good agreement with the experiment. (author)

  7. Analysis of nuclear characteristics and fuel economics for PWR core with homogeneous thorium fuels

    Energy Technology Data Exchange (ETDEWEB)

    Joo, H. K.; Noh, J. M.; Yoo, J. W.; Song, J. S.; Kim, J. C.; Noh, T. W

    2000-12-01

    The nuclear core characteristics and economics of a once-through homogenized thorium cycle for a PWR were analyzed. The lattice code HELIOS was qualified against the BNL and B&W critical experiments and the IAEA numerical benchmark problem in advance of the core analysis. The infinite multiplication factor and the evolution of the main isotopes with fuel burnup were investigated to assess the depletion characteristics of thorium fuel. The reactivity of thorium fuel at the beginning of irradiation is smaller than that of uranium fuel having the same inventory of {sup 235}U, but it decreases with burnup more slowly than in UO{sub 2} fuel. The gadolinia worth in a thorium fuel assembly is also slightly smaller than in UO{sub 2} fuel. The inventory of {sup 233}U converted from {sup 232}Th is proportional to the initial mass of {sup 232}Th and is about 13 kg per tonne of initial heavy metal. The following characteristics are observed for the thorium fuel cycle compared with the UO{sub 2} cycle: shorter cycle length, more positive MTC at EOC, more negative FTC, and similar boron and control rod worths. The fuel economics of the thorium cycle were analyzed by investigating the natural uranium requirements, the separative work requirements, and the cost of burnable poison rods. Even though fewer burnable poison rods are required in the thorium fuel cycle, the costs for the natural uranium and separative work requirements are increased. Thus, within the scope of this study (a once-through cycle, a homogenized fuel concept, and the same fuel management scheme as the uranium cycle), the thorium fuel cycle for a PWR does not have any economic incentive in preference to uranium.

  8. The COST Benchmark

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Tiesyte, Dalia; Tradisauskas, Nerius

    2006-01-01

    ..., and more are underway. As a result, there is an increasing need for an independent benchmark for spatio-temporal indexes. This paper characterizes the spatio-temporal indexing problem and proposes a benchmark for the performance evaluation and comparison of spatio-temporal indexes. Notably, the benchmark...

  9. Lattice Bosons

    CERN Document Server

    Chakrabarti, J; Bagchi, B; Chakrabarti, Jayprokas; Basu, Asis; Bagchi, Bijon

    2000-01-01

    Fermions on the lattice have bosonic excitations generated from the underlying periodic background. These, the lattice bosons, arise near the empty band or when the bands are nearly full. They do not depend on the nature of the interactions and exist for any fermion-fermion coupling. We discuss these lattice boson solutions for the Dirac Hamiltonian.

  10. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  11. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    Full Text Available In the face of global competition and the rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess an institution's competitive position and learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating the premises for using benchmarking in HEIs. It also contains a detailed examination of the types, approaches and scope of benchmarking initiatives. This thorough insight into benchmarking applications enabled the development of a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects and relates them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The presented examples were chosen in order to exemplify different approaches to benchmarking in the higher education setting. The study was performed on the basis of published reports from benchmarking projects, the scientific literature and the author's experience from active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.

  12. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  13. Polarization response of RHIC electron lens lattices

    Directory of Open Access Journals (Sweden)

    V. H. Ranjbar

    2016-10-01

    Full Text Available The depolarization response for a system of two orthogonal snakes at irrational tunes is studied in depth using lattice-independent spin integration. In particular, we consider the effect of overlapping spin resonances in this system, to understand the impact of the phase, tune, relative location and threshold strengths of the spin resonances. These results are benchmarked and compared to two-dimensional direct tracking results for the RHIC e-lens lattice and the standard lattice. Finally, we consider the effect of longitudinal motion via chromatic scans using direct six-dimensional lattice tracking.

  14. A PWR Thorium Pin Cell Burnup Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Kevan Dean; Zhao, X.; Pilat, E. E; Hejzlar, P.

    2000-05-01

    As part of work to evaluate the potential benefits of using thorium in LWR fuel, a thorium fueled benchmark comparison was made in this study between state-of-the-art codes, MOCUP (MCNP4B + ORIGEN2), and CASMO-4 for burnup calculations. The MOCUP runs were done individually at MIT and INEEL, using the same model but with some differences in techniques and cross section libraries. Eigenvalue and isotope concentrations were compared on a PWR pin cell model up to high burnup. The eigenvalue comparison as a function of burnup is good: the maximum difference is within 2% and the average absolute difference less than 1%. The isotope concentration comparisons are better than a set of MOX fuel benchmarks and comparable to a set of uranium fuel benchmarks reported in the literature. The actinide and fission product data sources used in the MOCUP burnup calculations for a typical thorium fuel are documented. Reasons for code vs code differences are analyzed and discussed.
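
    A sketch of the simple code-to-code comparison metrics quoted above (maximum and average absolute eigenvalue differences over a burnup history); the k-infinity arrays below are placeholders standing in for the MOCUP and CASMO-4 results.

      import numpy as np

      def eigenvalue_differences(k_code_a, k_code_b):
          # Percent differences of code A relative to code B at each burnup point.
          a, b = np.asarray(k_code_a, dtype=float), np.asarray(k_code_b, dtype=float)
          rel = 100.0 * (a - b) / b
          return {"max_abs_percent": float(np.max(np.abs(rel))),
                  "mean_abs_percent": float(np.mean(np.abs(rel)))}

      # Placeholder eigenvalue histories over burnup, not the benchmark's numbers:
      k_mocup  = [1.32, 1.21, 1.12, 1.05, 0.99]
      k_casmo4 = [1.31, 1.20, 1.12, 1.06, 1.00]
      print(eigenvalue_differences(k_mocup, k_casmo4))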

  15. The Conic Benchmark Format

    DEFF Research Database (Denmark)

    Friberg, Henrik A.

    This document constitutes the technical reference manual of the Conic Benchmark Format with file extension .cbf or .CBF. It unifies linear, second-order cone (also known as conic quadratic) and semidefinite optimization with mixed-integer variables. The format has been designed with benchmark libraries in mind, and therefore focuses on compact and easily parsable representations. The problem structure is separated from the problem data, and the format moreover facilitates benchmarking of hot-start capability through sequences of changes.

  16. Composite nuclear fuel assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dollard, W.J.; Ferrari, H.M.

    1982-04-27

    An open lattice elongated nuclear fuel assembly including small diameter fuel rods disposed in an array spaced a selected distance above an array of larger diameter fuel rods for use in a nuclear reactor having liquid coolant flowing in an upward direction. Plenums are preferably provided in the upper portion of the upper smaller diameter fuel rods and in the lower portion of the lower larger diameter fuel rods. Lattice grid structures provide lateral support for the fuel rods and preferably the lowest grid about the upper rods is directly and rigidly affixed to the highest grid about the lower rods.

  17. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...

  18. Thermal Performance Benchmarking (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Moreno, G.

    2014-11-01

    This project will benchmark the thermal characteristics of automotive power electronics and electric motor thermal management systems. Recent vehicle systems will be benchmarked to establish baseline metrics, evaluate advantages and disadvantages of different thermal management systems, and identify areas of improvement to advance the state-of-the-art.

  19. Handleiding benchmark VO

    NARCIS (Netherlands)

    Blank, j.l.t.

    2008-01-01

    Handleiding benchmark VO, 25 November 2008, by J.L.T. Blank, IPSE Studies. A guide to reading the i...

  20. Benchmark af erhvervsuddannelserne

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Wittrup, Jesper

    In this working paper we discuss how the Danish vocational schools can be benchmarked, and we present the results of a number of calculation models. Benchmarking the vocational schools is conceptually complicated: the schools offer a wide range of different programmes, which makes it difficult...

  1. Benchmarking af kommunernes sagsbehandling

    DEFF Research Database (Denmark)

    Amilon, Anna

    From 2007, Ankestyrelsen (the Danish National Social Appeals Board) is to carry out benchmarking of the quality of the municipalities' case processing. The purpose of the benchmarking is to develop the design of the practice investigations with a view to better follow-up and to improve the municipalities' case processing. This working paper discusses methods for benchmarking...

  2. How Activists Use Benchmarks

    DEFF Research Database (Denmark)

    Seabrooke, Leonard; Wigan, Duncan

    2015-01-01

    Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views...... are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested...... interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained....

  3. Superradiance Lattice

    CERN Document Server

    Wang, Da-Wei; Zhu, Shi-Yao; Scully, Marlan O

    2014-01-01

    We show that the timed Dicke states of a collection of three-level atoms can form a tight-binding lattice in the momentum space. This lattice, coined the superradiance lattice (SL), can be constructed based on an electromagnetically induced transparency (EIT) system. For a one-dimensional SL, we need the coupling field of the EIT system to be a standing wave. The detuning between the two components of the standing wave introduces an effective electric field. The quantum behaviours of electrons in lattices, such as Bloch oscillations, Wannier-Stark ladders, Bloch band collapsing and dynamic localization can be observed in the SL. The SL can be extended to two, three and even higher dimensions where no analogous real space lattices exist and new physics are waiting to be explored.

  4. Benchmark Evaluation of the NRAD Reactor LEU Core Startup Measurements

    Energy Technology Data Exchange (ETDEWEB)

    J. D. Bess; T. L. Maddock; M. A. Marshall

    2011-09-01

    The Neutron Radiography (NRAD) reactor is a 250-kW TRIGA-(Training, Research, Isotope Production, General Atomics)-conversion-type reactor at the Idaho National Laboratory; it is primarily used for neutron radiography analysis of irradiated and unirradiated fuels and materials. The NRAD reactor was converted from HEU to LEU fuel with 60 fuel elements and brought critical on March 31, 2010. This configuration of the NRAD reactor has been evaluated as an acceptable benchmark experiment and is available in the 2011 editions of the International Handbook of Evaluated Criticality Safety Benchmark Experiments (ICSBEP Handbook) and the International Handbook of Evaluated Reactor Physics Benchmark Experiments (IRPhEP Handbook). Significant effort went into precisely characterizing all aspects of the reactor core dimensions and material properties; detailed analyses of reactor parameters minimized experimental uncertainties. The largest contributors to the total benchmark uncertainty were the 234U, 236U, Er, and Hf content in the fuel; the manganese content in the stainless steel cladding; and the unknown level of water saturation in the graphite reflector blocks. A simplified benchmark model of the NRAD reactor was prepared with a keff of 1.0012 {+-} 0.0029 (1σ). Monte Carlo calculations with MCNP5 and KENO-VI and various neutron cross section libraries were performed and compared with the benchmark eigenvalue for the 60-fuel-element core configuration; all calculated eigenvalues are between 0.3 and 0.8% greater than the benchmark value. Benchmark evaluations of the NRAD reactor are beneficial in understanding biases and uncertainties affecting criticality safety analyses of storage, handling, or transportation applications with LEU-Er-Zr-H fuel.

  5. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  6. Thermal Hydraulic Performance of Tight Lattice Bundle

    Science.gov (United States)

    Yamamoto, Yasushi; Akiba, Miyuki; Morooka, Shinichi; Shirakawa, Kenetsu; Abe, Nobuaki

    Recently, the reduced moderation spectrum BWR has been studied. The fast neutron spectrum is obtained through triangular tight lattice fuel. However, there are few thermal hydraulic test data and thermal hydraulic correlation applicable to critical power prediction in such a tight lattice bundle. This study aims to enhance the database of the thermal hydraulic performance of the tight lattice bundle whose rod gap is about 1mm. Therefore, thermal hydraulic performance measurement tests of tight lattice bundles for the critical power, the pressure drop and the counter current flow limiting were performed. Moreover, the correlations to evaluate the thermal-hydraulic performance of the tight lattice bundle were developed.

  7. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E. Opresko, D.M. Suter, G.W.

    1993-01-01

    Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed where concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red
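
    The tier-1 screening logic described above reduces to comparing measured media concentrations against the benchmark values; a sketch with hypothetical chemicals and numbers (the actual benchmark tables are in the report and are not reproduced here):

      def screen_contaminants(measured_conc, benchmark_conc):
          # Retain a chemical as a contaminant of potential concern (COPC) when its measured
          # concentration exceeds the NOAEL-based benchmark; report the exceedance ratio.
          copcs = []
          for chem, conc in measured_conc.items():
              bench = benchmark_conc.get(chem)
              if bench is not None and conc > bench:
                  copcs.append((chem, conc / bench))
          return sorted(copcs, key=lambda item: item[1], reverse=True)

      # Hypothetical concentrations and benchmarks (same units) for illustration only:
      measured  = {"cadmium": 2.0, "zinc": 40.0, "selenium": 0.1}
      benchmark = {"cadmium": 0.5, "zinc": 100.0, "selenium": 0.3}
      print(screen_contaminants(measured, benchmark))   # [('cadmium', 4.0)]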

  8. Benchmarking expert system tools

    Science.gov (United States)

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.

  9. Financial Integrity Benchmarks

    Data.gov (United States)

    City of Jackson, Mississippi — This data compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measures the City's debt ratio and bond ratings....

  10. GeodeticBenchmark_GEOMON

    Data.gov (United States)

    Vermont Center for Geographic Information — The GeodeticBenchmark_GEOMON data layer consists of geodetic control monuments (points) that have a known position or spatial reference. The locations of these...

  11. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  12. Verification Calculation Results to Validate the Procedures and Codes for Pin-by-Pin Power Computation in VVER Type Reactors with MOX Fuel Loading

    Energy Technology Data Exchange (ETDEWEB)

    Chizhikova, Z.N.; Kalashnikov, A.G.; Kapranova, E.N.; Korobitsyn, V.E.; Manturov, G.N.; Tsiboulia, A.A.

    1998-12-01

    One of the important problems in ensuring the safety of a VVER type reactor partially loaded with MOX fuel is the choice of appropriate physical zoning to achieve the maximum flattening of the pin-by-pin power distribution. When uranium fuel is replaced by MOX fuel, provided that the reactivity of the fuel assemblies is kept constant, the fuel enrichment slightly decreases. However, the spectrum-averaged fission microscopic cross-section of {sup 239}Pu is approximately twice that of {sup 235}U. Therefore, power peaks occur in the peripheral fuel assemblies containing MOX fuel, and these peaks are aggravated by the interassembly water. Physical zoning has to be applied to flatten the power peaks in fuel assemblies containing MOX fuel. Moreover, physical zoning cannot be confined to one row of fuel elements, as is the case with a uniform lattice of uranium fuel assemblies. Both the water gap and the jump in neutron absorption macroscopic cross-sections at the interface of fuel assemblies with different fuels complicate the calculation of the space-energy neutron flux distribution, since they increase non-diffusion effects. To solve this problem it is necessary to update the current codes, to develop new codes, and to verify all the codes, including the nuclear-physical constants libraries employed. In doing so it is important to develop and validate codes of different levels--from design codes to benchmark ones. This paper presents the results of a burnup calculation for a multiassembly structure consisting of MOX fuel assemblies surrounded by uranium dioxide fuel assemblies. The structure concerned can be assumed to model a fuel assembly lattice symmetry element of a VVER-1000 type reactor in which 1/4 of all fuel assemblies contains MOX fuel.
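
    As a small illustration of how "flattening" of the pin-by-pin power distribution can be quantified, the sketch below computes a simple pin peaking factor (maximum pin power divided by mean pin power) for a hypothetical block of relative pin powers; the numbers are invented and do not come from the benchmark calculation.

      import numpy as np

      # Hypothetical relative pin powers for a small corner of a MOX assembly;
      # the peaking factor quantifies how far the distribution is from flat.
      pin_powers = np.array([
          [1.18, 1.09, 1.02],
          [1.09, 1.00, 0.95],
          [1.02, 0.95, 0.91],
      ])
      peaking_factor = pin_powers.max() / pin_powers.mean()
      print(f"pin peaking factor = {peaking_factor:.3f}")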

  13. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  14. Benchmarking in Foodservice Operations.

    Science.gov (United States)

    2007-11-02

    Benchmarking studies lasted from nine to twelve months, and could extend beyond that time for numerous reasons. Benchmarking was not simply data comparison, a fad, a means for reducing resources, a quick-fix program, or industrial tourism; it was a complete process.

  15. Benchmarking File System Benchmarking: It *IS* Rocket Science

    OpenAIRE

    Seltzer, Margo I.; Tarasov, Vasily; Bhanage, Saumitra; Zadok, Erez

    2011-01-01

    The quality of file system benchmarking has not improved in over a decade of intense research spanning hundreds of publications. Researchers repeatedly use a wide range of poorly designed benchmarks, and in most cases, develop their own ad-hoc benchmarks. Our community lacks a definition of what we want to benchmark in a file system. We propose several dimensions of file system benchmarking and review the wide range of tools and techniques in widespread use. We experimentally show that even t...

  16. The KMAT: Benchmarking Knowledge Management.

    Science.gov (United States)

    de Jager, Martha

    Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…

  17. Development of ORIGEN libraries for mixed oxide (MOX) fuel assembly designs

    Energy Technology Data Exchange (ETDEWEB)

    Mertyurek, Ugur, E-mail: mertyureku@ornl.gov; Gauld, Ian C., E-mail: gauldi@ornl.gov

    2016-02-15

    Highlights: • ORIGEN MOX library generation process is described. • SCALE burnup calculations are validated against measured MOX fuel samples from the MALIBU program. • ORIGEN MOX libraries are verified using the OECD Phase IV-B benchmark. • There is good agreement for calculated-to-measured isotopic distributions. - Abstract: ORIGEN cross section libraries for reactor-grade mixed oxide (MOX) fuel assembly designs have been developed to provide fast and accurate depletion calculations to predict nuclide inventories, radiation sources and thermal decay heat information needed in safety evaluations and safeguards verification measurements of spent nuclear fuel. These ORIGEN libraries are generated using two-dimensional lattice physics assembly models that include enrichment zoning and cross section data based on ENDF/B-VII.0 evaluations. Using the SCALE depletion sequence, burnup-dependent cross sections are created for selected commercial reactor assembly designs and a representative range of reactor operating conditions, fuel enrichments, and fuel burnup. The burnup dependent cross sections are then interpolated to provide problem-dependent cross sections for ORIGEN, avoiding the need for time-consuming lattice physics calculations. The ORIGEN libraries for MOX assembly designs are validated against destructive radiochemical assay measurements of MOX fuel from the MALIBU international experimental program. This program included measurements of MOX fuel from a 15 × 15 pressurized water reactor assembly and a 9 × 9 boiling water reactor assembly. The ORIGEN MOX libraries are also compared against detailed assembly calculations from the Phase IV-B numerical MOX fuel burnup credit benchmark coordinated by the Nuclear Energy Agency within the Organization for Economic Cooperation and Development. The nuclide compositions calculated by ORIGEN using the MOX libraries are shown to be in good agreement with other physics codes and with experimental data.
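
    A minimal sketch of the interpolation idea described above, assuming nothing about the actual ORIGEN library format: burnup-dependent one-group cross sections tabulated by upstream lattice calculations are interpolated to a problem-dependent statepoint with SciPy. All grid points and cross-section values below are invented.

      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      # Hypothetical one-group cross sections tabulated on a (burnup, moderator
      # density) grid by upstream lattice calculations; values are illustrative.
      burnup = np.array([0.0, 10.0, 20.0, 40.0, 60.0])          # GWd/MTU
      mod_density = np.array([0.40, 0.55, 0.70])                 # g/cm^3
      sigma = np.array([                                         # barns
          [1.10, 1.15, 1.20],
          [1.05, 1.10, 1.16],
          [1.01, 1.06, 1.12],
          [0.95, 1.00, 1.06],
          [0.91, 0.96, 1.02],
      ])

      interp = RegularGridInterpolator((burnup, mod_density), sigma)

      # Problem-dependent cross section for a statepoint between grid nodes,
      # avoiding a new lattice-physics calculation at that exact condition.
      print(interp([[27.5, 0.62]]))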

  18. ORIGEN-based Nuclear Fuel Inventory Module for Fuel Cycle Assessment: Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Skutnik, Steven E. [Univ. of Tennessee, Knoxville, TN (United States). Dept. of Nuclear Engineering

    2017-06-19

    The goal of this project, "ORIGEN-based Nuclear Fuel Depletion Module for Fuel Cycle Assessment", is to create a physics-based reactor depletion and decay module for the Cyclus nuclear fuel cycle simulator in order to assess nuclear fuel inventories over a broad space of reactor operating conditions. The overall goal of this approach is to facilitate evaluations of nuclear fuel inventories for a broad space of scenarios, including extended used nuclear fuel storage and cascading impacts on fuel cycle options such as actinide recovery in used nuclear fuel, particularly for multiple recycle scenarios. The advantage of a physics-based approach (compared to the recipe-based approach typically employed by fuel cycle simulators) is its inherent flexibility; such an approach can more readily accommodate the broad space of potential isotopic vectors that may be encountered under advanced fuel cycle options. In order to develop this flexible reactor analysis capability, we are leveraging the Origen nuclear fuel depletion and decay module from SCALE to produce a standalone "depletion engine" which will serve as the kernel of a Cyclus-based reactor analysis module. The ORIGEN depletion module is a rigorously benchmarked and extensively validated tool for nuclear fuel analysis, and thus its incorporation into the Cyclus framework can bring these capabilities to bear on the problem of evaluating long-term impacts of fuel cycle option choices on relevant metrics of interest, including materials inventories and availability (for multiple recycle scenarios), long-term waste management and repository impacts, etc. Developing this Origen-based analysis capability for Cyclus requires the refinement of the Origen analysis sequence to the point where it can reasonably be compiled as a standalone sequence outside of SCALE; i.e., wherein all of the computational aspects of Origen (including reactor cross-section library processing and interpolation, input and output
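
    The depletion and decay problem that such an engine solves can be written as dN/dt = M N, whose formal solution is N(t) = exp(M t) N(0). The sketch below applies a matrix exponential to a toy three-nuclide decay chain; it illustrates the general idea only and uses invented decay constants, not ORIGEN's actual solver, libraries, or data.

      import numpy as np
      from scipy.linalg import expm

      # Minimal sketch of matrix-exponential depletion for a 3-nuclide toy chain
      # A -> B -> C (decay only); rates are illustrative, not real nuclear data.
      lam_a, lam_b = 1.0e-5, 5.0e-6          # decay constants, 1/s

      # Depletion matrix M such that dN/dt = M N
      M = np.array([
          [-lam_a, 0.0,    0.0],
          [ lam_a, -lam_b, 0.0],
          [ 0.0,   lam_b,  0.0],
      ])

      N0 = np.array([1.0e20, 0.0, 0.0])      # initial number densities
      t = 30 * 24 * 3600.0                   # 30 days in seconds
      N = expm(M * t) @ N0                   # N(t) = exp(M t) N(0)
      print(N, "total conserved:", N.sum())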

  19. Sphere Lower Bound for Rotated Lattice Constellations in Fading Channels

    CERN Document Server

    Fabregas, Albert Guillen i

    2007-01-01

    We study the error probability performance of rotated lattice constellations in frequency-flat Nakagami-$m$ block-fading channels. In particular, we use the sphere lower bound on the underlying infinite lattice as a performance benchmark. We show that the sphere lower bound has full diversity. We observe that optimally rotated lattices with largest known minimum product distance perform very close to the lower bound, while the ensemble of random rotations is shown to lack diversity and perform far from it.
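
    As a rough, self-contained illustration of why rotations matter, the sketch below evaluates the smallest |d1*d2| over nonzero rotated integer vectors in a finite window, a crude finite-window proxy for the minimum product distance; the rotation angles are illustrative and the search window is deliberately small.

      import itertools
      import numpy as np

      def min_product_distance(theta, span=range(-3, 4)):
          """Smallest |d1*d2| over nonzero rotated integer vectors in a finite window."""
          R = np.array([[np.cos(theta), -np.sin(theta)],
                        [np.sin(theta),  np.cos(theta)]])
          best = np.inf
          for z in itertools.product(span, span):
              if z == (0, 0):
                  continue
              d = R @ np.array(z, dtype=float)
              best = min(best, abs(d[0] * d[1]))
          return best

      # An unrotated lattice has zero product distance (no modulation diversity);
      # suitably rotated lattices keep it bounded away from zero.
      for deg in (0.0, 10.0, 31.7175):
          print(f"theta = {deg:7.4f} deg   min |d1*d2| ~ {min_product_distance(np.radians(deg)):.4f}")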

  20. Benchmarking in Mobarakeh Steel Company

    OpenAIRE

    Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati

    2008-01-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...

  1. Benchmarking in Mobarakeh Steel Company

    Directory of Open Access Journals (Sweden)

    Sasan Ghasemi

    2008-05-01

    Full Text Available Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan’s Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.

  2. PNNL Information Technology Benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    DD Hostetler

    1999-09-08

    Benchmarking is a methodology for searching out industry best practices that lead to superior performance. It is exchanging information, not just with any organization, but with organizations known to be the best within PNNL, in industry, or in dissimilar industries with equivalent functions. It is used as a continuous improvement tool for business and technical processes, products, and services. Information technology--comprising all computer and electronic communication products and services--underpins the development and/or delivery of many PNNL products and services. This document describes the Pacific Northwest National Laboratory's (PNNL's) approach to information technology (IT) benchmarking. The purpose is to engage other organizations in the collaborative process of benchmarking in order to improve the value of IT services provided to customers. The document's intended audience consists of other US Department of Energy (DOE) national laboratories and their IT staff. Although the individual participants must define the scope of collaborative benchmarking, an outline of IT service areas for possible benchmarking is described.

  3. Lattice theory

    CERN Document Server

    Donnellan, Thomas; Maxwell, E A; Plumpton, C

    1968-01-01

    Lattice Theory presents an elementary account of a significant branch of contemporary mathematics concerning lattice theory. This book discusses the unusual features, which include the presentation and exploitation of partitions of a finite set. Organized into six chapters, this book begins with an overview of several topics, including sets in general, relations and operations, the relation of equivalence, and the relation of congruence. This text then defines the relation of partial order and then partially ordered sets, including chains. Other chapters examine the properti

  4. Benchmarking Pthreads performance

    Energy Technology Data Exchange (ETDEWEB)

    May, J M; de Supinski, B R

    1999-04-27

    The importance of the performance of threads libraries is growing as clusters of shared memory machines become more popular. POSIX threads, or Pthreads, is an industry-standard threads library. We have implemented the first Pthreads benchmark suite. In addition to measuring basic thread functions, such as thread creation, we apply the LogP model to standard Pthreads communication mechanisms. We present the results of our tests for several hardware platforms. These results demonstrate that the performance of existing Pthreads implementations varies widely; parts of nearly all of these implementations could be further optimized. Since hardware differences do not fully explain these performance variations, optimizations could improve the implementations. 2. Incorporating Threads Benchmarks into SKaMPI: SKaMPI is an MPI benchmark suite that provides a general framework for performance analysis [7]. SKaMPI does not exhaustively test the MPI standard. Instead, it

  5. Deviating From the Benchmarks

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela

    survival? The analysis is based on a matched employer-employee dataset and covers about 17,500 startups in manufacturing and services. We adopt a new procedure to estimate individual benchmarks for the quantity and quality of initial human resources, acknowledging correlations between hiring decisions...... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be a close to irreversible...

  6. Benchmarking for Best Practice

    CERN Document Server

    Zairi, Mohamed

    1998-01-01

    Benchmarking for Best Practice uses up-to-the-minute case-studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. It is also an ideal textbook on the applications of TQM since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area l

  7. HPCS HPCchallenge Benchmark Suite

    Science.gov (United States)

    2007-11-02

    Measured HPCchallenge Benchmark performance on various HPC architectures — from Cray X1s to Beowulf clusters — is reported in the presentation and paper, using the updated results at http://icl.cs.utk.edu/hpcc/hpcc_results.cgi. Even a small percentage of random

  8. Benchmarking Danish Industries

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette

    2003-01-01

    compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless...

  9. Benchmarks: WICHE Region 2012

    Science.gov (United States)

    Western Interstate Commission for Higher Education, 2013

    2013-01-01

    Benchmarks: WICHE Region 2012 presents information on the West's progress in improving access to, success in, and financing of higher education. The information is updated annually to monitor change over time and encourage its use as a tool for informed discussion in policy and education communities. To establish a general context for the…

  10. Surveys and Benchmarks

    Science.gov (United States)

    Bers, Trudy

    2012-01-01

    Surveys and benchmarks continue to grow in importance for community colleges in response to several factors. One is the press for accountability, that is, for colleges to report the outcomes of their programs and services to demonstrate their quality and prudent use of resources, primarily to external constituents and governing boards at the state…

  11. Benchmarking and Performance Management

    Directory of Open Access Journals (Sweden)

    Adrian TANTAU

    2010-12-01

    Full Text Available The relevance of the chosen topic is explained by the meaning of the firm efficiency concept: firm efficiency means the revealed performance (how well the firm performs in the actual market environment) given the basic characteristics of the firms and their markets that are expected to drive their profitability (firm size, market power, etc.). This complex and relative performance could be due to such things as product innovation, management quality, or work organization; other factors can be a cause even if they are not directly observed by the researcher. The critical need for managers to continuously improve their firm’s efficiency and effectiveness, and to know the success factors and competitiveness determinants, determines which performance measures are most critical to their firm’s overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking firm-level performance are critical interdependent activities. Firm-level variables, used to infer performance, are often interdependent due to operational reasons. Hence, managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other types of profitability measures. It uses econometric models to describe performance and then proposes a method to forecast and benchmark performance.

  12. Fast burner reactor benchmark results from the NEA working party on physics of plutonium recycle

    Energy Technology Data Exchange (ETDEWEB)

    Hill, R.N.; Wade, D.C. [Argonne National Lab., IL (United States); Palmiotti, G. [CEA - Cadarache, Saint-Paul-Les-Durance (France)

    1995-12-01

    As part of a program proposed by the OECD/NEA Working Party on Physics of Plutonium Recycling (WPPR) to evaluate different scenarios for the use of plutonium, fast reactor physics benchmarks were developed; fuel cycle scenarios using either PUREX/TRUEX (oxide fuel) or pyrometallurgical (metal fuel) separation technologies were specified. These benchmarks were designed to evaluate the nuclear performance and radiotoxicity impact of a transuranic-burning fast reactor system. International benchmark results are summarized in this paper, and key conclusions are highlighted.

  13. Neutronics Benchmarks for the Utilization of Mixed-Oxide Fuel: Joint US/Russian Progress Report for Fiscal Year 1997, Volume 4, part 4-ESADA Plutonium Program Critical Experiments: Single-Region Core Configurations

    Energy Technology Data Exchange (ETDEWEB)

    Akkurt, H.; Abdurrahman, N.M.

    1999-05-01

    The purpose of this study is to simulate and assess the findings from selected ESADA experiments. It is presented in the format prescribed by the Nuclear Energy Agency Nuclear Science Committee for material to be included in the International Handbook of Evaluated Criticality Safety Benchmark Experiments.

  14. Different Activation Techniques for the Study of Epithermal Spectra, Applied to Heavy Water Lattices of Varying Fuel-To-Moderator Ratio

    Energy Technology Data Exchange (ETDEWEB)

    Sokolowski, E.K.

    1966-06-15

    Spectral indices at the cell boundary have been studied as functions of lattice pitch in the reference core of the Swedish R0 reactor. Epithermal indices were determined by activation of In{sup 115}, employing three different techniques: the two-foil, the cadmium ratio and the sandwich foil methods. The latter of these has the advantage of being independent of assumptions about foil cross sections or spectral functions, and it gives a spectrum index that lends itself readily to comparisons with theoretical multigroup calculations. Alternatively the results can be expressed in terms of the Westcott parameters r and T{sub n} when this is justified by the spectral conditions. The agreement between the three methods investigated is generally good. Good agreement is also found with multigroup collision.
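
    A minimal sketch of the cadmium-ratio idea, under a simplified two-component (thermal plus epithermal) model that is not the report's full treatment: the ratio of bare to cadmium-covered foil activities yields an epithermal-to-thermal flux ratio. The activities below are invented and the In-115 cross-section values are approximate assumptions.

      import numpy as np

      # Simplified two-component (thermal + epithermal) treatment of bare and
      # cadmium-covered foil activations; all numbers below are illustrative.
      A_bare = 1.00e4      # saturated activity of bare In-115 foil (arbitrary units)
      A_cd   = 1.25e3      # saturated activity of Cd-covered foil
      sigma_th = 160.0     # thermal (2200 m/s) activation cross section, barns (assumed)
      I_res    = 2640.0    # resonance integral above the Cd cut-off, barns (assumed)

      R_cd = A_bare / A_cd                         # cadmium ratio
      # In this simple model: A_bare ~ phi_th*sigma_th + phi_epi*I_res,
      #                       A_cd   ~ phi_epi*I_res
      flux_ratio = sigma_th / (I_res * (R_cd - 1.0))   # phi_epi / phi_th
      print(f"cadmium ratio R_Cd = {R_cd:.2f}")
      print(f"epithermal-to-thermal flux ratio ~ {flux_ratio:.4f}")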

  15. Benchmarking i den offentlige sektor

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels

    2008-01-01

    In this article, we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then describe in more detail what benchmarking is, on the basis of four different applications of benchmarking. The regulation of utility companies will be addressed, after which...

  16. IAEA GT-MHR Benchmark Calculations Using the HELIOS/MASTER Two-Step Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Hoon; Kim, Kang Seog; Cho, Jin Young; Song, Jae Seung; Noh, Jae Man; Lee, Chung Chan; Zee, Sung Quun

    2007-05-15

    A new two-step procedure based on the HELIOS/MASTER code system has been developed for the prismatic VHTR physics analysis. This procedure employs the HELIOS code for the transport lattice calculation to generate few-group constants, and the MASTER code for the 3-dimensional core calculation to perform the reactor physics analysis. The double heterogeneity effect due to the random distribution of the particulate fuel could be dealt with by the recently developed reactivity-equivalent physical transformation (RPT) method. The strong spectral effects of the graphite-moderated reactor core could be handled both by optimizing the number of energy groups and group boundaries, and by employing a partial core model instead of a single block one to generate few-group cross sections. Burnable poisons in the inner reflector and the asymmetrically located large control rod can be treated by adopting the equivalence theory applied to multi-block models to generate surface-dependent discontinuity factors. Effective reflector cross sections were generated by using a simple mini-core model and an equivalence theory. In this study the IAEA GT-MHR benchmark problems with plutonium fuel were analyzed by using the HELIOS/MASTER code package and the Monte Carlo code MCNP. Benchmark problems include pin, block and core models. The computational results of the HELIOS/MASTER code system were compared with those of MCNP and other participants. The results show that the 2-step procedure using HELIOS/MASTER can be applied to the reactor physics analysis of the prismatic VHTR with good accuracy.
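
    The first step of such a two-step procedure condenses multigroup lattice results into few-group constants by flux weighting. The sketch below shows that condensation formula, Sigma_G = sum_g Sigma_g*phi_g / sum_g phi_g, on an invented four-group data set; it is not the HELIOS/MASTER implementation.

      import numpy as np

      # Hypothetical 4-group data from a lattice (transport) calculation,
      # collapsed to 2 broad groups with flux weighting for the core solver.
      flux      = np.array([0.8, 1.2, 2.5, 3.0])           # group fluxes (arbitrary units)
      sigma_abs = np.array([0.004, 0.010, 0.060, 0.120])   # group absorption XS (1/cm)
      broad_of_fine = np.array([0, 0, 1, 1])                # fine-group -> broad-group map

      sigma_broad = np.zeros(2)
      for g_broad in range(2):
          mask = broad_of_fine == g_broad
          # flux-weighted condensation: Sigma_G = sum_g Sigma_g*phi_g / sum_g phi_g
          sigma_broad[g_broad] = np.sum(sigma_abs[mask] * flux[mask]) / np.sum(flux[mask])

      print("2-group absorption cross sections:", sigma_broad)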

  17. Radiography benchmark 2014

    Energy Technology Data Exchange (ETDEWEB)

    Jaenisch, G.-R., E-mail: Gerd-Ruediger.Jaenisch@bam.de; Deresch, A., E-mail: Gerd-Ruediger.Jaenisch@bam.de; Bellon, C., E-mail: Gerd-Ruediger.Jaenisch@bam.de [Federal Institute for Materials Research and Testing, Unter den Eichen 87, 12205 Berlin (Germany); Schumm, A.; Lucet-Sanchez, F.; Guerin, P. [EDF R and D, 1 avenue du Général de Gaulle, 92141 Clamart (France)

    2015-03-31

    The purpose of the 2014 WFNDEC RT benchmark study was to compare predictions of various models of radiographic techniques, in particular those that predict the contribution of scattered radiation. All calculations were carried out for homogeneous materials and a mono-energetic X-ray point source in the energy range between 100 keV and 10 MeV. The calculations were to include the best physics approach available considering electron binding effects. Secondary effects like X-ray fluorescence and bremsstrahlung production were to be taken into account if possible. The problem to be considered had two parts. Part I examined the spectrum and the spatial distribution of radiation behind a single iron plate. Part II considered two equally sized plates, made of iron and aluminum respectively, only evaluating the spatial distribution. Here we present the results of the above benchmark study, comparing them to MCNP as the assumed reference model. The possible origins of the observed deviations are discussed.
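
    For orientation only, the sketch below estimates the primary (narrow-beam) transmission behind an iron plate with the Beer-Lambert law and applies a crude build-up factor to stand in for scatter; the attenuation coefficient and build-up factor are assumed, illustrative values, not inputs of the benchmark.

      import numpy as np

      # Narrow-beam (primary-only) transmission through an iron plate, plus a
      # crude build-up factor standing in for scatter; mu and B are illustrative.
      mu_iron = 1.1                            # linear attenuation coefficient (1/cm), assumed
      thickness = np.array([1.0, 2.0, 5.0])    # plate thickness, cm
      buildup = 1.8                            # assumed scatter build-up factor

      primary = np.exp(-mu_iron * thickness)   # I/I0, primary only
      total   = buildup * primary              # primary + scatter estimate
      for t, p, tot in zip(thickness, primary, total):
          print(f"t = {t:3.1f} cm   primary I/I0 = {p:.3e}   with build-up = {tot:.3e}")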

  18. Benchmarking of LSTM Networks

    OpenAIRE

    Breuel, Thomas M.

    2015-01-01

    LSTM (Long Short-Term Memory) recurrent neural networks have been highly successful in a number of application areas. This technical report describes the use of the MNIST and UW3 databases for benchmarking LSTM networks and explores the effect of different architectural and hyperparameter choices on performance. Significant findings include: (1) LSTM performance depends smoothly on learning rates, (2) batching and momentum has no significant effect on performance, (3) softmax training outperf...
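
    A stand-in for this kind of benchmark, assuming PyTorch is available: a single-layer LSTM classifier trained for a few steps on random tensors shaped like MNIST rows-as-sequences. It only shows where the swept quantities (for example the learning rate) enter; the report's actual datasets, architectures, and training setup are not reproduced here.

      import torch
      import torch.nn as nn

      # Stand-in benchmark: an LSTM classifier trained on random "MNIST-shaped"
      # sequences (28 rows of 28 pixels); real benchmarking would use the MNIST data.
      class SeqClassifier(nn.Module):
          def __init__(self, input_size=28, hidden_size=128, num_classes=10):
              super().__init__()
              self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
              self.fc = nn.Linear(hidden_size, num_classes)

          def forward(self, x):                  # x: (batch, seq_len, input_size)
              out, _ = self.lstm(x)
              return self.fc(out[:, -1, :])      # classify from the last time step

      model = SeqClassifier()
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)   # learning rate to sweep
      loss_fn = nn.CrossEntropyLoss()

      x = torch.randn(64, 28, 28)                # one synthetic batch
      y = torch.randint(0, 10, (64,))
      for step in range(5):
          opt.zero_grad()
          loss = loss_fn(model(x), y)
          loss.backward()
          opt.step()
          print(step, float(loss))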

  19. Lattice QCD on fine lattices

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, Stefan [DESY (Germany). Neumann Inst. for Computing

    2016-11-01

    These configurations are currently in use in many on-going projects carried out by researchers throughout Europe. In particular, this data will serve as an essential input into the computation of the coupling constant of QCD, where some of the simulations are still on-going. Projects computing the masses of hadrons and investigating their structure are also underway, as well as activities in the physics of heavy quarks. As this initial project of gauge field generation has been successful, it is worthwhile to extend the currently available ensembles with further points in parameter space. These will allow further study and control of systematic effects like the ones introduced by the finite volume, the non-physical quark masses and the finite lattice spacing. In particular, certain compromises have still been made in the region where pion masses and lattice spacing are both small. This is because physical pion masses require larger lattices to keep the effects of the finite volume under control. At light pion masses, a precise control of the continuum extrapolation is therefore difficult, but certainly a main goal of future simulations. To reach this goal, algorithmic developments as well as faster hardware will be needed.

  20. Sensitivity Analysis of OECD Benchmark Tests in BISON

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gamble, Kyle [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schmidt, Rodney C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Williamson, Richard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on sensitivity analysis of a fuels performance benchmark problem. The benchmark problem was defined by the Uncertainty Analysis in Modeling working group of the Nuclear Science Committee, part of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD). The benchmark problem involved steady-state behavior of a fuel pin in a Pressurized Water Reactor (PWR). The problem was created in the BISON Fuels Performance code. Dakota was used to generate and analyze 300 samples of 17 input parameters defining core boundary conditions, manufacturing tolerances, and fuel properties. There were 24 responses of interest, including fuel centerline temperatures at a variety of locations and burnup levels, fission gas released, axial elongation of the fuel pin, etc. Pearson and Spearman correlation coefficients and Sobol' variance-based indices were used to perform the sensitivity analysis. This report summarizes the process and presents results from this study.
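
    A toy version of the correlation-based part of such a study, using an invented algebraic response in place of BISON and three made-up inputs in place of the 17 actual parameters:

      import numpy as np
      from scipy.stats import pearsonr, spearmanr

      rng = np.random.default_rng(42)

      # Toy stand-in for the sampling study: draw a few "inputs" and push them
      # through a made-up response; the real problem used 17 inputs and BISON.
      n = 300
      power        = rng.uniform(15.0, 25.0, n)    # linear heat rate, kW/m (hypothetical)
      gap_size     = rng.uniform(60.0, 100.0, n)   # fuel-clad gap, microns (hypothetical)
      conductivity = rng.uniform(2.5, 3.5, n)      # fuel conductivity, W/m-K (hypothetical)

      # Made-up response standing in for fuel centerline temperature
      T_center = 500 + 40*power + 1.5*gap_size - 80*conductivity + rng.normal(0, 20, n)

      for name, x in [("power", power), ("gap_size", gap_size), ("conductivity", conductivity)]:
          r_p, _ = pearsonr(x, T_center)
          r_s, _ = spearmanr(x, T_center)
          print(f"{name:12s}  Pearson = {r_p:+.2f}   Spearman = {r_s:+.2f}")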

  1. RECENT ADDITIONS OF CRITICALITY SAFETY RELATED INTEGRAL BENCHMARK DATA TO THE ICSBEP AND IRPHEP HANDBOOKS

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Lori Scott; Yolanda Rugama; Enrico Sartori

    2009-09-01

    High-quality integral benchmark experiments have always been a priority for criticality safety. However, interest in integral benchmark data is increasing as efforts to quantify and reduce calculational uncertainties accelerate to meet the demands of future criticality safety needs to support next generation reactor and advanced fuel cycle concepts. The importance of drawing upon existing benchmark data is becoming more apparent because of dwindling availability of critical facilities worldwide and the high cost of performing new experiments. Integral benchmark data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the International Handbook of Reactor Physics Benchmark Experiments are widely used. Benchmark data have been added to these two handbooks since the last Nuclear Criticality Safety Division Topical Meeting in Knoxville, Tennessee (September 2005). This paper highlights these additions.

  2. Partially-reflected water-moderated square-pitched U(6.90)O2 fuel rod lattices with 0.67 fuel to water volume ratio (0.800 cm Pitch)

    Energy Technology Data Exchange (ETDEWEB)

    Harms, Gary A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The US Department of Energy (DOE) Nuclear Energy Research Initiative funded the design and construction of the Seven Percent Critical Experiment (7uPCX) at Sandia National Laboratories. The start-up of the experiment facility and the execution of the experiments described here were funded by the DOE Nuclear Criticality Safety Program. The 7uPCX is designed to investigate critical systems with fuel for light water reactors in the enrichment range above 5% 235U. The 7uPCX assembly is a water-moderated and -reflected array of aluminum-clad square-pitched U(6.90%)O2 fuel rods.

  3. Benchmark Data Through The International Reactor Physics Experiment Evaluation Project (IRPHEP)

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Dr. Enrico Sartori

    2005-09-01

    The International Reactor Physics Experiments Evaluation Project (IRPhEP) was initiated by the Organization for Economic Cooperation and Development (OECD) Nuclear Energy Agency’s (NEA) Nuclear Science Committee (NSC) in June of 2002. The IRPhEP focus is on the derivation of internationally peer reviewed benchmark models for several types of integral measurements, in addition to the critical configuration. While the benchmarks produced by the IRPhEP are of primary interest to the Reactor Physics Community, many of the benchmarks can be of significant value to the Criticality Safety and Nuclear Data Communities. Benchmarks that support the Next Generation Nuclear Plant (NGNP), for example, also support fuel manufacture, handling, transportation, and storage activities and could challenge current analytical methods. The IRPhEP is patterned after the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and is closely coordinated with the ICSBEP. This paper highlights the benchmarks that are currently being prepared by the IRPhEP that are also of interest to the Criticality Safety Community. The different types of measurements and associated benchmarks that can be expected in the first publication and beyond are described. The protocol for inclusion of IRPhEP benchmarks as ICSBEP benchmarks and for inclusion of ICSBEP benchmarks as IRPhEP benchmarks is detailed. The format for IRPhEP benchmark evaluations is described as an extension of the ICSBEP format. Benchmarks produced by the IRPhEP add new dimension to criticality safety benchmarking efforts and expand the collection of available integral benchmarks for nuclear data testing. The first publication of the "International Handbook of Evaluated Reactor Physics Benchmark Experiments" is scheduled for January of 2006.

  4. Benchmarking: applications to transfusion medicine.

    Science.gov (United States)

    Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M

    2012-10-01

    Benchmarking is a structured, continuous, collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking, focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institutional-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal.

  5. Impact of the thermal scattering law of H in H2O on the isothermal temperature reactivity coefficients for UOX and MOX fuel lattices in cold operating conditions

    Directory of Open Access Journals (Sweden)

    Scotta Juan Pablo

    2016-01-01

    Full Text Available The contribution of the thermal scattering law of hydrogen in light water to isothermal temperature reactivity coefficients for UOX and MOX lattices was studied in the frame of the MISTRAL critical experiments carried out in the zero power reactor EOLE of CEA Cadarache (France). The interpretation of the core residual reactivity measured between 6 °C and 80 °C (in steps of 5 °C) was performed with the Monte-Carlo code TRIPOLI4®. The nuclear data from the JEFF-3.1.1 library were used in the calculations. Three different thermal scattering laws of hydrogen in light water were tested in order to evaluate their impact on the MISTRAL calculations. The thermal scattering laws of interest were those recommended in JEFF-3.1.1 and ENDF/B-VII.1, and also the one recently produced at the atomic center of Bariloche (CAB, Argentina) with molecular dynamics simulations. The present work indicates that the calculation-to-experimental bias is −0.4 ± 0.3 pcm/°C in the UOX core and −1.0 ± 0.3 pcm/°C in the MOX cores when the JEFF-3.1.1 library is used. An improvement is observed over the whole temperature range with the CAB model. The calculation-to-experimental bias vanishes for the UOX core (−0.02 pcm/°C) and becomes close to −0.7 pcm/°C for the MOX cores. The magnitude of these biases has to be compared with the typical value of the temperature reactivity coefficient, which ranges from −5 pcm/°C at Beginning Of Cycle (BOC) to −50 pcm/°C at End Of Cycle (EOC) in PWR conditions.
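
    For illustration, an isothermal temperature coefficient and a calculation-to-experiment bias can be extracted as slopes of reactivity-versus-temperature data; the sketch below does this with invented reactivity values, not the MISTRAL measurements.

      import numpy as np

      # Illustrative extraction of an isothermal temperature coefficient (pcm/degC)
      # from reactivity given at several temperatures; all numbers are invented.
      T = np.arange(6.0, 81.0, 5.0)                       # core temperature, degC
      rng = np.random.default_rng(1)
      rho_meas = 120.0 - 8.0 * (T - 6.0) + rng.normal(0, 2, T.size)   # "measured" reactivity, pcm
      rho_calc = 118.0 - 8.5 * (T - 6.0)                              # "calculated" reactivity, pcm

      alpha_meas = np.polyfit(T, rho_meas, 1)[0]          # slope = temperature coefficient
      alpha_calc = np.polyfit(T, rho_calc, 1)[0]
      print(f"measured coefficient:   {alpha_meas:6.2f} pcm/degC")
      print(f"calculated coefficient: {alpha_calc:6.2f} pcm/degC")
      print(f"calculation-to-experiment bias: {alpha_calc - alpha_meas:+.2f} pcm/degC")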

  6. Benchmarking and Regulation

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators... The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques... In this paper, we review the modern foundations for frontier-based regulation and we discuss its actual use in several jurisdictions...
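
    A minimal sketch of the DEA building block mentioned above: an input-oriented CCR efficiency score for each decision-making unit, solved as a linear program with SciPy. The operators, inputs, and outputs are invented for illustration and carry none of the data-validation, model-specification, or outlier-detection steps the paper discusses.

      import numpy as np
      from scipy.optimize import linprog

      # Input-oriented CCR efficiency scores for a few hypothetical distribution
      # system operators (DMUs); inputs and outputs are invented for illustration.
      inputs  = np.array([[100.0,  80.0, 120.0,  90.0],     # opex
                          [ 50.0,  45.0,  70.0,  40.0]])    # network losses
      outputs = np.array([[1000.0, 900.0, 1100.0, 950.0],   # energy delivered
                          [ 200.0, 210.0,  180.0, 220.0]])  # customers served

      n_dmu = inputs.shape[1]
      for o in range(n_dmu):
          # variables: [theta, lambda_1, ..., lambda_n]; minimize theta
          c = np.zeros(n_dmu + 1)
          c[0] = 1.0
          A_ub, b_ub = [], []
          for i in range(inputs.shape[0]):        # sum_j lam_j*x_ij <= theta*x_io
              A_ub.append(np.concatenate(([-inputs[i, o]], inputs[i, :])))
              b_ub.append(0.0)
          for r in range(outputs.shape[0]):       # sum_j lam_j*y_rj >= y_ro
              A_ub.append(np.concatenate(([0.0], -outputs[r, :])))
              b_ub.append(-outputs[r, o])
          res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                        bounds=[(0, None)] * (n_dmu + 1), method="highs")
          print(f"DMU {o}: efficiency = {res.x[0]:.3f}")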

  7. 2001 benchmarking guide.

    Science.gov (United States)

    Hoppszallern, S

    2001-01-01

    Our fifth annual guide to benchmarking under managed care presents data that is a study in market dynamics and adaptation. New this year are financial indicators on HMOs exiting the market and those remaining. Hospital financial ratios and details on department performance are included. The physician group practice numbers show why physicians are scrutinizing capitated payments. Overall, hospitals in markets with high managed care penetration are more successful in managing labor costs and show productivity gains in imaging services, physical therapy and materials management.

  8. Benchmarking Query Execution Robustness

    Science.gov (United States)

    Wiener, Janet L.; Kuno, Harumi; Graefe, Goetz

    Benchmarks that focus on running queries on a well-tuned database system ignore a long-standing problem: adverse runtime conditions can cause database system performance to vary widely and unexpectedly. When the query execution engine does not exhibit resilience to these adverse conditions, addressing the resultant performance problems can contribute significantly to the total cost of ownership for a database system in over-provisioning, lost efficiency, and increased human administrative costs. For example, focused human effort may be needed to manually invoke workload management actions or fine-tune the optimization of specific queries.

  9. Dual Lattice of ℤ-module Lattice

    Directory of Open Access Journals (Sweden)

    Futa Yuichi

    2017-07-01

    Full Text Available In this article, we formalize in Mizar [5] the definition of a dual lattice and its properties. We formally prove that the set of all dual vectors in a rational lattice forms a lattice. We show that a dual basis can be calculated from the elements of the inverse of the Gram matrix. We also formalize a summation of inner products and its properties. Lattices of ℤ-modules are needed for lattice problems, the LLL (Lenstra, Lenstra and Lovász) basis reduction algorithm, and lattice-based cryptographic systems [20], [10] and [19].
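
    The Gram-matrix route to the dual basis is easy to check numerically: with the basis vectors as the rows of B and G = B B^T, the rows of D = G^-1 B satisfy <d_i, b_j> = delta_ij. A small sketch with an arbitrary 2-D integer basis is shown below.

      import numpy as np

      # Basis vectors of a small integer lattice, stored as the rows of B.
      B = np.array([[2.0, 1.0],
                    [1.0, 3.0]])

      G = B @ B.T                 # Gram matrix of the basis
      D = np.linalg.inv(G) @ B    # dual basis: rows d_i = sum_j (G^-1)_ij b_j

      # Verify the defining property <d_i, b_j> = delta_ij
      print(np.round(D @ B.T, 10))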

  10. Benchmarking concentrating photovoltaic systems

    Science.gov (United States)

    Duerr, Fabian; Muthirayan, Buvaneshwari; Meuret, Youri; Thienpont, Hugo

    2010-08-01

    Integral to photovoltaics is the need to provide improved economic viability. To achieve this goal, photovoltaic technology has to be able to harness more light at less cost. A large variety of concentrating photovoltaic concepts has provided cause for pursuit. To obtain a detailed profitability analysis, a flexible evaluation is crucial for benchmarking the cost-performance of this variety of concentrating photovoltaic concepts. To save time and capital, a way to estimate the cost-performance of a complete solar energy system is to use computer aided modeling. In this work a benchmark tool is introduced based on a modular programming concept. The overall implementation is done in MATLAB whereas Advanced Systems Analysis Program (ASAP) is used for ray tracing calculations. This allows for a flexible and extendable structuring of all important modules, namely an advanced source modeling including time and local dependence, and an advanced optical system analysis of various optical designs to obtain an evaluation of the figure of merit. An important figure of merit: the energy yield for a given photovoltaic system at a geographical position over a specific period, can be calculated.

  11. Entropy-based benchmarking methods

    OpenAIRE

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth preservation method of Causey and Trager (1981) may violate this principle, while its requirements are explicitly taken into account in the proposed entropy-based benchmarking methods. Our illustrati...
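
    A compact sketch of a Denton-type proportional first-difference adjustment, which keeps the quarter-to-quarter movement of the ratio between the series and its indicator as smooth as possible while hitting the annual benchmarks exactly; the quarterly indicator and annual totals are invented, and this is not the authors' entropy-based method.

      import numpy as np

      # Proportional first-difference (Denton-type) benchmarking sketch: adjust a
      # quarterly indicator so annual sums match the benchmarks while preserving
      # quarter-to-quarter movement; the series below are invented.
      indicator  = np.array([98., 100., 102., 101., 99., 103., 106., 104.])  # 8 quarters
      benchmarks = np.array([410., 424.])                                    # 2 annual totals

      T, Y = indicator.size, benchmarks.size
      C = np.kron(np.eye(Y), np.ones(4)) * indicator   # constraint C r = benchmarks, with r_t = x_t/i_t
      D = np.diff(np.eye(T), axis=0)                   # first-difference operator on r

      # Minimize ||D r||^2 subject to C r = b  (keeps x_t/i_t as smooth as possible)
      KKT = np.block([[2.0 * D.T @ D, C.T],
                      [C, np.zeros((Y, Y))]])
      rhs = np.concatenate([np.zeros(T), benchmarks])
      r = np.linalg.solve(KKT, rhs)[:T]
      x = indicator * r                                # benchmarked series

      print("annual sums:", x.reshape(Y, 4).sum(axis=1))   # equals the benchmarks
      print("benchmarked quarters:", np.round(x, 2))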

  12. HPC Benchmark Suite NMx Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  13. Benchmarking foreign electronics technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bostian, C.W.; Hodges, D.A.; Leachman, R.C.; Sheridan, T.B.; Tsang, W.T.; White, R.M.

    1994-12-01

    This report has been drafted in response to a request from the Japanese Technology Evaluation Center's (JTEC) Panel on Benchmarking Select Technologies. Since April 1991, the Competitive Semiconductor Manufacturing (CSM) Program at the University of California at Berkeley has been engaged in a detailed study of quality, productivity, and competitiveness in semiconductor manufacturing worldwide. The program is a joint activity of the College of Engineering, the Haas School of Business, and the Berkeley Roundtable on the International Economy, under sponsorship of the Alfred P. Sloan Foundation, and with the cooperation of semiconductor producers from Asia, Europe and the United States. Professors David A. Hodges and Robert C. Leachman are the project's Co-Directors. The present report for JTEC is primarily based on data and analysis drawn from that continuing program. The CSM program is being conducted by faculty, graduate students and research staff from UC Berkeley's Schools of Engineering and Business, and Department of Economics. Many of the participating firms are represented on the program's Industry Advisory Board. The Board played an important role in defining the research agenda. A pilot study was conducted in 1991 with the cooperation of three semiconductor plants. The research plan and survey documents were thereby refined. The main phase of the CSM benchmarking study began in mid-1992 and will continue at least through 1997. Reports are presented on the manufacture of integrated circuits; data storage; wireless technology; human-machine interfaces; and optoelectronics. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  14. Benchmarking monthly homogenization algorithms

    Directory of Open Access Journals (Sweden)

    V. K. C. Venema

    2011-08-01

    Full Text Available The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.

    Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve
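
    A toy version of the evaluation idea: insert a known break into a synthetic homogeneous series, "homogenize" it (here trivially, assuming the breakpoint is known), and score the result with the centered root mean square error against the truth. The series and break size are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic monthly temperature anomalies with one inserted break, mimicking
      # the benchmark idea: compare a candidate "homogenized" series to the truth.
      n = 240                                     # 20 years of monthly values
      truth = rng.normal(0.0, 0.5, n)             # homogeneous anomalies (degC)
      break_point, break_size = 120, 0.8
      inhomogeneous = truth.copy()
      inhomogeneous[break_point:] += break_size   # station move / sensor change

      # A trivial "homogenization": estimate and remove the mean shift after the break
      est_shift = inhomogeneous[break_point:].mean() - inhomogeneous[:break_point].mean()
      homogenized = inhomogeneous.copy()
      homogenized[break_point:] -= est_shift

      def centered_rmse(a, b):
          return np.sqrt(np.mean(((a - a.mean()) - (b - b.mean()))**2))

      print("CRMSE raw vs truth:        ", round(centered_rmse(inhomogeneous, truth), 3))
      print("CRMSE homogenized vs truth:", round(centered_rmse(homogenized, truth), 3))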

  15. Benchmark job – Watch out!

    CERN Multimedia

    Staff Association

    2017-01-01

    On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...

  16. Benchmark for Strategic Performance Improvement.

    Science.gov (United States)

    Gohlke, Annette

    1997-01-01

    Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)

  17. Internal Benchmarking for Institutional Effectiveness

    Science.gov (United States)

    Ronco, Sharron L.

    2012-01-01

    Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multicampuses or a…

  18. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth pre

  19. Applications of Integral Benchmark Data

    Energy Technology Data Exchange (ETDEWEB)

    Giuseppe Palmiotti; Teruhiko Kugo; Fitz Trumble; Albert C. (Skip) Kahler; Dale Lancaster

    2014-10-09

    The International Reactor Physics Experiment Evaluation Project (IRPhEP) and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) provide evaluated integral benchmark data that may be used for validation of reactor physics / nuclear criticality safety analytical methods and data, nuclear data testing, advanced modeling and simulation, and safety analysis licensing activities. The handbooks produced by these programs are used in over 30 countries. Five example applications are presented in this paper: (1) Use of IRPhEP Data in Uncertainty Analyses and Cross Section Adjustment, (2) Uncertainty Evaluation Methods for Reactor Core Design at JAEA Using Reactor Physics Experimental Data, (3) Application of Benchmarking Data to a Broad Range of Criticality Safety Problems, (4) Cross Section Data Testing with ICSBEP Benchmarks, and (5) Use of the International Handbook of Evaluated Reactor Physics Benchmark Experiments to Support the Power Industry.

  20. Benchmarking & European Sustainable Transport Policies

    DEFF Research Database (Denmark)

    Gudmundsson, H.

    2003-01-01

    Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts... to support Sustainable European Transport Policies. The key message is that transport benchmarking has not yet been developed to cope with the challenges of this task. Rather than backing down completely, the paper suggests some critical conditions for applying and adopting benchmarking for this purpose. One... way forward is to ensure a higher level of environmental integration in transport policy benchmarking. To this effect the paper will discuss the possible role of the so-called Transport and Environment Reporting Mechanism developed by the European Environment Agency. The paper provides an independent...

  1. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

    is generally not advised. Several other ways in which benchmarking and policy can support one another are identified in the analysis. This leads to a range of recommended initiatives to exploit the benefits of benchmarking in transport while avoiding some of the lurking pitfalls and dead ends......Order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable...... tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly ‘sustainable transport...

  2. Free µ-Lattices

    DEFF Research Database (Denmark)

    Santocanale, Luigi

    2002-01-01

    A μ-lattice is a lattice with the property that every unary polynomial has both a least and a greatest fix-point. In this paper we define the quasivariety of μ-lattices and, for a given partially ordered set P, we construct a μ-lattice JP whose elements are equivalence classes of games in a preor...

  3. ENDF/B-VII.1 Neutron Cross Section Data Testing with Critical Assembly Benchmarks and Reactor Experiments

    Science.gov (United States)

    Kahler, A. C.; MacFarlane, R. E.; Mosteller, R. D.; Kiedrowski, B. C.; Frankle, S. C.; Chadwick, M. B.; McKnight, R. D.; Lell, R. M.; Palmiotti, G.; Hiruta, H.; Herman, M.; Arcilla, R.; Mughabghab, S. F.; Sublet, J. C.; Trkov, A.; Trumbull, T. H.; Dunn, M.

    2011-12-01

    The ENDF/B-VII.1 library is the latest revision to the United States' Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 being released in 2006. This revision expands upon that library, including the addition of new evaluated files (was 393 neutron files previously, now 423 including replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extension or updating of many existing neutron data files. Complete details are provided in the companion paper [M. B. Chadwick et al., "ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data," Nuclear Data Sheets, 112, 2887 (2011)]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms such as metallic, oxide or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal reflected and water or other light element reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections such as unmoderated and uranium reflected 235U and 239Pu assemblies, HEU solution systems and LEU oxide lattice systems that mimic commercial PWR configurations continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten are greatly reduced. Improvements are also confirmed for selected
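
    A small sketch of how such suite-wide results are typically summarized: compute calculated-minus-experimental (C − E) eigenvalue differences in pcm and their statistics over a set of benchmarks; the k-eff values below are hypothetical, not results from this study.

      import numpy as np

      # Hypothetical calculated and benchmark (experimental) k-eff values for a
      # handful of critical assemblies; real suites contain hundreds of cases.
      k_calc = np.array([1.00082, 0.99913, 1.00210, 0.99978, 1.00055])
      k_exp  = np.array([1.00000, 1.00000, 1.00030, 1.00000, 1.00120])

      diff_pcm = (k_calc - k_exp) * 1.0e5       # C - E in pcm
      print(f"mean bias  = {diff_pcm.mean():+7.1f} pcm")
      print(f"std dev    = {diff_pcm.std(ddof=1):7.1f} pcm")
      print("C/E ratios =", np.round(k_calc / k_exp, 5))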

  4. ENDF/B-VII.1 Neutron Cross Section Data Testing with Critical Assembly Benchmarks and Reactor Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kahler, A.C.; Herman, M.; Kahler,A.C.; MacFarlane,R.E.; Mosteller,R.D.; Kiedrowski,B.C.; Frankle,S.C.; Chadwick,M.B.; McKnight,R.D.; Lell,R.M.; Palmiotti,G.; Hiruta,H.; Herman,M.; Arcilla,R.; Mughabghab,S.F.; Sublet,J.C.; Trkov,A.; Trumbull,T.H.; Dunn,M.

    2011-12-01

    The ENDF/B-VII.1 library is the latest revision to the United States Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 being released in 2006. This revision expands upon that library, including the addition of new evaluated files (was 393 neutron files previously, now 423 including replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extension or updating of many existing neutron data files. Complete details are provided in the companion paper [M. B. Chadwick et al., 'ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data,' Nuclear Data Sheets, 112, 2887 (2011)]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms such as metallic, oxide or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal reflected and water or other light element reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections such as unmoderated and uranium reflected {sup 235}U and {sup 239}Pu assemblies, HEU solution systems and LEU oxide lattice systems that mimic commercial PWR configurations continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten are greatly reduced. Improvements are also

  5. ENDF/B-VII.1 Neutron Cross Section Data Testing with Critical Assembly Benchmarks and Reactor Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kahler, A. [Los Alamos National Laboratory (LANL); Macfarlane, R E [Los Alamos National Laboratory (LANL); Mosteller, R D [Los Alamos National Laboratory (LANL); Kiedrowski, B C [Los Alamos National Laboratory (LANL); Frankle, S C [Los Alamos National Laboratory (LANL); Chadwick, M. B. [Los Alamos National Laboratory (LANL); Mcknight, R D [Argonne National Laboratory (ANL); Lell, R M [Argonne National Laboratory (ANL); Palmiotti, G [Idaho National Laboratory (INL); Hiruta, h [Idaho National Laboratory (INL); Herman, Micheal W [Brookhaven National Laboratory (BNL); Arcilla, r [Brookhaven National Laboratory (BNL); Mughabghab, S F [Brookhaven National Laboratory (BNL); Sublet, J C [Culham Science Center, Abington, UK; Trkov, A. [Jozef Stefan Institute, Slovenia; Trumbull, T H [Knolls Atomic Power Laboratory; Dunn, Michael E [ORNL

    2011-01-01

    The ENDF/B-VII.1 library is the latest revision to the United States' Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 being released in 2006. This revision expands upon that library, including the addition of new evaluated files (was 393 neutron files previously, now 423 including replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extension or updating of many existing neutron data files. Complete details are provided in the companion paper [1]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms such as metallic, oxide or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal reflected and water or other light element reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections such as unmoderated and uranium reflected (235)U and (239)Pu assemblies, HEU solution systems and LEU oxide lattice systems that mimic commercial PWR configurations continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten are greatly reduced. Improvements are also confirmed for selected actinide reaction rates such as (236)U, (238,242)Pu and (241,243)Am capture in fast systems. Other deficiencies, such as the overprediction of Pu solution system critical
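    The data testing described above is, at its core, a comparison of calculated eigenvalues against benchmark values across a large suite. A minimal sketch of how such a suite comparison is commonly summarized is given below; the benchmark names, k-eff values and uncertainties are placeholders, not results from this paper.

```python
# Minimal sketch (not the authors' code) of how a criticality data-testing suite is
# commonly summarized: calculated-to-expected (C/E) eigenvalue ratios and biases.
# The benchmark names, k-eff values and uncertainties below are placeholders.
import statistics

# (benchmark id, calculated k-eff, benchmark k-eff, benchmark 1-sigma uncertainty)
results = [
    ("HEU-MET-FAST-001",   0.99920, 1.00000, 0.00100),
    ("PU-MET-FAST-001",    1.00050, 1.00000, 0.00200),
    ("LEU-COMP-THERM-008", 0.99870, 1.00000, 0.00120),
]

for name, calc, expected, sigma in results:
    ce = calc / expected                     # C/E ratio
    bias_pcm = (calc - expected) * 1.0e5     # bias in pcm
    z = (calc - expected) / sigma            # bias in benchmark standard deviations
    print(f"{name:22s} C/E = {ce:.5f}  bias = {bias_pcm:+7.1f} pcm  ({z:+.2f} sigma)")

print("suite-average C/E =", round(statistics.mean(c / e for _, c, e, _ in results), 5))
```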

  6. Benchmarking of energy time series

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, M.A.

    1990-04-01

    Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.
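    As a minimal illustration of the benchmarking idea described above (and not of EIA's actual procedures), the sketch below applies the simplest possible adjustment: scaling a monthly series so that its annual total matches a more accurate annual benchmark. The first-differencing approach recommended for the coal series preserves period-to-period changes rather than ratios and is not shown here.

```python
# Illustration only (not EIA's procedure): pro-rata benchmarking of a monthly series
# so that its annual total matches a more accurate annual benchmark value.
monthly = [80, 75, 70, 60, 55, 50, 48, 52, 58, 65, 72, 78]   # hypothetical monthly data
annual_benchmark = 800.0                                      # hypothetical annual benchmark

factor = annual_benchmark / sum(monthly)
adjusted = [x * factor for x in monthly]

print(f"raw total      = {sum(monthly):.1f}")
print(f"adjusted total = {sum(adjusted):.1f}  (scale factor {factor:.4f})")
```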

  7. Benchmarking in academic pharmacy departments.

    Science.gov (United States)

    Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann

    2010-10-11

    Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.

  8. Correlational effect size benchmarks.

    Science.gov (United States)

    Bosco, Frank A; Aguinis, Herman; Singh, Kulraj; Field, James G; Pierce, Charles A

    2015-03-01

    Effect size information is essential for the scientific enterprise and plays an increasingly central role in the scientific process. We extracted 147,328 correlations and developed a hierarchical taxonomy of variables reported in Journal of Applied Psychology and Personnel Psychology from 1980 to 2010 to produce empirical effect size benchmarks at the omnibus level, for 20 common research domains, and for an even finer grained level of generality. Results indicate that the usual interpretation and classification of effect sizes as small, medium, and large bear almost no resemblance to findings in the field, because distributions of effect sizes exhibit tertile partitions at values approximately one-half to one-third those intuited by Cohen (1988). Our results offer information that can be used for research planning and design purposes, such as producing better informed non-nil hypotheses and estimating statistical power and planning sample size accordingly. We also offer information useful for understanding the relative importance of the effect sizes found in a particular study in relationship to others and which research domains have advanced more or less, given that larger effect sizes indicate a better understanding of a phenomenon. Also, our study offers information about research domains for which the investigation of moderating effects may be more fruitful and provide information that is likely to facilitate the implementation of Bayesian analysis. Finally, our study offers information that practitioners can use to evaluate the relative effectiveness of various types of interventions. PsycINFO Database Record (c) 2015 APA, all rights reserved.
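    The core computation behind such empirical benchmarks is simple: pool the observed absolute correlations and take tertile cut points. The sketch below illustrates this on invented data; the thresholds it prints are not the published benchmarks.

```python
# Sketch of the computation behind empirical effect-size benchmarks: take tertile cut
# points of the observed |r| distribution. The correlations here are synthetic, so the
# printed thresholds are for illustration only, not the published benchmarks.
import numpy as np

rng = np.random.default_rng(0)
abs_r = np.abs(rng.beta(1.5, 6.0, size=10_000))     # hypothetical distribution of |r|

small_cut, large_cut = np.quantile(abs_r, [1 / 3, 2 / 3])
print(f"empirical benchmarks: small < {small_cut:.2f} <= medium < {large_cut:.2f} <= large")
```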

  9. Benchmarking in water project analysis

    Science.gov (United States)

    Griffin, Ronald C.

    2008-11-01

    The with/without principle of cost-benefit analysis is examined for the possible bias that it brings to water resource planning. Theory and examples for this question are established. Because benchmarking against the demonstrably low without-project hurdle can detract from economic welfare and can fail to promote efficient policy, improvement opportunities are investigated. In lieu of the traditional, without-project benchmark, a second-best-based "difference-making benchmark" is proposed. The project authorizations and modified review processes instituted by the U.S. Water Resources Development Act of 2007 may provide for renewed interest in these findings.

  10. Fuel flexible fuel injector

    Science.gov (United States)

    Tuthill, Richard S; Davis, Dustin W; Dai, Zhongtao

    2015-02-03

    A disclosed fuel injector provides mixing of fuel with airflow by surrounding a swirled fuel flow with first and second swirled airflows that ensures mixing prior to or upon entering the combustion chamber. Fuel tubes produce a central fuel flow along with a central airflow through a plurality of openings to generate the high velocity fuel/air mixture along the axis of the fuel injector in addition to the swirled fuel/air mixture.

  11. C5 Benchmark Problem with Discrete Ordinate Radiation Transport Code DENOVO

    Energy Technology Data Exchange (ETDEWEB)

    Yesilyurt, Gokhan [ORNL; Clarno, Kevin T [ORNL; Evans, Thomas M [ORNL; Davidson, Gregory G [ORNL; Fox, Patricia B [ORNL

    2011-01-01

    The C5 benchmark problem proposed by the Organisation for Economic Co-operation and Development/Nuclear Energy Agency was modeled to examine the capabilities of Denovo, a three-dimensional (3-D) parallel discrete ordinates (S{sub N}) radiation transport code, for problems with no spatial homogenization. Denovo uses state-of-the-art numerical methods to obtain accurate solutions to the Boltzmann transport equation. Problems were run in parallel on Jaguar, a high-performance supercomputer located at Oak Ridge National Laboratory. Both the two-dimensional (2-D) and 3-D configurations were analyzed, and the results were compared with the reference MCNP Monte Carlo calculations. For an additional comparison, SCALE/KENO-V.a Monte Carlo solutions were also included. In addition, a sensitivity analysis was performed for the optimal angular quadrature and mesh resolution for both the 2-D and 3-D infinite lattices of UO{sub 2} fuel pin cells. Denovo was verified with the C5 problem. The effective multiplication factors, pin powers, and assembly powers were found to be in good agreement with the reference MCNP and SCALE/KENO-V.a Monte Carlo calculations.

  12. Precise determination of lattice phase shifts and mixing angles

    Science.gov (United States)

    Lu, Bing-Nan; Lähde, Timo A.; Lee, Dean; Meißner, Ulf-G.

    2016-09-01

    We introduce a general and accurate method for determining lattice phase shifts and mixing angles, which is applicable to arbitrary, non-cubic lattices. Our method combines angular momentum projection, spherical wall boundaries and an adjustable auxiliary potential. This allows us to construct radial lattice wave functions and to determine phase shifts at arbitrary energies. For coupled partial waves, we use a complex-valued auxiliary potential that breaks time-reversal invariance. We benchmark our method using a system of two spin-1/2 particles interacting through a finite-range potential with a strong tensor component. We are able to extract phase shifts and mixing angles for all angular momenta and energies, with precision greater than that of extant methods. We discuss a wide range of applications from nuclear lattice simulations to optical lattice experiments.

  13. Water Level Superseded Benchmark Sheets

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Images of National Coast & Geodetic Survey (now NOAA's National Geodetic Survey/NGS) tidal benchmarks which have been superseded by new markers or locations....

  14. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity. © IWA Publishing 2013.

  15. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

    In order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable...... tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all, it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly ‘sustainable transport......’ evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly, policies are not directly comparable across space and context. For these reasons, attempting to benchmark ‘sustainable transport policies’ against one another would be a highly complex task, which...

  16. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

    Research on relative performance measures, transfer pricing, beyond budgeting initiatives, target costing, piece rate systems and value-based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking...... the conditions upon which the market mechanism is performing within organizations. This paper aims to contribute to research by providing more insight into the conditions for the use of external benchmarking as an element in performance management in organizations. Our study explores a particular type of external...... towards the conditions for the use of the external benchmarks we provide more insights into some of the issues and challenges that are related to using this mechanism for performance management and advance competitiveness in organizations....

  17. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

    as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advances. However, whereas extant research primarily has focused on the importance and effects of using external benchmarks, less attention has been directed towards...... towards the conditions for the use of the external benchmarks we provide more insights into some of the issues and challenges that are related to using this mechanism for performance management and advance competitiveness in organizations....

  18. Ghera: A Repository of Android App Vulnerability Benchmarks

    OpenAIRE

    Mitra, Joydeep; Ranganath, Venkatesh-Prasad

    2017-01-01

    Security of mobile apps affects the security of their users. This has fueled the development of techniques to automatically detect vulnerabilities in mobile apps and help developers secure their apps; specifically, in the context of Android platform due to openness and ubiquitousness of the platform. Despite a slew of research efforts in this space, there is no comprehensive repository of up-to-date and lean benchmarks that contain most of the known Android app vulnerabilities and, consequent...

  19. Research on computer systems benchmarking

    Science.gov (United States)

    Smith, Alan Jay (Principal Investigator)

    1996-01-01

    This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance. The performance impact of optimization in the context of our methodology for CPU performance characterization was based on the abstract machine model. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the afore-mentioned accomplishments are more specifically summarized in this report, as well as those smaller in magnitude supported by this grant.
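    A minimal sketch of the abstract-machine idea summarized above: characterize a machine by per-operation times, a program by operation counts, and predict run time from their combination. The operation names and numbers below are hypothetical, not the Fortran abstract machine parameters from the cited work.

```python
# Sketch of the abstract-machine idea (hypothetical numbers, not the cited machine
# characterizer): a machine is described by per-operation times, a program by dynamic
# operation counts, and the predicted run time is their dot product.
machine_times_us = {   # per-operation times in microseconds (invented)
    "flop_add": 0.02, "flop_mul": 0.03, "mem_load": 0.05, "branch": 0.01,
}
program_counts = {     # dynamic operation counts for one benchmark program (invented)
    "flop_add": 4.0e8, "flop_mul": 3.5e8, "mem_load": 6.0e8, "branch": 1.0e8,
}

predicted_s = sum(program_counts[op] * machine_times_us[op] for op in program_counts) * 1e-6
print(f"predicted execution time: {predicted_s:.1f} s")
```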

  20. Hybrid lattice Boltzmann method on overlapping grids.

    Science.gov (United States)

    Di Ilio, G; Chiappini, D; Ubertini, S; Bella, G; Succi, S

    2017-01-01

    In this work, a hybrid lattice Boltzmann method (HLBM) is proposed, where the standard lattice Boltzmann implementation based on the Bhatnagar-Gross-Krook (LBGK) approximation is combined with an unstructured finite-volume lattice Boltzmann model. The method is constructed on an overlapping grid system, which allows the coexistence of a uniform lattice node spacing and a coordinate-free lattice structure. The natural adaptivity of the hybrid grid system makes the method particularly suitable for handling problems involving complex geometries. Moreover, the provided scheme ensures a high-accuracy solution near walls, given the capability of the unstructured submodel to achieve the desired level of refinement in a very flexible way. For these reasons, the HLBM represents a prospective tool for solving multiscale problems. The proposed method is here applied to the benchmark problem of a two-dimensional flow past a circular cylinder for a wide range of Reynolds numbers, and its numerical performance is measured and compared with that of the standard LBGK implementation.
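    For readers unfamiliar with the LBGK component that the hybrid method builds on, the sketch below shows a generic uniform-grid D2Q9 collision-and-streaming step in numpy. It is a textbook-style illustration with arbitrary parameters, not the authors' HLBM implementation, and it omits the unstructured finite-volume submodel.

```python
# Generic uniform-grid D2Q9 LBGK step (collision + streaming) in numpy; a textbook-style
# sketch of the standard lattice Boltzmann component, not the authors' hybrid HLBM code.
# Grid size, relaxation time and the initial density bump are arbitrary.
import numpy as np

nx, ny, tau = 64, 64, 0.6
c = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)])            # D2Q9 velocity set
w = np.array([4/9] + [1/9]*4 + [1/36]*4)                      # D2Q9 weights

rho0 = np.ones((nx, ny))
rho0[nx//2, ny//2] = 1.05                                     # small density perturbation
f = w[:, None, None] * rho0                                   # start at rest equilibrium

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*(ux**2 + uy**2))

def lbgk_step(f):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / tau              # BGK collision
    for i in range(9):                                        # streaming (periodic)
        f[i] = np.roll(f[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))
    return f

for _ in range(100):
    f = lbgk_step(f)
print("total mass after 100 steps:", round(float(f.sum()), 6))
```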

  1. INTEGRAL BENCHMARKS AVAILABLE THROUGH THE INTERNATIONAL REACTOR PHYSICS EXPERIMENT EVALUATION PROJECT AND THE INTERNATIONAL CRITICALITY SAFETY BENCHMARK EVALUATION PROJECT

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Lori Scott; Enrico Sartori; Yolanda Rugama

    2008-09-01

    Interest in high-quality integral benchmark data is increasing as efforts to quantify and reduce calculational uncertainties accelerate to meet the demands of next generation reactor and advanced fuel cycle concepts. The International Reactor Physics Experiment Evaluation Project (IRPhEP) and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) continue to expand their efforts and broaden their scope to identify, evaluate, and provide integral benchmark data for method and data validation. Benchmark model specifications provided by these two projects are used heavily by the international reactor physics, nuclear data, and criticality safety communities. Thus far, 14 countries have contributed to the IRPhEP, and 20 have contributed to the ICSBEP. The status of the IRPhEP and ICSBEP is discussed in this paper, and the future of the two projects is outlined and discussed. Selected benchmarks that have been added to the IRPhEP and ICSBEP handbooks since PHYSOR’06 are highlighted, and the future of the two projects is discussed.

  2. Ultralocality on the lattice

    CERN Document Server

    Campos, R G; Campos, Rafael G.; Tututi, Eduardo S.

    2002-01-01

    It is shown that the nonlocal Dirac operator yielded by a lattice model that preserves chiral symmetry and uniqueness of fields approaches an ultralocal, translation-invariant operator when the size of the lattice tends to zero.

  3. New integrable lattice hierarchies

    Energy Technology Data Exchange (ETDEWEB)

    Pickering, Andrew [Area de Matematica Aplicada, ESCET, Universidad Rey Juan Carlos, c/ Tulipan s/n, 28933 Mostoles, Madrid (Spain); Zhu Zuonong [Departamento de Matematicas, Universidad de Salamanca, Plaza de la Merced 1, 37008 Salamanca (Spain) and Department of Mathematics, Shanghai Jiao Tong University, Shanghai 200030 (China)]. E-mail: znzhu2@yahoo.com.cn

    2006-01-23

    In this Letter we give a new integrable four-field lattice hierarchy, associated to a new discrete spectral problem. We obtain our hierarchy as the compatibility condition of this spectral problem and an associated equation, constructed herein, for the time-evolution of eigenfunctions. We consider reductions of our hierarchy, which also of course admit discrete zero curvature representations, in detail. We find that our hierarchy includes many well-known integrable hierarchies as special cases, including the Toda lattice hierarchy, the modified Toda lattice hierarchy, the relativistic Toda lattice hierarchy, and the Volterra lattice hierarchy. We also obtain here a new integrable two-field lattice hierarchy, to which we give the name of Suris lattice hierarchy, since the first equation of this hierarchy has previously been given by Suris. The Hamiltonian structure of the Suris lattice hierarchy is obtained by means of a trace identity formula.

  4. Sober Topological Molecular Lattices

    Institute of Scientific and Technical Information of China (English)

    张德学; 李永明

    2003-01-01

    A topological molecular lattice (TML) is a pair (L, τ), where L is a completely distributive lattice and τ is a subframe of L. There is an obvious forgetful functor from the category TML of TMLs to the category Loc of locales. In this note, it is shown that this forgetful functor has a right adjoint. Then, by this adjunction, a special kind of topological molecular lattice, the sober topological molecular lattice, is introduced and investigated.

  5. Laguna Verde BWRs operational experience: steady-state fuel performance

    Energy Technology Data Exchange (ETDEWEB)

    Cuevas V, G. F.; Bravo S, J. M. [Global Nuclear Fuel - Americas, 3901 Castle Hayne Road, Wilmington, 28401 North Carolina (United States); Casillas, J. L., E-mail: gabriel.cuevas-vivas@gnf.co [General Electric Hitachi Nuclear Energy, 1989 Little Orchard St. Romm 239, San Jose, 95125 California (United States)

    2010-10-15

    The two BWRs at Laguna Verde nuclear power station are finishing 21 and 15 years of continuous successful operation as of 2010. During Unit 1 and 2 commercial operations only GE/GNF fuel designs have been employed; fuel lattice designs 8 x 8 and 10 x 10 were used at the reactor, with an original licensed thermal power (OLTP: 1931 MWt) and the reactor's first power up-rates of 5%. GNF fuel will also be used for the second EPU to reach 120% of OLTP in the near future. Thermal and gamma traversing in-core probes (TIP) are used for power monitoring purposes along with the GE (now GNF-A) core monitoring system, 3-dimensional Monicore{sup TM}. GNF-A has also participated by preparing the core management plan that is regularly fine-tuned in collaboration with Comision Federal de Electricidad (CFE, owner of the Laguna Verde reactors). For determination of thermal margins and eigenvalue prediction, GNF-A employs the NRC-licensed steady-state core simulator PANAC11. TIP comparisons are routinely used to adapt power distributions for a better thermal margin calculation. Over the years, several challenges have appeared in near- and long-term fuel management planning, such as increasing cycle length, optimization of the thermal margins, rated power increase, etc. Each challenge has been successfully overcome via operational strategy, code improvements and better fuel designs. This paper summarizes Laguna Verde Unit 1 and 2 steady-state performance from initial commercial operation, with a discussion of the nuclear and thermal-hydraulic design features, as well as of the operational strategies that set an interesting benchmark for future fuel applications, code development and operation of the BWRs. (Author)

  6. Core-shell Au-Pd nanoparticles as cathode catalysts for microbial fuel cell applications

    Science.gov (United States)

    Yang, Gaixiu; Chen, Dong; Lv, Pengmei; Kong, Xiaoying; Sun, Yongming; Wang, Zhongming; Yuan, Zhenhong; Liu, Hui; Yang, Jun

    2016-01-01

    Bimetallic nanoparticles with core-shell structures usually display enhanced catalytic properties due to the lattice strain created between the core and shell regions. In this study, we demonstrate the application of bimetallic Au-Pd nanoparticles with an Au core and a thin Pd shell as cathode catalysts in microbial fuel cells, which represent a promising technology for wastewater treatment, while directly generating electrical energy. Specifically, in comparison with hollow-structured Pt nanoparticles, a benchmark for electrocatalysis, the bimetallic core-shell Au-Pd nanoparticles are found to have superior activity and stability for the oxygen reduction reaction under neutral conditions due to the strong electronic interaction and lattice strain effect between the Au core and the Pd shell domains. The maximum power density generated in a membraneless single-chamber microbial fuel cell running on wastewater with core-shell Au-Pd as cathode catalysts is ca. 16.0 W m⁻³ and remains stable over 150 days, clearly illustrating the potential of core-shell nanostructures in the applications of microbial fuel cells. PMID:27734945

  7. Core-shell Au-Pd nanoparticles as cathode catalysts for microbial fuel cell applications

    Science.gov (United States)

    Yang, Gaixiu; Chen, Dong; Lv, Pengmei; Kong, Xiaoying; Sun, Yongming; Wang, Zhongming; Yuan, Zhenhong; Liu, Hui; Yang, Jun

    2016-10-01

    Bimetallic nanoparticles with core-shell structures usually display enhanced catalytic properties due to the lattice strain created between the core and shell regions. In this study, we demonstrate the application of bimetallic Au-Pd nanoparticles with an Au core and a thin Pd shell as cathode catalysts in microbial fuel cells, which represent a promising technology for wastewater treatment, while directly generating electrical energy. Specifically, in comparison with hollow-structured Pt nanoparticles, a benchmark for electrocatalysis, the bimetallic core-shell Au-Pd nanoparticles are found to have superior activity and stability for the oxygen reduction reaction under neutral conditions due to the strong electronic interaction and lattice strain effect between the Au core and the Pd shell domains. The maximum power density generated in a membraneless single-chamber microbial fuel cell running on wastewater with core-shell Au-Pd as cathode catalysts is ca. 16.0 W m⁻³ and remains stable over 150 days, clearly illustrating the potential of core-shell nanostructures in the applications of microbial fuel cells.

  8. Infinite resistive lattices

    NARCIS (Netherlands)

    Atkinson, D; van Steenwijk, F.J.

    The resistance between two arbitrary nodes in an infinite square lattice of identical resistors is calculated. The method is generalized to infinite triangular and hexagonal lattices in two dimensions, and also to infinite cubic and hypercubic lattices in three and more dimensions. (C) 1999 American
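    For the square lattice, the standard lattice Green's function expression for the two-point resistance can be checked numerically; the sketch below does this for 1-ohm resistors and reproduces the familiar value of 1/2 for adjacent nodes. The formula is the textbook result, not code taken from the cited paper.

```python
# Numerical check of the textbook lattice Green's function expression for the resistance
# between node (0, 0) and node (m, n) of an infinite square lattice of 1-ohm resistors:
#   R(m, n) = 1/(4*pi^2) * Int Int [1 - cos(m*x + n*y)] / (2 - cos x - cos y) dx dy
# over [-pi, pi]^2. This is a generic sketch, not code from the cited paper.
import numpy as np
from scipy.integrate import dblquad

def integrand(y, x, m, n):
    denom = 2.0 - np.cos(x) - np.cos(y)
    if denom < 1e-12:          # removable 0/0 at the zone centre; the integrand is bounded
        return 0.0
    return (1.0 - np.cos(m*x + n*y)) / denom

def resistance(m, n):
    val, _ = dblquad(lambda y, x: integrand(y, x, m, n),
                     -np.pi, np.pi, lambda x: -np.pi, lambda x: np.pi)
    return val / (4.0 * np.pi**2)

print(resistance(1, 0))   # adjacent nodes: expected 0.5
print(resistance(1, 1))   # diagonal neighbours: expected 2/pi ~ 0.6366
```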

  9. Lattice Regularization and Symmetries

    CERN Document Server

    Hasenfratz, Peter; Von Allmen, R; Allmen, Reto von; Hasenfratz, Peter; Niedermayer, Ferenc

    2006-01-01

    Finding the relation between the symmetry transformations in the continuum and on the lattice might be a nontrivial task as illustrated by the history of chiral symmetry. Lattice actions induced by a renormalization group procedure inherit all symmetries of the continuum theory. We give a general procedure which gives the corresponding symmetry transformations on the lattice.

  10. Benchmarking of human resources management

    Directory of Open Access Journals (Sweden)

    David M. Akinnusi

    2008-12-01

    Full Text Available This paper reviews the role of human resource management (HRM which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.

  11. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  12. A benchmark for non-covalent interactions in solids.

    Science.gov (United States)

    Otero-de-la-Roza, A; Johnson, Erin R

    2012-08-01

    A benchmark for non-covalent interactions in solids (C21) based on the experimental sublimation enthalpies and geometries of 21 molecular crystals is presented. Thermal and zero-point effects are carefully accounted for and reference lattice energies and thermal pressures are provided, which allow dispersion-corrected density functionals to be assessed in a straightforward way. Other thermal corrections to the sublimation enthalpy (the 2RT term) are reexamined. We compare the recently implemented exchange-hole dipole moment (XDM) model with other approaches in the literature to find that XDM roughly doubles the accuracy of DFT-D2 and non-local functionals in computed lattice energies (4.8 kJ/mol mean absolute error) while, at the same time, predicting cell geometries within less than 2% of the experimental result on average. The XDM model of dispersion interactions is confirmed as a very promising approach in solid-state applications.

  13. Safety, codes and standards for hydrogen installations. Metrics development and benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Harris, Aaron P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dedrick, Daniel E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Angela Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); San Marchi, Christopher W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-04-01

    Automakers and fuel providers have made public commitments to commercialize light duty fuel cell electric vehicles and fueling infrastructure in select US regions beginning in 2014. The development, implementation, and advancement of meaningful codes and standards is critical to enable the effective deployment of clean and efficient fuel cell and hydrogen solutions in the energy technology marketplace. Metrics pertaining to the development and implementation of safety knowledge, codes, and standards are important to communicate progress and inform future R&D investments. This document describes the development and benchmarking of metrics specific to the development of hydrogen specific codes relevant for hydrogen refueling stations. These metrics will be most useful as the hydrogen fuel market transitions from pre-commercial to early-commercial phases. The target regions in California will serve as benchmarking case studies to quantify the success of past investments in research and development supporting safety codes and standards R&D.

  14. Randomized benchmarking of multiqubit gates.

    Science.gov (United States)

    Gaebler, J P; Meier, A M; Tan, T R; Bowler, R; Lin, Y; Hanneke, D; Jost, J D; Home, J P; Knill, E; Leibfried, D; Wineland, D J

    2012-06-29

    We describe an extension of single-qubit gate randomized benchmarking that measures the error of multiqubit gates in a quantum information processor. This platform-independent protocol evaluates the performance of Clifford unitaries, which form a basis of fault-tolerant quantum computing. We implemented the benchmarking protocol with trapped ions and found an error per random two-qubit Clifford unitary of 0.162±0.008, thus setting the first benchmark for such unitaries. By implementing a second set of sequences with an extra two-qubit phase gate inserted after each step, we extracted an error per phase gate of 0.069±0.017. We conducted these experiments with transported, sympathetically cooled ions in a multizone Paul trap, a system that can in principle be scaled to larger numbers of ions.
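    In randomized benchmarking, the reported error per Clifford is extracted from the decay of sequence fidelity with sequence length. The sketch below fits synthetic data to the usual decay model; it is illustrative only and is not the experimental analysis code.

```python
# Illustrative randomized-benchmarking analysis (synthetic data, not the experiment):
# fit the average sequence fidelity to F(m) = A*p**m + B and convert the decay
# parameter to an error per Clifford via r = (d - 1)*(1 - p)/d, with d = 4 for two qubits.
import numpy as np
from scipy.optimize import curve_fit

def decay(m, A, B, p):
    return A * p**m + B

lengths = np.array([1, 2, 4, 8, 16, 32, 64])
p_true = 1 - 4/3 * 0.162                                  # decay consistent with r ~ 0.162
rng = np.random.default_rng(1)
fidelity = decay(lengths, 0.75, 0.25, p_true) + rng.normal(0, 0.005, lengths.size)

(A, B, p), _ = curve_fit(decay, lengths, fidelity, p0=(0.75, 0.25, 0.8))
d = 4                                                     # two-qubit Hilbert-space dimension
print(f"fitted p = {p:.3f}, error per two-qubit Clifford = {(d - 1)*(1 - p)/d:.3f}")
```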

  15. Wilson Dslash Kernel From Lattice QCD Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Balint [Jefferson Lab, Newport News, VA; Smelyanskiy, Mikhail [Parallel Computing Lab, Intel Corporation, California, USA; Kalamkar, Dhiraj D. [Parallel Computing Lab, Intel Corporation, India; Vaidyanathan, Karthikeyan [Parallel Computing Lab, Intel Corporation, India

    2015-07-01

    Lattice Quantum Chromodynamics (LQCD) is a numerical technique used for calculations in theoretical nuclear and high-energy physics. LQCD is traditionally one of the first applications ported to many new high performance computing architectures, and indeed LQCD practitioners have been known to design and build custom LQCD computers. Lattice QCD kernels are frequently used as benchmarks (e.g. 168.wupwise in the SPEC suite) and are generally well understood, and as such are ideal for illustrating several optimization techniques. In this chapter we detail our work in optimizing the Wilson-Dslash kernels for the Intel Xeon Phi; however, as we will show, the technique gives excellent performance on the regular Xeon architecture as well.

  16. Radiation Detection Computational Benchmark Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operation performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations were assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for

  17. Perceptual hashing algorithms benchmark suite

    Institute of Scientific and Technical Information of China (English)

    Zhang Hui; Schmucker Martin; Niu Xiamu

    2007-01-01

    Numerous perceptual hashing algorithms have been developed for identification and verification of multimedia objects in recent years. Many application schemes have been adopted for various commercial objects. Developers and users are looking for a benchmark tool to compare and evaluate their current algorithms or technologies. In this paper, a novel benchmark platform is presented. PHABS provides an open framework and lets its users define their own test strategy, perform tests, collect and analyze test data. With PHABS, various performance parameters of algorithms can be tested, and different algorithms or algorithms with different parameters can be evaluated and compared easily.
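    A benchmark platform of this kind ultimately automates many pairwise hash comparisons. The sketch below shows the basic building block, a normalized Hamming distance with a match threshold, using invented hash values and an arbitrary threshold.

```python
# Basic building block of a perceptual-hash benchmark: score hash pairs by normalized
# Hamming distance against a match threshold. Hash values and threshold are invented.
def hamming_bits(a: int, b: int, nbits: int = 64) -> float:
    """Normalized Hamming distance between two nbits-wide hashes."""
    return bin((a ^ b) & ((1 << nbits) - 1)).count("1") / nbits

original   = 0xF0E1D2C3B4A59687        # hypothetical 64-bit perceptual hash
compressed = 0xF0E1D2C3B4A59787        # hash of a slightly distorted copy (1 bit differs)
unrelated  = 0x0123456789ABCDEF        # hash of unrelated content

threshold = 0.10
for name, h in [("compressed copy", compressed), ("unrelated image", unrelated)]:
    d = hamming_bits(original, h)
    print(f"{name}: distance = {d:.3f} -> {'match' if d <= threshold else 'no match'}")
```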

  18. Closed-loop neuromorphic benchmarks

    CSIR Research Space (South Africa)

    Stewart

    2015-11-01

    Full Text Available Closed-loop Neuromorphic Benchmarks. Terrence C. Stewart, Travis DeWolf and Chris Eliasmith (University of Waterloo, Canada) and Ashley Kleinhans (Council for Scientific and Industrial Research, South Africa). Submitted to Frontiers in Neuroscience.

  19. The contextual benchmark method: benchmarking e-government services

    NARCIS (Netherlands)

    Jansen, Jurjen; Vries, de Sjoerd; Schaik, van Paul

    2010-01-01

    This paper offers a new method for benchmarking e-Government services. Government organizations no longer doubt the need to deliver their services on line. Instead, the question that is more relevant is how well the electronic services offered by a particular organization perform in comparison with

  20. Benchmarking Internet of Things devices

    CSIR Research Space (South Africa)

    Kruger, CP

    2014-07-01

    Full Text Available Benchmarking Internet of Things devices. C.P. Kruger and G.P. Hancke, Advanced Sensor Networks Research Group, Council for Scientific and Industrial Research, South Africa. Presented at the International Conference on Industrial Informatics (INDIN), 27-30 July 2014.

  1. Benchmarked Library Websites Comparative Study

    KAUST Repository

    Ramli, Rindra M.

    2015-01-01

    This presentation provides an analysis of services provided by the benchmarked library websites. The exploratory study includes comparison of these websites against a list of criterion and presents a list of services that are most commonly deployed by the selected websites. In addition to that, the investigators proposed a list of services that could be provided via the KAUST library website.

  2. Engine Benchmarking - Final CRADA Report

    Energy Technology Data Exchange (ETDEWEB)

    Wallner, Thomas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    Detailed benchmarking of the powertrains of three light-duty vehicles was performed. Results were presented and provided to CRADA partners. The vehicles included a MY2011 Audi A4, a MY2012 Mini Cooper and a MY2014 Nissan Versa.

  3. Benchmarking Universiteitsvastgoed: Managementinformatie bij vastgoedbeslissingen

    NARCIS (Netherlands)

    Den Heijer, A.C.; De Vries, J.C.

    2004-01-01

    This is the final report of the study "Benchmarking universiteitsvastgoed" (benchmarking university real estate). The report combines two component products: the theory report (published in December 2003) and the practice report (published in January 2004). Topics in the theory part include the analysis of other

  4. Benchmark Lisp And Ada Programs

    Science.gov (United States)

    Davis, Gloria; Galant, David; Lim, Raymond; Stutz, John; Gibson, J.; Raghavan, B.; Cheesema, P.; Taylor, W.

    1992-01-01

    Suite of nonparallel benchmark programs, ELAPSE, designed for three tests: comparing efficiency of computer processing via Lisp vs. Ada; comparing efficiencies of several computers processing via Lisp; or comparing several computers processing via Ada. Tests the efficiency with which a computer executes routines in each language. Available for computers equipped with a validated Ada compiler and/or Common Lisp system.

  5. 42 CFR 440.385 - Delivery of benchmark and benchmark-equivalent coverage through managed care entities.

    Science.gov (United States)

    2010-10-01

    42 Public Health 4 (2010-10-01): General Provisions, Benchmark Benefit and Benchmark-Equivalent Coverage. § 440.385 Delivery of benchmark and benchmark-equivalent coverage through managed care entities. In implementing benchmark or...

  6. Calculations for a BWR Lattice with Adjacent Gadolinium Pins Using the Monte Carlo Cell Code Serpent v.1.1.7

    Directory of Open Access Journals (Sweden)

    Diego Ferraro

    2011-01-01

    Full Text Available Monte Carlo neutron transport codes are usually used to perform criticality calculations and to solve shielding problems due to their capability to model complex systems without major approximations. However, these codes demand high computational resources. The improvement in computer capabilities leads to several new applications of Monte Carlo neutron transport codes. An interesting one is to use this method to perform cell-level fuel assembly calculations in order to obtain few-group constants to be used in core calculations. In the present work, Serpent v.1.1.7, the cell-oriented neutronic calculation code recently developed by VTT, is used to perform cell calculations of a theoretical BWR lattice benchmark with burnable poisons, and the main results are compared with previously reported results and with calculations performed with Condor v.2.61, INVAP's collision-probability cell code.
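    The phrase "few-group constants" refers to collapsing multigroup cross sections with the flux computed by the lattice code. A minimal sketch of a flux-weighted collapse is given below with invented four-group numbers; real lattice codes such as Serpent do this over many groups and combine it with spatial homogenization.

```python
# Minimal sketch of a flux-weighted group collapse, the operation behind "few-group
# constants". The four-group numbers are invented; lattice codes such as Serpent do
# this over many groups and combine it with spatial homogenization.
import numpy as np

flux      = np.array([2.0e14, 1.5e14, 0.8e14, 2.5e14])   # group fluxes (arbitrary units)
sigma_abs = np.array([0.010, 0.015, 0.060, 0.120])        # multigroup absorption XS (1/cm)
group_map = [0, 0, 1, 1]                                  # collapse 4 groups into 2

collapsed = []
for g in sorted(set(group_map)):
    idx = [i for i, gg in enumerate(group_map) if gg == g]
    collapsed.append(float(np.sum(sigma_abs[idx] * flux[idx]) / np.sum(flux[idx])))

print("two-group absorption cross sections (1/cm):", collapsed)
```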

  7. Jammed lattice sphere packings

    OpenAIRE

    Kallus, Yoav; Marcotte, Étienne; Torquato, Salvatore

    2013-01-01

    We generate and study an ensemble of isostatic jammed hard-sphere lattices. These lattices are obtained by compression of a periodic system with an adaptive unit cell containing a single sphere until the point of mechanical stability. We present detailed numerical data about the densities, pair correlations, force distributions, and structure factors of such lattices. We show that this model retains many of the crucial structural features of the classical hard-sphere model and propose it as a...

  8. On Traveling Waves in Lattices: The Case of Riccati Lattices

    Science.gov (United States)

    Dimitrova, Zlatinka

    2012-09-01

    The method of simplest equation is applied to the analysis of a class of lattices described by differential-difference equations that admit traveling-wave solutions constructed on the basis of the solution of the Riccati equation. We denote such lattices as Riccati lattices. We search for Riccati lattices within two classes of lattices: generalized Lotka-Volterra lattices and generalized Holling lattices. We show that from the class of generalized Lotka-Volterra lattices only the Wadati lattice belongs to the class of Riccati lattices. In contrast, many lattices from the Holling class are Riccati lattices. We construct exact traveling wave solutions on the basis of the solution of the Riccati equation for three members of the class of generalized Holling lattices.

  9. Twisted mass lattice QCD

    Energy Technology Data Exchange (ETDEWEB)

    Shindler, A. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC

    2007-07-15

    I review the theoretical foundations, properties as well as the simulation results obtained so far of a variant of the Wilson lattice QCD formulation: Wilson twisted mass lattice QCD. Emphasis is put on the discretization errors and on the effects of these discretization errors on the phase structure for Wilson-like fermions in the chiral limit. The possibility to use in lattice simulations different lattice actions for sea and valence quarks to ease the renormalization patterns of phenomenologically relevant local operators, is also discussed. (orig.)

  10. Sensitivity of MCNP5 calculations for a spherical numerical benchmark problem to the angular scattering distributions for deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Kozier, K. S. [Atomic Energy of Canada Limited, Chalk River Laboratories, Chalk River, Ont. K0J 1J0 (Canada)

    2006-07-01

    This paper examines the sensitivity of MCNP5 k{sub eff} results to various deuterium data files for a simple benchmark problem consisting of an 8.4-cm radius sphere of uranium surrounded by an annulus of deuterium at the nuclide number density corresponding to heavy water. This study was performed to help clarify why {Delta}k{sub eff} values of about 10 mk are obtained when different ENDF/B deuterium data files are used in simulations of critical experiments involving solutions of high-enrichment uranyl fluoride in heavy water, while simulations of low-leakage, heterogeneous critical lattices of natural-uranium fuel rods in heavy water show differences of <1 mk. The benchmark calculations were performed as a function of deuterium reflector thickness for several uranium compositions using deuterium ACE files derived from ENDF/B-VII.b1 (release beta 1), ENDF/B-VI.4 and JENDL-3.3, which differ primarily in the energy/angle distributions for elastic scattering <3.2 MeV. Calculations were also performed using modified ACE files having equiprobable cosine bin values in the centre-of-mass reference frame in a progressive manner with increasing energy. It was found that the {Delta}k{sub eff} values increased with deuterium reflector thickness and uranium enrichment. The studies using modified ACE files indicate that most of the reactivity differences arise at energies <1 MeV; hence, this energy range should be given priority if new scattering distribution measurements are undertaken. (authors)
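    The modified ACE files mentioned above represent the elastic-scattering angular distribution as equiprobable cosine bins. The sketch below shows how a scattering cosine is sampled from such a representation; the bin boundaries are invented and only eight bins are used, whereas ACE angular distributions typically carry 32 equiprobable bins per incident energy.

```python
# Sketch of sampling a scattering cosine from an equiprobable-bin representation:
# choose one of the N bins uniformly, then sample uniformly within it. The eight bin
# boundaries below are invented; ACE files typically carry 32 bins per incident energy.
import random

bin_edges = [-1.0, -0.55, -0.20, 0.05, 0.27, 0.48, 0.66, 0.84, 1.0]   # hypothetical

def sample_mu(edges, rng=random):
    i = rng.randrange(len(edges) - 1)            # every bin carries equal probability
    return rng.uniform(edges[i], edges[i + 1])   # uniform within the chosen bin

random.seed(0)
print(["%.3f" % sample_mu(bin_edges) for _ in range(5)])
```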

  11. Benchmarking clinical photography services in the NHS.

    Science.gov (United States)

    Arbon, Giles

    2015-01-01

    Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.

  12. IAEA CRP on HTGR Uncertainties in Modeling: Assessment of Phase I Lattice to Core Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I 2c and the use of the cross section data in Exercise II 1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best estimate results obtained for Exercise I 2a (fresh single-fuel block), Exercise I 2b (depleted single-fuel block), and Exercise I 2c (super cell) in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two-dimensional deterministic code known as the New ESC-based Weighting Transport (NEWT), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross section evaluation, and the results obtained were compared to those of the three-dimensional stochastic SCALE module KENO VI. The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise

  13. Benchmarking: Achieving the best in class

    Energy Technology Data Exchange (ETDEWEB)

    Kaemmerer, L

    1996-05-01

    Oftentimes, people find the process of organizational benchmarking an onerous task, or, because they do not fully understand the nature of the process, end up with results that are less than stellar. This paper presents the challenges of benchmarking and reasons why benchmarking can benefit an organization in today's economy.

  14. The LDBC Social Network Benchmark: Interactive Workload

    NARCIS (Netherlands)

    Erling, O.; Averbuch, A.; Larriba-Pey, J.; Chafi, H.; Gubichev, A.; Prat, A.; Pham, M.D.; Boncz, P.A.

    2015-01-01

    The Linked Data Benchmark Council (LDBC) is now two years underway and has gathered strong industrial participation for its mission to establish benchmarks and benchmarking practices for evaluating graph data management systems. The LDBC introduced a new choke-point driven methodology for developing

  15. How Benchmarking and Higher Education Came Together

    Science.gov (United States)

    Levy, Gary D.; Ronco, Sharron L.

    2012-01-01

    This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…

  16. Fuel Cell Technology Status Analysis Project: Partnership Opportunities

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-13

    Fact sheet describing the National Renewable Energy Laboratory's (NREL's) Fuel Cell Technology Status Analysis Project. NREL is seeking fuel cell industry partners from the United States and abroad to participate in an objective and credible analysis of commercially available fuel cell products to benchmark the current state of the technology and support industry growth.

  17. Methodology for Benchmarking IPsec Gateways

    Directory of Open Access Journals (Sweden)

    Adam Tisovský

    2012-08-01

    Full Text Available The paper analyses the forwarding performance of an IPsec gateway over the range of offered loads. It focuses on the forwarding rate and packet loss, particularly at the gateway’s performance peak and in the state of gateway overload. It explains possible performance degradation when the gateway is overloaded by excessive offered load. The paper further evaluates different approaches for obtaining forwarding performance parameters: the widely used throughput described in RFC 1242, the maximum forwarding rate with zero packet loss, and our proposed equilibrium throughput. According to our observations, equilibrium throughput might be the most universal parameter for benchmarking security gateways, as the others may be dependent on the duration of test trials. Employing equilibrium throughput would also greatly shorten the time required for benchmarking. Lastly, the paper presents a methodology and a hybrid step/binary search algorithm for obtaining the value of the equilibrium throughput.
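    A hedged sketch of the step/binary search idea is given below: step up the offered load until loss appears, then bisect the bracket. The measure_loss function is a stand-in for a real traffic-generator trial, and this search targets the zero-loss rate rather than the paper's equilibrium throughput.

```python
# Hedged sketch of a step/binary load search. `measure_loss` stands in for a real
# traffic-generator trial and simply models an overload knee at 400 Mbit/s; the search
# below targets the zero-loss rate, not the paper's equilibrium throughput.
def measure_loss(offered_mbps: float) -> float:
    capacity = 400.0                                    # hypothetical gateway capacity
    return max(0.0, (offered_mbps - capacity) / offered_mbps)

def find_zero_loss_rate(start=50.0, step=100.0, resolution=1.0):
    load = start
    while measure_loss(load + step) == 0.0:             # step phase: climb until loss appears
        load += step
    lo, hi = load, load + step
    while hi - lo > resolution:                         # binary phase: narrow the bracket
        mid = (lo + hi) / 2.0
        if measure_loss(mid) == 0.0:
            lo = mid
        else:
            hi = mid
    return lo

print(f"estimated zero-loss forwarding rate: {find_zero_loss_rate():.1f} Mbit/s")
```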

  18. Geothermal Heat Pump Benchmarking Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1997-01-17

    A benchmarking study was conducted on behalf of the Department of Energy to determine the critical factors in successful utility geothermal heat pump programs. A successful program is one that has achieved significant market penetration. Successfully marketing geothermal heat pumps has presented some major challenges to the utility industry. However, select utilities have developed programs that generate significant GHP sales. This benchmarking study concludes that there are three factors critical to the success of utility GHP marketing programs: (1) Top management marketing commitment; (2) An understanding of the fundamentals of marketing and business development; and (3) An aggressive competitive posture. To generate significant GHP sales, competitive market forces must be used. However, because utilities have functioned only in a regulated arena, these companies and their leaders are unschooled in competitive business practices. Therefore, a lack of experience coupled with an intrinsically non-competitive culture yields an industry environment that impedes the generation of significant GHP sales in many, but not all, utilities.

  19. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection.
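    As a simplified illustration of what forward selection does (greedy addition of the variable that most improves held-out error), the sketch below uses ordinary least squares on synthetic data; the benchmark in the paper instead wraps random forest models and real QSAR descriptor sets.

```python
# Simplified forward-selection sketch on synthetic data using ordinary least squares;
# the benchmark in the paper instead wraps random forest models and real QSAR data sets.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                            # 10 candidate descriptors
y = 2.0*X[:, 3] - 1.5*X[:, 7] + rng.normal(0, 0.5, 200)   # only descriptors 3 and 7 matter

def holdout_mse(cols):
    train, test = slice(0, 150), slice(150, 200)
    coef, *_ = np.linalg.lstsq(X[train][:, cols], y[train], rcond=None)
    resid = y[test] - X[test][:, cols] @ coef
    return float(np.mean(resid**2))

selected, remaining, best = [], list(range(10)), np.inf
while remaining:
    scores = {j: holdout_mse(selected + [j]) for j in remaining}
    j_best = min(scores, key=scores.get)
    if scores[j_best] >= best:                            # stop when no variable helps
        break
    best = scores[j_best]
    selected.append(j_best)
    remaining.remove(j_best)

print("selected descriptors:", selected, "holdout MSE:", round(best, 3))
```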

  20. OECD/NEA burnup credit calculational criticality benchmark Phase I-B results

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.; Parks, C.V. [Oak Ridge National Lab., TN (United States); Brady, M.C. [Sandia National Labs., Las Vegas, NV (United States)

    1996-06-01

    In most countries, criticality analysis of LWR fuel stored in racks and casks has assumed that the fuel is fresh with the maximum allowable initial enrichment. This assumption has led to the design of widely spaced and/or highly poisoned storage and transport arrays. If credit is assumed for fuel burnup, initial enrichment limitations can be raised in existing systems, and more compact and economical arrays can be designed. Such reliance on the reduced reactivity of spent fuel for criticality control is referred to as burnup credit. The Burnup Credit Working Group, formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development, has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods agree to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods agree to within 11% of the average for all fission products studied. Most deviations are less than 10%, and many are less than 5%. The exceptions are Sm 149, Sm 151, and Gd 155.
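
    The kind of intercomparison statistic quoted above can be illustrated with a small sketch in which each participant's calculated nuclide concentration is expressed as a percentage deviation from the all-participant average; the numbers are placeholders, not benchmark data.

    ```python
    import numpy as np

    # Hypothetical concentrations (g/tU) of one nuclide reported by 5 participants
    reported = np.array([512.0, 498.0, 530.0, 505.0, 491.0])

    average = reported.mean()
    percent_deviation = 100.0 * (reported - average) / average

    for participant, dev in enumerate(percent_deviation, start=1):
        print(f"participant {participant}: {dev:+.1f} % from the average")
    ```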

  1. Nuclear lattice simulations

    Directory of Open Access Journals (Sweden)

    Epelbaum E.

    2010-04-01

    Full Text Available We review recent progress on nuclear lattice simulations using chiral effective field theory. We discuss lattice results for dilute neutron matter at next-to-leading order, three-body forces at next-to-next-to-leading order, isospin-breaking and Coulomb effects, and the binding energy of light nuclei.

  2. A Benchmark for Management Effectiveness

    OpenAIRE

    Zimmermann, Bill; Chanaron, Jean-Jacques; Klieb, Leslie

    2007-01-01

    International audience; This study presents a tool to gauge managerial effectiveness in the form of a questionnaire that is easy to administer and score. The instrument covers eight distinct areas of organisational climate and culture of management inside a company or department. Benchmark scores were determined by administering sample-surveys to a wide cross-section of individuals from numerous firms in Southeast Louisiana, USA. Scores remained relatively constant over a seven-year timeframe...

  3. Restaurant Energy Use Benchmarking Guideline

    Energy Technology Data Exchange (ETDEWEB)

    Hedrick, R.; Smith, V.; Field, K.

    2011-07-01

    A significant operational challenge for food service operators is defining energy use benchmark metrics to compare against the performance of individual stores. Without metrics, multiunit operators and managers have difficulty identifying which stores in their portfolios require extra attention to bring their energy performance in line with expectations. This report presents a method whereby multiunit operators may use their own utility data to create suitable metrics for evaluating their operations.

  4. Thermal Performance Benchmarking: Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Moreno, Gilbert

    2016-04-08

    The goal for this project is to thoroughly characterize the performance of state-of-the-art (SOA) automotive power electronics and electric motor thermal management systems. Information obtained from these studies will be used to: Evaluate advantages and disadvantages of different thermal management strategies; establish baseline metrics for the thermal management systems; identify methods of improvement to advance the SOA; increase the publicly available information related to automotive traction-drive thermal management systems; help guide future electric drive technologies (EDT) research and development (R&D) efforts. The performance results combined with component efficiency and heat generation information obtained by Oak Ridge National Laboratory (ORNL) may then be used to determine the operating temperatures for the EDT components under drive-cycle conditions. In FY15, the 2012 Nissan LEAF power electronics and electric motor thermal management systems were benchmarked. Testing of the 2014 Honda Accord Hybrid power electronics thermal management system started in FY15; however, due to time constraints it was not possible to include results for this system in this report. The focus of this project is to benchmark the thermal aspects of the systems. ORNL's benchmarking of electric and hybrid electric vehicle technology reports provide detailed descriptions of the electrical and packaging aspects of these automotive systems.

  5. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  6. HS06 Benchmark for an ARM Server

    CERN Document Server

    Kluth, Stefan

    2013-01-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  7. Thermal Analysis of a TREAT Fuel Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Papadias, Dionissios [Argonne National Lab. (ANL), Argonne, IL (United States); Wright, Arthur E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-07-09

    The objective of this study was to explore options for reducing peak cladding temperatures, even at the cost of an increase in peak fuel temperatures. A 3D thermal-hydraulic model of a single TREAT fuel assembly was benchmarked to reproduce results obtained with previous thermal models developed for a TREAT HEU fuel assembly. In exercising this model, and variants thereof depending on the scope of analysis, various options were explored to reduce the peak cladding temperatures.

  8. Argonne Code Center: Benchmark problem book.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1977-06-01

    This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. This is the second supplement to the original benchmark book, which was first published in February 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, which was published in December 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement, Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below, followed by the contributors to the earlier editions of the benchmark book.

  9. Large Core Code Evaluation Working Group Benchmark Problem Four: neutronics and burnup analysis of a large heterogeneous fast reactor. Part 1. Analysis of benchmark results. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Cowan, C.L.; Protsik, R.; Lewellen, J.W. (eds.)

    1984-01-01

    The Large Core Code Evaluation Working Group Benchmark Problem Four was specified to provide a stringent test of the current methods which are used in the nuclear design and analyses process. The benchmark specifications provided a base for performing detailed burnup calculations over the first two irradiation cycles for a large heterogeneous fast reactor. Particular emphasis was placed on the techniques for modeling the three-dimensional benchmark geometry, and sensitivity studies were carried out to determine the performance parameter sensitivities to changes in the neutronics and burnup specifications. The results of the Benchmark Four calculations indicated that a linked RZ-XY (Hex) two-dimensional representation of the benchmark model geometry can be used to predict mass balance data, power distributions, regionwise fuel exposure data and burnup reactivities with good accuracy when compared with the results of direct three-dimensional computations. Most of the small differences in the results of the benchmark analyses by the different participants were attributed to ambiguities in carrying out the regionwise flux renormalization calculations throughout the burnup step.

  10. Classical Logic and Quantum Logic with Multiple and Common Lattice Models

    Directory of Open Access Journals (Sweden)

    Mladen Pavičić

    2016-01-01

    Full Text Available We consider a proper propositional quantum logic and show that it has multiple disjoint lattice models, only one of which is an orthomodular lattice (the algebra underlying Hilbert (quantum) space). We give an equivalent proof for classical logic, which turns out to have disjoint distributive and nondistributive ortholattices. In particular, we prove that both classical logic and quantum logic are sound and complete with respect to each of these lattices. We also show that there is one common nonorthomodular lattice that is a model of both quantum and classical logic. In technical terms, that enables us to run the same classical logic on both a digital (standard, two-subset, 0-1-bit) computer and a nondigital (say, six-subset) computer (with appropriate chips and circuits). With quantum logic, the same six-element common lattice can serve as a benchmark for an efficient evaluation of equations of bigger lattice models or theorems of the logic.
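
    The idea of using a small lattice as a benchmark for checking equations can be illustrated with a brute-force sketch: a six-element lattice (here the "benzene ring" lattice O6, chosen purely for illustration and not necessarily the lattice of the paper) is encoded by its order relation, meets and joins are computed from it, and the distributive law is tested on all triples.

    ```python
    from itertools import product

    # Six-element "benzene ring" lattice O6: 0 < a < c < 1 and 0 < b < d < 1,
    # with {a, c} incomparable to {b, d}.  up[x] is the set of elements >= x.
    elements = ["0", "a", "b", "c", "d", "1"]
    up = {"0": set(elements),
          "a": {"a", "c", "1"}, "c": {"c", "1"},
          "b": {"b", "d", "1"}, "d": {"d", "1"},
          "1": {"1"}}

    def leq(x, y):
        return y in up[x]

    def join(x, y):                     # least upper bound
        ubs = [z for z in elements if leq(x, z) and leq(y, z)]
        return next(z for z in ubs if all(leq(z, w) for w in ubs))

    def meet(x, y):                     # greatest lower bound
        lbs = [z for z in elements if leq(z, x) and leq(z, y)]
        return next(z for z in lbs if all(leq(w, z) for w in lbs))

    # Benchmark-style brute-force check of the distributive law on all triples
    violations = [(x, y, z) for x, y, z in product(elements, repeat=3)
                  if meet(x, join(y, z)) != join(meet(x, y), meet(x, z))]
    print(f"distributivity fails on {len(violations)} of {6**3} triples,"
          f" e.g. {violations[0]}")
    ```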

  11. Benchmarks

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  12. Modified Lattice Landau Gauge

    CERN Document Server

    Von Smekal, L; Sternbeck, A; Williams, A G

    2007-01-01

    We propose a modified lattice Landau gauge based on stereographically projecting the link variables on the circle S^1 -> R for compact U(1) or the 3-sphere S^3 -> R^3 for SU(2) before imposing the Landau gauge condition. This can reduce the number of Gribov copies exponentially and solves the Gribov problem in compact U(1) where it is a lattice artifact. Applied to the maximal Abelian subgroup this might be just enough to avoid the perfect cancellation amongst the Gribov copies in a lattice BRST formulation for SU(N), and thus to avoid the Neuberger 0/0 problem. The continuum limit of the Landau gauge remains unchanged.

  13. Jammed lattice sphere packings.

    Science.gov (United States)

    Kallus, Yoav; Marcotte, Étienne; Torquato, Salvatore

    2013-12-01

    We generate and study an ensemble of isostatic jammed hard-sphere lattices. These lattices are obtained by compression of a periodic system with an adaptive unit cell containing a single sphere until the point of mechanical stability. We present detailed numerical data about the densities, pair correlations, force distributions, and structure factors of such lattices. We show that this model retains many of the crucial structural features of the classical hard-sphere model and propose it as a model for the jamming and glass transitions that enables exploration of much higher dimensions than are usually accessible.

  15. PageRank Pipeline Benchmark: Proposal for a Holistic System Benchmark for Big-Data Platforms

    CERN Document Server

    Dreher, Patrick; Hill, Chris; Gadepally, Vijay; Kuszmaul, Bradley; Kepner, Jeremy

    2016-01-01

    The rise of big data systems has created a need for benchmarks to measure and compare the capabilities of these systems. Big data benchmarks present unique scalability challenges. The supercomputing community has wrestled with these challenges for decades and developed methodologies for creating rigorous scalable benchmarks (e.g., HPC Challenge). The proposed PageRank pipeline benchmark employs supercomputing benchmarking methodologies to create a scalable benchmark that is reflective of many real-world big data processing systems. The PageRank pipeline benchmark builds on existing prior scalable benchmarks (Graph500, Sort, and PageRank) to create a holistic benchmark with multiple integrated kernels that can be run together or independently. Each kernel is well defined mathematically and can be implemented in any programming environment. The linear algebraic nature of PageRank makes it well suited to being implemented using the GraphBLAS standard. The computations are simple enough that performance predictio...

  16. NASA Software Engineering Benchmarking Effort

    Science.gov (United States)

    Godfrey, Sally; Rarick, Heather

    2012-01-01

    Benchmarking was very interesting and provided a wealth of information: (1) we did see potential solutions to some of our "top 10" issues, and (2) we have an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes, and (2) many of the organizations were interested in future collaboration, such as sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/partners: (1) they expressed a desire to participate in our training and to provide feedback on procedures, and (2) they welcomed the opportunity to provide feedback on working with NASA.

  17. Benchmarking of human resources management

    OpenAIRE

    David M. Akinnusi

    2008-01-01

    This paper reviews the role of human resource management (HRM) which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HR...

  18. Lattice Gerbe Theory

    CERN Document Server

    Lipstein, Arthur E

    2014-01-01

    We formulate the theory of a 2-form gauge field on a Euclidean spacetime lattice. In this approach, the fundamental degrees of freedom live on the faces of the lattice, and the action can be constructed from the sum over Wilson surfaces associated with each fundamental cube of the lattice. If we take the gauge group to be $U(1)$, the theory reduces to the well-known abelian gerbe theory in the continuum limit. We also propose a very simple and natural non-abelian generalization with gauge group $U(N) \\times U(N)$, which gives rise to $U(N)$ Yang-Mills theory upon dimensional reduction. Formulating the theory on a lattice has several other advantages. In particular, it is possible to compute many observables, such as the expectation value of Wilson surfaces, analytically at strong coupling and numerically for any value of the coupling.

  19. Root lattices and quasicrystals

    Science.gov (United States)

    Baake, M.; Joseph, D.; Kramer, P.; Schlottmann, M.

    1990-10-01

    It is shown that root lattices and their reciprocals might serve as the right pool for the construction of quasicrystalline structure models. All noncrystallographic symmetries observed so far are covered in minimal embedding with maximal symmetry.

  20. SPIN ON THE LATTICE.

    Energy Technology Data Exchange (ETDEWEB)

    ORGINOS,K.

    2003-01-07

    I review the current status of hadronic structure computations on the lattice. I describe the basic lattice techniques and difficulties and present some of the latest lattice results; in particular, recent results of the RBC group using domain wall fermions are also discussed. In conclusion, lattice computations can play an important role in understanding the hadronic structure and the fundamental properties of Quantum Chromodynamics (QCD). Although some difficulties still exist, several significant steps have been made. Advances in computer technology are expected to play a significant role in pushing these computations closer to the chiral limit and in including dynamical fermions. RBC has already begun preliminary dynamical domain wall fermion computations [49] which we expect to be pushed forward with the arrival of QCDOC. In the near future, we also expect to complete the non-perturbative renormalization of the relevant derivative operators in quenched QCD.

  1. Superalloy Lattice Block Structures

    Science.gov (United States)

    Nathal, M. V.; Whittenberger, J. D.; Hebsur, M. G.; Kantzos, P. T.; Krause, D. L.

    2004-01-01

    Initial investigations of investment cast superalloy lattice block suggest that this technology will yield a low cost approach to utilize the high temperature strength and environmental resistance of superalloys in lightweight, damage tolerant structural configurations. Work to date has demonstrated that relatively large superalloy lattice block panels can be successfully investment cast from both IN-718 and Mar-M247. These castings exhibited mechanical properties consistent with the strength of the same superalloys measured from more conventional castings. The lattice block structure also accommodates significant deformation without failure, and is defect tolerant in fatigue. The potential of lattice block structures opens new opportunities for the use of superalloys in future generations of aircraft applications that demand strength and environmental resistance at elevated temperatures along with low weight.

  2. Benchmarking and Self-Assessment in the Wine Industry

    Energy Technology Data Exchange (ETDEWEB)

    Galitsky, Christina; Radspieler, Anthony; Worrell, Ernst; Healy,Patrick; Zechiel, Susanne

    2005-12-01

    Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities provides an important barrier to improving efficiency. Benchmarking programs in the U.S. and abroad have been shown to improve knowledge of the energy performance of industrial facilities and buildings and to fuel energy management practices. Benchmarking provides a fair way to compare the energy intensity of plants, while accounting for structural differences (e.g., the mix of products produced, climate conditions) between different facilities. In California, the winemaking industry is not only one of the economic pillars of the economy; it is also a large energy consumer, with a considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed the first benchmarking tool for the California wine industry, called "BEST (Benchmarking and Energy and water Savings Tool) Winery". BEST Winery enables a winery to compare its energy efficiency to a best practice reference winery. Besides overall performance, the tool enables the user to evaluate the impact of implementing efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs and savings. The tool will raise awareness of current energy intensities and offer an efficient way to evaluate the impact of future efficiency measures.

  4. Comparative Analysis of CTF and Trace Thermal-Hydraulic Codes Using OECD/NRC PSBT Benchmark Void Distribution Database

    OpenAIRE

    2013-01-01

    The international OECD/NRC PSBT benchmark has been established to provide a test bed for assessing the capabilities of thermal-hydraulic codes and to encourage advancement in the analysis of fluid flow in rod bundles. The benchmark was based on one of the most valuable databases identified for the thermal-hydraulics modeling developed by NUPEC, Japan. The database includes void fraction and departure from nucleate boiling measurements in a representative PWR fuel assembly. On behalf of the be...

  5. Meshless lattice Boltzmann method for the simulation of fluid flows.

    Science.gov (United States)

    Musavi, S Hossein; Ashrafizaadeh, Mahmud

    2015-02-01

    A meshless lattice Boltzmann numerical method is proposed. The collision and streaming operators of the lattice Boltzmann equation are separated, as in the usual lattice Boltzmann models. While the purely local collision equation remains the same, we rewrite the streaming equation as a pure advection equation and discretize the resulting partial differential equation using the Lax-Wendroff scheme in time and the meshless local Petrov-Galerkin scheme based on augmented radial basis functions in space. The meshless feature of the proposed method makes it a more powerful lattice Boltzmann solver, especially for cases in which using meshes introduces significant numerical errors into the solution, or when improving the mesh quality is a complex and time-consuming process. Three well-known benchmark fluid flow problems, namely the plane Couette flow, the circular Couette flow, and the impulsively started cylinder flow, are simulated for the validation of the proposed method. Excellent agreement with analytical solutions or with previous experimental and numerical results in the literature is observed in all the simulations. Although the computational resources required for the meshless method per node are higher compared to that of the standard lattice Boltzmann method, it is shown that for cases in which the total number of nodes is significantly reduced, the present method actually outperforms the standard lattice Boltzmann method.
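
    A minimal sketch, assuming the standard D2Q9 BGK form, of the purely local collision step that the meshless method keeps unchanged; the meshless Lax-Wendroff/Petrov-Galerkin advection (streaming) step itself is not reproduced here.

    ```python
    import numpy as np

    # D2Q9 lattice: discrete velocities and weights (lattice units, c_s^2 = 1/3)
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)

    def bgk_collision(f, tau):
        """Purely local BGK collision step shared with the standard LBM.

        f has shape (9, n_nodes); the streaming step that the meshless scheme
        replaces by an advection solve on scattered nodes is not shown.
        """
        rho = f.sum(axis=0)                               # density per node
        u = (c.T @ f) / rho                               # velocity, shape (2, n)
        cu = c @ u                                        # (9, n) velocity projections
        usq = (u**2).sum(axis=0)
        feq = w[:, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
        return f - (f - feq) / tau

    # Example: relax a small set of nodes initialised at rest with unit density
    f0 = np.tile(w[:, None], (1, 4))
    print(bgk_collision(f0, tau=0.8).shape)
    ```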

  6. Verification of the code DYN3D/R with the help of international benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Grundmann, U.; Rohde, U.

    1997-10-01

    Different benchmarks for reactors with quadratic fuel assemblies were calculated with the code DYN3D/R. In this report comparisons with the results of the reference solutions are carried out. The results of DYN3D/R and the reference calculation for the eigenvalue k{sub eff} and the power distribution are shown for the steady-state 3-dimensional IAEA-Benchmark. The results of NEACRP-Benchmarks on control rod ejections in a standard PWR were compared with the reference solutions published by the NEA Data Bank. For assessing the accuracy of DYN3D/R results in comparison to other codes the deviations to the reference solutions are considered. Detailed comparisons with the published reference solutions of the NEA-NSC Benchmarks on uncontrolled withdrawal of control rods are made. The influence of the axial nodalization is also investigated. All in all, a good agreement of the DYN3D/R results with the reference solutions can be seen for the considered benchmark problems. (orig.)
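
    For eigenvalue comparisons of this kind, code-to-reference deviations are commonly quoted as a reactivity difference in pcm; a small sketch of that conversion follows, with purely illustrative numbers rather than benchmark results.

    ```python
    def reactivity_difference_pcm(k_calc, k_ref):
        """Delta-rho = (1/k_ref - 1/k_calc) * 1e5, expressed in pcm."""
        return (1.0 / k_ref - 1.0 / k_calc) * 1.0e5

    # Hypothetical example: a DYN3D-type result vs. a published reference solution
    print(f"{reactivity_difference_pcm(k_calc=1.02915, k_ref=1.02950):+.1f} pcm")
    ```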

  7. Vector Lattice Vortex Solitons

    Institute of Scientific and Technical Information of China (English)

    WANG Jian-Dong; YE Fang-Wei; DONG Liang-Wei; LI Yong-Ping

    2005-01-01

    Two-dimensional vector vortex solitons in harmonic optical lattices are investigated. The stability properties of such solitons are closely connected to the lattice depth V0. For small V0, vector vortex solitons with total zero angular momentum are more stable than those with total nonzero angular momentum, while for large V0 the situation is reversed. If V0 is large enough, both types of such solitons are stable.

  8. Technicolor on the Lattice

    CERN Document Server

    Pica, C; Lucini, B; Patella, A; Rago, A

    2009-01-01

    Technicolor theories provide an elegant mechanism for dynamical electroweak symmetry breaking. We will discuss the use of lattice simulations to study the strongly-interacting dynamics of some of the candidate theories, with matter fields in representations other than the fundamental. To be viable candidates for phenomenology, such theories need to be different from a scaled-up version of QCD, which was ruled out by LEP precision measurements, and they represent a challenge for modern lattice computations.

  9. Automated Lattice Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  10. Permutohedral Lattice CNNs

    OpenAIRE

    Kiefel, Martin; Jampani, Varun; Gehler, Peter V.

    2014-01-01

    This paper presents a convolutional layer that is able to process sparse input features. As an example, for image recognition problems this allows an efficient filtering of signals that do not lie on a dense grid (like pixel position), but of more general features (such as color values). The presented algorithm makes use of the permutohedral lattice data structure. The permutohedral lattice was introduced to efficiently implement a bilateral filter, a commonly used image processing operation....

  11. [Benchmarking in health care: conclusions and recommendations].

    Science.gov (United States)

    Geraedts, Max; Selbmann, Hans-Konrad

    2011-01-01

    The German Health Ministry funded 10 demonstration projects and accompanying research of benchmarking in health care. The accompanying research work aimed to infer generalisable findings and recommendations. We performed a meta-evaluation of the demonstration projects and analysed national and international approaches to benchmarking in health care. It was found that the typical benchmarking sequence is hardly ever realised. Most projects lack a detailed analysis of structures and processes of the best performers as a starting point for the process of learning from and adopting best practice. To tap the full potential of benchmarking in health care, participation in voluntary benchmarking projects should be promoted that have been demonstrated to follow all the typical steps of a benchmarking process.

  12. Solitons in spiraling Vogel lattices

    CERN Document Server

    Kartashov, Yaroslav V; Torner, Lluis

    2012-01-01

    We address light propagation in Vogel optical lattices and show that such lattices support a variety of stable soliton solutions in both self-focusing and self-defocusing media, whose propagation constants belong to domains resembling gaps in the spectrum of a truly periodic lattice. The azimuthally-rich structure of Vogel lattices allows generation of spiraling soliton motion.

  13. VENUS-2 MOX Core Benchmark: Results of ORNL Calculations Using HELIOS-1.4 - Revised Report

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, RJ

    2001-06-01

    The Task Force on Reactor-Based Plutonium Disposition (TFRPD) was formed by the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) to study reactor physics, fuel performance, and fuel cycle issues related to the disposition of weapons-grade (WG) plutonium as mixed-oxide (MOX) reactor fuel. To advance the goals of the TFRPD, 10 countries and 12 institutions participated in a major TFRPD activity: a blind benchmark study to compare code calculations to experimental data for the VENUS-2 MOX core at SCK-CEN in Mol, Belgium. At Oak Ridge National Laboratory, the HELIOS-1.4 code system was used to perform the comprehensive study of pin-cell and MOX core calculations for the VENUS-2 MOX core benchmark study.

  14. An Effective Approach for Benchmarking Implementation

    OpenAIRE

    B. M. Deros; Tan, J.; M.N.A. Rahman; N. A.Q.M. Daud

    2011-01-01

    Problem statement: The purpose of this study is to present a benchmarking guideline, conceptual framework and computerized mini program to assists companies achieve better performance in terms of quality, cost, delivery, supply chain and eventually increase their competitiveness in the market. The study begins with literature review on benchmarking definition, barriers and advantages from the implementation and the study of benchmarking framework. Approach: Thirty res...

  15. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  16. Benchmarking i eksternt regnskab og revision

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Kiertzner, Lars

    2001-01-01

    continuously in a benchmarking process. This chapter will broadly examine where the benchmarking concept can, with some justification, be linked to external financial reporting and auditing. Section 7.1 deals with the external annual report, while Section 7.2 takes up the area of auditing. The final section of the chapter summarises... the considerations on benchmarking in connection with both areas....

  17. Using benchmarking for the primary allocation of EU allowances. An application to the German power sector

    Energy Technology Data Exchange (ETDEWEB)

    Schleich, J.; Cremer, C.

    2007-07-01

    Basing allocation of allowances for existing installations under the EU Emissions Trading Scheme on specific emission values (benchmarks) rather than on historic emissions may have several advantages. Benchmarking may recognize early action, provide higher incentives for replacing old installations and result in fewer distortions in case of updating, facilitate EU-wide harmonization of allocation rules or allow for simplified and more efficient closure rules. Applying an optimization model for the German power sector, we analyze the distributional effects of various allocation regimes across and within different generation technologies. Results illustrate that regimes with a single uniform benchmark for all fuels or with a single benchmark for coal- and lignite-fired plants imply substantial distributional effects. In particular, lignite- and old coal-fired plants would be made worse off. Under a regime with fuel-specific benchmarks for gas, coal, and lignite 50 % of the gas-fired plants and 4 % of the lignite and coal-fired plants would face an allowance deficit of at least 10 %, while primarily modern lignite-fired plants would benefit. Capping the surplus and shortage of allowances would further moderate the distributional effects, but may tarnish incentives for efficiency improvements and recognition of early action. (orig.)
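
    How a fuel-specific benchmark allocation compares with historical emissions for a single plant can be sketched as follows; the benchmark values and plant data are placeholders, not figures from the study.

    ```python
    # Hypothetical fuel-specific benchmarks (t CO2 per MWh of electricity)
    benchmark = {"gas": 0.365, "coal": 0.750, "lignite": 0.950}

    def allocation(fuel, expected_generation_mwh):
        """Free allocation = fuel-specific benchmark x expected generation."""
        return benchmark[fuel] * expected_generation_mwh

    # Placeholder plant: coal-fired, 3 TWh/yr, historical emissions 2.6 Mt CO2/yr
    allocated = allocation("coal", 3.0e6)
    historical = 2.6e6
    surplus = allocated - historical
    print(f"allocated {allocated/1e6:.2f} Mt, surplus {surplus/1e6:+.2f} Mt "
          f"({100*surplus/historical:+.1f} % of historical emissions)")
    ```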

  18. Developing Benchmarks for Solar Radio Bursts

    Science.gov (United States)

    Biesecker, D. A.; White, S. M.; Gopalswamy, N.; Black, C.; Domm, P.; Love, J. J.; Pierson, J.

    2016-12-01

    Solar radio bursts can interfere with radar, communication, and tracking signals. In severe cases, radio bursts can inhibit the successful use of radio communications and disrupt a wide range of systems that are reliant on Position, Navigation, and Timing services on timescales ranging from minutes to hours across wide areas on the dayside of Earth. The White House's Space Weather Action Plan has asked for solar radio burst intensity benchmarks for an event occurrence frequency of 1 in 100 years and also a theoretical maximum intensity benchmark. The solar radio benchmark team was also asked to define the wavelength/frequency bands of interest. The benchmark team developed preliminary (phase 1) benchmarks for the VHF (30-300 MHz), UHF (300-3000 MHz), GPS (1176-1602 MHz), F10.7 (2800 MHz), and Microwave (4000-20000 MHz) bands. The preliminary benchmarks were derived based on previously published work. Limitations in the published work will be addressed in phase 2 of the benchmark process. In addition, deriving theoretical maxima, where doing so is even possible, requires additional work in order to meet the Action Plan objectives. In this presentation, we will present the phase 1 benchmarks and the basis used to derive them. We will also present the work that needs to be done in order to complete the final, or phase 2, benchmarks.
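
    One simple way to turn an event list into an occurrence-frequency benchmark is to fit the annual exceedance rate versus intensity on log-log axes and extrapolate to a rate of 0.01 per year; the sketch below uses synthetic burst intensities and is an illustration of the idea, not the benchmark team's method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    years_observed = 40.0
    # Synthetic burst peak intensities in solar flux units (heavy-tailed toy data)
    intensities = 1e3 * rng.pareto(1.5, size=500)

    # Empirical annual exceedance rate at a grid of thresholds
    thresholds = np.logspace(3, 5, 20)
    rates = np.array([(intensities > t).sum() / years_observed for t in thresholds])

    # Straight-line fit in log-log space: log10(rate) = a*log10(threshold) + b
    mask = rates > 0
    a, b = np.polyfit(np.log10(thresholds[mask]), np.log10(rates[mask]), 1)

    # Intensity whose expected exceedance rate is 1 per 100 years (0.01/yr)
    benchmark_intensity = 10 ** ((np.log10(0.01) - b) / a)
    print(f"1-in-100-year benchmark intensity ~ {benchmark_intensity:.3g} sfu")
    ```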

  19. Benchmarking for controllere: Metoder, teknikker og muligheder

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Sandalgaard, Niels; Dietrichson, Lars

    2008-01-01

    The article focuses sharply on the concept of benchmarking by presenting and discussing its different facets. Four different applications of benchmarking are described in order to show the breadth of the concept and the importance of clarifying the purpose of a benchmarking project... before getting started. The difference between results benchmarking and process benchmarking is treated, after which the use of internal and external benchmarking, respectively, is discussed. Finally, the use of benchmarking in budgeting and budget follow-up is introduced....

  20. Establishing benchmarks and metrics for utilization management.

    Science.gov (United States)

    Melanson, Stacy E F

    2014-01-01

    The changing environment of healthcare reimbursement is rapidly leading to a renewed appreciation of the importance of utilization management in the clinical laboratory. The process of benchmarking of laboratory operations is well established for comparing organizational performance to other hospitals (peers) and for trending data over time through internal benchmarks. However, there are relatively few resources available to assist organizations in benchmarking for laboratory utilization management. This article will review the topic of laboratory benchmarking with a focus on the available literature and services to assist in managing physician requests for laboratory testing. © 2013.

  1. Alternative formulation to incorporate forcing terms in a lattice Boltzmann scheme with central moments

    Science.gov (United States)

    De Rosis, Alessandro

    2017-02-01

    Within the framework of the central-moment-based lattice Boltzmann method, we propose a strategy to account for external forces in two and three dimensions. Its numerical properties are evaluated against consolidated benchmark problems, highlighting very high accuracy and optimal convergence. Moreover, our derivations are light and intelligible.
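
    For orientation, the generic discrete-velocity update with a forcing term, written here in the familiar raw-moment (BGK-style) form rather than the central-moment form developed in the paper, reads:

    ```latex
    f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\; t + \Delta t)
      = f_i(\mathbf{x}, t)
      + \Omega_i\big(f(\mathbf{x}, t)\big)
      + \Delta t \, F_i ,
    ```

    where \Omega_i is the collision operator (evaluated in central-moment space in the paper) and the discrete forcing term F_i is constructed so that its low-order velocity moments reproduce the external force density.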

  2. Benchmarking Implementations of Functional Languages with ``Pseudoknot'', a Float-Intensive Benchmark

    NARCIS (Netherlands)

    Hartel, P.H.; Feeley, M.; Alt, M.; Augustsson, L.

    1996-01-01

    Over 25 implementations of different functional languages are benchmarked using the same program, a floating-point intensive application taken from molecular biology. The principal aspects studied are compile time and execution time for the various implementations that were benchmarked. An important

  3. The Zoo, Benchmarks & You: How To Reach the Oregon State Benchmarks with Zoo Resources.

    Science.gov (United States)

    2002

    This document aligns Oregon state educational benchmarks and standards with Oregon Zoo resources. Benchmark areas examined include English, mathematics, science, social studies, and career and life roles. Brief descriptions of the programs offered by the zoo are presented. (SOE)

  5. Benchmarking Implementations of Functional Languages with "Pseudoknot", a float-intensive benchmark

    NARCIS (Netherlands)

    Hartel, Pieter H.; Feeley, M.; Alt, M.; Augustsson, L.

    Over 25 implementations of different functional languages are benchmarked using the same program, a floating-point intensive application taken from molecular biology. The principal aspects studied are compile time and execution time for the various implementations that were benchmarked. An important

  6. A Bijection between Lattice-Valued Filters and Lattice-Valued Congruences in Residuated Lattices

    Directory of Open Access Journals (Sweden)

    Wei Wei

    2013-01-01

    Full Text Available The aim of this paper is to study relations between lattice-valued filters and lattice-valued congruences in residuated lattices. We introduce a new definition of congruences which depends only on the meet ∧ and the residuum →. Then it is shown that each of these congruences is automatically a universal-algebra congruence. Also, lattice-valued filters and lattice-valued congruences are studied, and it is shown that there is a one-to-one correspondence between the set of all (lattice-valued) filters and the set of all (lattice-valued) congruences.
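
    In the crisp (non-lattice-valued) setting, the correspondence that the paper generalizes is the standard textbook bijection between filters and congruences of a residuated lattice, quoted here only as background: a filter F induces a congruence, and a congruence recovers a filter, via

    ```latex
    x \,\theta_F\, y \;\Longleftrightarrow\; (x \rightarrow y) \wedge (y \rightarrow x) \in F ,
    \qquad
    F_{\theta} = \{\, x \in L \mid x \,\theta\, 1 \,\} ,
    ```

    and the assignments F \mapsto \theta_F and \theta \mapsto F_\theta are mutually inverse.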

  7. Preliminary Benchmark Evaluation of Japan’s High Temperature Engineering Test Reactor

    Energy Technology Data Exchange (ETDEWEB)

    John Darrell Bess

    2009-05-01

    A benchmark model of the initial fully-loaded start-up core critical of Japan’s High Temperature Engineering Test Reactor (HTTR) was developed to provide data in support of ongoing validation efforts of the Very High Temperature Reactor Program using publicly available resources. The HTTR is a 30 MWt test reactor utilizing graphite moderation, helium coolant, and prismatic TRISO fuel. The benchmark was modeled using MCNP5 with various neutron cross-section libraries. An uncertainty evaluation was performed by perturbing the benchmark model and comparing the resultant eigenvalues. The calculated eigenvalues are approximately 2-3% greater than expected with an uncertainty of ±0.70%. The primary sources of uncertainty are the impurities in the core and reflector graphite. The release of additional HTTR data could effectively reduce the benchmark model uncertainties and bias. Sensitivity of the results to the graphite impurity content might imply that further evaluation of the graphite content could significantly improve calculated results. Proper characterization of graphite for future Next Generation Nuclear Power reactor designs will improve computational modeling capabilities. Current benchmarking activities include evaluation of the annular HTTR cores and assessment of the remaining start-up core physics experiments, including reactivity effects, reactivity coefficient, and reaction-rate distribution measurements. Long term benchmarking goals might include analyses of the hot zero-power critical, rise-to-power tests, and other irradiation, safety, and technical evaluations performed with the HTTR.

  8. Benchmarking: A tool to enhance performance

    Energy Technology Data Exchange (ETDEWEB)

    Munro, J.F. [Oak Ridge National Lab., TN (United States); Kristal, J. [USDOE Assistant Secretary for Environmental Management, Washington, DC (United States); Thompson, G.; Johnson, T. [Los Alamos National Lab., NM (United States)

    1996-12-31

    The Office of Environmental Management is bringing Headquarters and the Field together to implement process improvements throughout the Complex through a systematic process of organizational learning called benchmarking. Simply stated, benchmarking is a process of continuously comparing and measuring practices, processes, or methodologies with those of other private and public organizations. The EM benchmarking program, which began as the result of a recommendation from Xerox Corporation, is building trust and removing barriers to performance enhancement across the DOE organization. The EM benchmarking program is designed to be field-centered with Headquarters providing facilitatory and integrative functions on an "as needed" basis. One of the main goals of the program is to assist Field Offices and their associated M&O/M&I contractors develop the capabilities to do benchmarking for themselves. In this regard, a central precept is that in order to realize tangible performance benefits, program managers and staff -- the ones closest to the work -- must take ownership of the studies. This avoids the "check the box" mentality associated with some third party studies. This workshop will provide participants with a basic level of understanding why the EM benchmarking team was developed and the nature and scope of its mission. Participants will also begin to understand the types of study levels and the particular methodology the EM benchmarking team is using to conduct studies. The EM benchmarking team will also encourage discussion on ways that DOE (both Headquarters and the Field) can team with its M&O/M&I contractors to conduct additional benchmarking studies. This "introduction to benchmarking" is intended to create a desire to know more and a greater appreciation of how benchmarking processes could be creatively employed to enhance performance.

  9. Benchmarking ICRF simulations for ITER

    Energy Technology Data Exchange (ETDEWEB)

    R. V. Budny, L. Berry, R. Bilato, P. Bonoli, M. Brambilla, R.J. Dumont, A. Fukuyama, R. Harvey, E.F. Jaeger, E. Lerche, C.K. Phillips, V. Vdovin, J. Wright, and members of the ITPA-IOS

    2010-09-28

    Benchmarking of full-wave solvers for ICRF simulations is performed using plasma profiles and equilibria obtained from integrated self-consistent modeling predictions of four ITER plasmas. One is for a high performance baseline (5.3 T, 15 MA) DT H-mode plasma. The others are for half-field, half-current plasmas of interest for the pre-activation phase with bulk plasma ion species being either hydrogen or He4. The predicted profiles are used by seven groups to predict the ICRF electromagnetic fields and heating profiles. Approximate agreement is achieved for the predicted heating power partitions for the DT and He4 cases. Profiles of the heating powers and electromagnetic fields are compared.

  10. Benchmarking Asteroid-Deflection Experiment

    Science.gov (United States)

    Remington, Tane; Bruck Syal, Megan; Owen, John Michael; Miller, Paul L.

    2016-10-01

    An asteroid impacting Earth could have devastating consequences. In preparation to deflect or disrupt one before it reaches Earth, it is imperative to have modeling capabilities that adequately simulate the deflection actions. Code validation is key to ensuring full confidence in simulation results used in an asteroid-mitigation plan. We are benchmarking well-known impact experiments using Spheral, an adaptive smoothed-particle hydrodynamics code, to validate our modeling of asteroid deflection. We describe our simulation results, compare them with experimental data, and discuss what we have learned from our work. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-695540

  11. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  12. COG validation: SINBAD Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Lent, E M; Sale, K E; Buck, R M; Descalle, M

    2004-02-23

    We validated COG, a 3D Monte Carlo radiation transport code, against experimental data and MCNP4C simulations from the Shielding Integral Benchmark Archive Database (SINBAD) compiled by RSICC. We modeled three experiments: the Osaka Nickel and Aluminum sphere experiments conducted at the OKTAVIAN facility, and the liquid oxygen experiment conducted at the FNS facility. COG results are in good agreement with experimental data and generally within a few % of MCNP results. There are several possible sources of discrepancy between MCNP and COG results: (1) the cross-section database versions are different, MCNP uses ENDF/B-VI 1.1 while COG uses ENDF/B-VI R7, (2) the code implementations are different, and (3) the models may differ slightly. We also limited the use of variance reduction methods when running the COG version of the problems.

  13. General benchmarks for quantum repeaters

    CERN Document Server

    Pirandola, Stefano

    2015-01-01

    Using a technique based on quantum teleportation, we simplify the most general adaptive protocols for key distribution, entanglement distillation and quantum communication over a wide class of quantum channels in arbitrary dimension. Thanks to this method, we bound the ultimate rates for secret key generation and quantum communication through single-mode Gaussian channels and several discrete-variable channels. In particular, we derive exact formulas for the two-way assisted capacities of the bosonic quantum-limited amplifier and the dephasing channel in arbitrary dimension, as well as the secret key capacity of the qubit erasure channel. Our results establish the limits of quantum communication with arbitrary systems and set the most general and precise benchmarks for testing quantum repeaters in both discrete- and continuous-variable settings.

  14. Measuring on Lattices

    Science.gov (United States)

    Knuth, Kevin H.

    2009-12-01

    Previous derivations of the sum and product rules of probability theory relied on the algebraic properties of Boolean logic. Here they are derived within a more general framework based on lattice theory. The result is a new foundation of probability theory that encompasses and generalizes both the Cox and Kolmogorov formulations. In this picture probability is a bi-valuation defined on a lattice of statements that quantifies the degree to which one statement implies another. The sum rule is a constraint equation that ensures that valuations are assigned so as to not violate associativity of the lattice join and meet. The product rule is much more interesting in that there are actually two product rules: one is a constraint equation that arises from associativity of the direct products of lattices, and the other is a constraint equation derived from associativity of changes of context. The generality of this formalism enables one to derive the traditionally assumed condition of additivity in measure theory, as well as to introduce a general notion of product. To illustrate the generic utility of this novel lattice-theoretic foundation of measure, the sum and product rules are applied to number theory. Further application of these concepts to understanding the foundation of quantum mechanics is described in a joint paper in this proceedings.
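
    Schematically, and as a reader's summary rather than a quotation from the paper, the two rules take the form

    ```latex
    v(x \vee y) + v(x \wedge y) = v(x) + v(y)
    \qquad \text{(sum rule)}
    \\[4pt]
    w(x \wedge y \mid z) = w(x \mid y \wedge z)\, w(y \mid z)
    \qquad \text{(product rule, context form)}
    ```

    where v is a valuation on the lattice of statements and w is the bi-valuation quantifying the degree to which one statement implies another.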

  15. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report outlines techniques for extending benchmark generation products so that they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such benchmark data sets.

  16. 42 CFR 440.330 - Benchmark health benefits coverage.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 (2010-10-01). Section 440.330 ... SERVICES (CONTINUED), MEDICAL ASSISTANCE PROGRAMS, SERVICES: GENERAL PROVISIONS, Benchmark Benefit and Benchmark-Equivalent Coverage, § 440.330 Benchmark health benefits coverage. Benchmark coverage is...

  17. General point dipole theory for periodic metasurfaces: magnetoelectric scattering lattices coupled to planar photonic structures

    CERN Document Server

    Chen, Yuntian

    2015-01-01

    We study semi-analytically the light emission and absorption properties of arbitrary stratified photonic structures with embedded two-dimensional magnetoelectric point scattering lattices, as used in recent plasmon-enhanced LEDs and solar cells. By employing dyadic Green's function for the layered structure in combination with Ewald lattice summation to deal with the particle lattice, we develop an efficient method to study the coupling between planar 2D scattering lattices of plasmonic, or metamaterial point particles, coupled to layered structures. Using the "array scanning method" we deal with localized sources. Firstly, we apply our method to light emission enhancement of dipole emitters in slab waveguides, mediated by plasmonic lattices. We benchmark the array scanning method against a reciprocity-based approach to find that the calculated radiative rate enhancement in k-space below the light cone shows excellent agreement. Secondly, we apply our method to study absorption-enhancement in thin-film solar ...

  18. Extended particle swarm optimisation method for folding protein on triangular lattice.

    Science.gov (United States)

    Guo, Yuzhen; Wu, Zikai; Wang, Ying; Wang, Yong

    2016-02-01

    In this study, the authors studied the protein structure prediction problem using the two-dimensional hydrophobic-polar model on a triangular lattice. In particular, the non-compact conformation was modelled by folding the amino acid sequence into a relatively larger triangular lattice, which is more biologically realistic and significant than the compact conformation. The protein structure prediction problem was then abstracted as matching amino acids to lattice points. Mathematically, the problem was formulated as an integer program, transforming the biological problem into an optimisation problem. To solve this problem, the classical particle swarm optimisation algorithm was extended with a single-point adjustment strategy. Compared with the square lattice, conformations on the triangular lattice are more flexible in several benchmark examples. The authors further compared their algorithm with a hybrid of hill climbing and genetic algorithms. The results showed that their method was more effective in finding solutions with lower energy and less running time.
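
    A small sketch of the energy function being optimised: on a triangular lattice, represented here in axial coordinates with six neighbours per site, the HP energy is minus the number of hydrophobic contacts between residues that are lattice neighbours but not consecutive in the chain. The example sequence and conformation are made up for illustration.

    ```python
    # Six neighbour offsets of a triangular lattice in axial coordinates
    NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

    def hp_energy(sequence, coords):
        """HP energy = -(number of non-consecutive H-H lattice contacts)."""
        assert len(sequence) == len(coords) and len(set(coords)) == len(coords)
        contacts = 0
        for i in range(len(sequence)):
            for j in range(i + 2, len(sequence)):        # skip chain neighbours
                if sequence[i] == sequence[j] == "H":
                    dq = coords[j][0] - coords[i][0]
                    dr = coords[j][1] - coords[i][1]
                    if (dq, dr) in NEIGHBOURS:
                        contacts += 1
        return -contacts

    # Made-up 6-residue self-avoiding walk with consecutive sites adjacent
    sequence = "HPHPHH"
    coords = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0)]
    print(hp_energy(sequence, coords))
    ```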

  19. An Effective Approach for Benchmarking Implementation

    Directory of Open Access Journals (Sweden)

    B. M. Deros

    2011-01-01

    Full Text Available Problem statement: The purpose of this study is to present a benchmarking guideline, conceptual framework and computerized mini program to assist companies in achieving better performance in terms of quality, cost, delivery and supply chain, and eventually to increase their competitiveness in the market. The study begins with a literature review on benchmarking definitions, barriers to and advantages of implementation, and benchmarking frameworks. Approach: Thirty respondents were involved in the case study. They comprise industrial practitioners who assessed the usability and practicability of the guideline, conceptual framework and computerized mini program. Results: A guideline and template were proposed to simplify the adoption of benchmarking techniques. A conceptual framework was proposed by integrating Deming’s PDCA and the Six Sigma DMAIC theory. It provides a step-by-step method to simplify implementation and to optimize benchmarking results. A computerized mini program was suggested to assist users in adopting the technique as part of an improvement project. In the assessment test, the respondents found that the implementation method gives a company a starting point for initiating benchmarking and guides it toward the desired goal set in a benchmarking project. Conclusion: The results obtained and discussed in this study can be applied to implementing benchmarking in a more systematic way and to ensuring its success.

  20. Synergetic effect of benchmarking competitive advantages

    Directory of Open Access Journals (Sweden)

    N.P. Tkachova

    2011-12-01

    Full Text Available The essence of synergistic competitive benchmarking is analyzed. A classification of synergy types is developed. The sources of synergy in benchmarking of competitive advantages are identified, and a methodological framework for defining synergy in the formation of competitive advantage is proposed.

  1. Synergetic effect of benchmarking competitive advantages

    OpenAIRE

    N.P. Tkachova; P.G. Pererva

    2011-01-01

    The essence of synergistic competitive benchmarking is analyzed. A classification of synergy types is developed. The sources of synergy in benchmarking of competitive advantages are identified, and a methodological framework for defining synergy in the formation of competitive advantage is proposed.

  2. Benchmarking set for domestic smart grid management

    NARCIS (Netherlands)

    Bosman, M.G.C.; Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria

    2010-01-01

    In this paper we propose a benchmark for domestic smart grid management. It consists of an in-depth description of a domestic smart grid, in which local energy consumers, producers and buffers can be controlled. First, from this description a general benchmark framework is derived, which can be used

  3. Machines are benchmarked by code, not algorithms

    NARCIS (Netherlands)

    Poss, R.

    2013-01-01

    This article highlights how small modifications to either the source code of a benchmark program or the compilation options may impact its behavior on a specific machine. It argues that for evaluating machines, benchmark providers and users be careful to ensure reproducibility of results based on th

  4. Benchmark analysis of railway networks and undertakings

    NARCIS (Netherlands)

    Hansen, I.A.; Wiggenraad, P.B.L.; Wolff, J.W.

    2013-01-01

    Benchmark analysis of railway networks and companies has been stimulated by the European policy of deregulation of transport markets, the opening of national railway networks and markets to new entrants and separation of infrastructure and train operation. Recent international railway benchmarking s

  5. Benchmark Assessment for Improved Learning. AACC Report

    Science.gov (United States)

    Herman, Joan L.; Osmundson, Ellen; Dietel, Ronald

    2010-01-01

    This report describes the purposes of benchmark assessments and provides recommendations for selecting and using benchmark assessments--addressing validity, alignment, reliability, fairness and bias and accessibility, instructional sensitivity, utility, and reporting issues. We also present recommendations on building capacity to support schools'…

  6. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price

  7. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    2007-01-01

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price elasticit

  8. Benchmarking Learning and Teaching: Developing a Method

    Science.gov (United States)

    Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah

    2006-01-01

    Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…

  9. Lattice Boltzmann Stokesian dynamics.

    Science.gov (United States)

    Ding, E J

    2015-11-01

    Lattice Boltzmann Stokesian dynamics (LBSD) is presented for simulation of particle suspension in Stokes flows. This method is developed from Stokesian dynamics (SD) with resistance and mobility matrices calculated using the time-independent lattice Boltzmann algorithm (TILBA). TILBA is distinguished from the traditional lattice Boltzmann method (LBM) in that a background matrix is generated prior to the calculation. The background matrix, once generated, can be reused for calculations for different scenarios, thus the computational cost for each such subsequent calculation is significantly reduced. The LBSD inherits the merits of the SD where both near- and far-field interactions are considered. It also inherits the merits of the LBM that the computational cost is almost independent of the particle shape.
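
    The key computational saving claimed for TILBA is that the background matrix is generated once and then reused across scenarios. The sketch below (Python/SciPy, with a random stand-in matrix) illustrates only that general precompute-and-reuse pattern; it is not the TILBA or LBSD algorithm itself.

        import numpy as np
        from scipy.linalg import lu_factor, lu_solve

        rng = np.random.default_rng(0)
        # Stand-in "background" operator: well-conditioned random matrix.
        A = rng.normal(size=(200, 200)) + 200 * np.eye(200)

        lu, piv = lu_factor(A)            # expensive step, done once

        for scenario in range(5):         # each scenario reuses the factorization
            b = rng.normal(size=200)      # stand-in right-hand side for this scenario
            x = lu_solve((lu, piv), b)    # cheap solve per scenario
            print(scenario, np.linalg.norm(A @ x - b))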

  10. Lattice gauge theories

    Science.gov (United States)

    Weisz, Peter; Majumdar, Pushan

    2012-03-01

    Lattice gauge theory is a formulation of quantum field theory with gauge symmetries on a space-time lattice. This formulation is particularly suitable for describing hadronic phenomena. In this article we review the present status of lattice QCD. We outline some of the computational methods, discuss some phenomenological applications and a variety of non-perturbative topics. The list of references is severely incomplete; the ones we have included are text books or reviews and a few subjectively selected papers. Kronfeld and Quigg (2010) supply a reasonably comprehensive set of QCD references. We apologize for the fact that we have not covered many important topics such as QCD at finite density and heavy quark effective theory adequately, and mention some of them only in the last section "In Brief". These topics should be considered in further Scholarpedia articles.

  11. Improved Lattice Radial Quantization

    CERN Document Server

    Brower, Richard C; Fleming, George T

    2014-01-01

    Lattice radial quantization was proposed in a recent paper by Brower, Fleming and Neuberger [1] as a nonperturbative method especially suited to numerically solve Euclidean conformal field theories. The lessons learned from the lattice radial quantization of the 3D Ising model on a longitudinal cylinder with 2D Icosahedral cross-section suggested the need for an improved discretization. We consider here the use of the Finite Element Methods (FEM) to discretize the universally-equivalent $\phi^4$ Lagrangian on $\mathbb R \times \mathbb S^2$. It is argued that this lattice regularization will approach the exact conformal theory at the Wilson-Fisher fixed point in the continuum. Numerical tests are underway to support this conjecture.

  12. Graphene antidot lattice waveguides

    DEFF Research Database (Denmark)

    Pedersen, Jesper Goor; Gunst, Tue; Markussen, Troels

    2012-01-01

    We introduce graphene antidot lattice waveguides: nanostructured graphene where a region of pristine graphene is sandwiched between regions of graphene antidot lattices. The band gaps in the surrounding antidot lattices enable localized states to emerge in the central waveguide region. We model the waveguides via a position-dependent mass term in the Dirac approximation of graphene and arrive at analytical results for the dispersion relation and spinor eigenstates of the localized waveguide modes. To include atomistic details we also use a tight-binding model, which is in excellent agreement with the analytical results. The waveguides resemble graphene nanoribbons, but without the particular properties of ribbons that emerge due to the details of the edge. We show that electrons can be guided through kinks without additional resistance and that transport through the waveguides is robust against...

  13. Digital lattice gauge theories

    CERN Document Server

    Zohar, Erez; Reznik, Benni; Cirac, J Ignacio

    2016-01-01

    We propose a general scheme for a digital construction of lattice gauge theories with dynamical fermions. In this method, the four-body interactions arising in models with $2+1$ dimensions and higher are obtained stroboscopically, through a sequence of two-body interactions with ancillary degrees of freedom. This yields stronger interactions than the ones obtained through perturbative methods, as typically done in previous proposals, and removes an important bottleneck in the road towards experimental realizations. The scheme applies to generic gauge theories with Lie or finite symmetry groups, both Abelian and non-Abelian. As a concrete example, we present the construction of a digital quantum simulator for a $\mathbb{Z}_{3}$ lattice gauge theory with dynamical fermionic matter in $2+1$ dimensions, using ultracold atoms in optical lattices, involving three atomic species, representing the matter, gauge and auxiliary degrees of freedom, that are separated in three different layers. By moving the ancilla atoms...

  14. Optical Lattice Clocks

    Science.gov (United States)

    Oates, Chris

    2012-06-01

    Since they were first proposed in 2003 [1], optical lattice clocks have become one of the leading technologies for the next generation of atomic clocks, which will be used for advanced timing applications and in tests of fundamental physics [2]. These clocks are based on stabilized lasers whose frequency is ultimately referenced to an ultra-narrow neutral atom transition (natural linewidths at the millihertz level). The atoms are confined in an optical lattice whose wavelength is tuned to a ``magic'' value so as to yield a vanishing net AC Stark shift for the clock transition. As a result lattice clocks have demonstrated the capability of generating high stability clock signals with small absolute uncertainties (~1 part in 10^16). In this presentation I will first give an overview of the field, which now includes three different atomic species. I will then use experiments with Yb performed in our laboratory to illustrate the key features of a lattice clock. Our research has included the development of state-of-the-art optical cavities enabling ultra-high-resolution optical spectroscopy (1 Hz linewidth). Together with the large atom number in the optical lattice, we are able to achieve very low clock instability (< 0.3 Hz in 1 s) [3]. Furthermore, I will show results from some of our recent investigations of key shifts for the Yb lattice clock, including high precision measurements of ultracold atom-atom interactions in the lattice and the dc Stark effect for the Yb clock transition (necessary for the evaluation of blackbody radiation shifts). [1] H. Katori, M. Takamoto, V. G. Pal'chikov, and V. D. Ovsiannikov, Phys. Rev. Lett. 91, 173005 (2003). [2] Andrei Derevianko and Hidetoshi Katori, Rev. Mod. Phys. 83, 331 (2011). [3] Y. Y. Jiang, A. D. Ludlow, N. D. Lemke, R. W. Fox, J. A. Sherman, L.-S. Ma, and C. W. Oates, Nature Photonics 5, 158 (2011).

  15. A Seafloor Benchmark for 3-dimensional Geodesy

    Science.gov (United States)

    Chadwell, C. D.; Webb, S. C.; Nooner, S. L.

    2014-12-01

    We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone

  16. Entropic Lattice Boltzmann Method for Moving and Deforming Geometries in Three Dimensions

    CERN Document Server

    Dorschner, B; Karlin, I V

    2016-01-01

    Entropic lattice Boltzmann methods have been developed to alleviate intrinsic stability issues of lattice Boltzmann models for under-resolved simulations. Their reliability in combination with moving objects was established for various laminar benchmark flows in two dimensions in our previous work Dorschner et al. [11] as well as for three dimensional one-way coupled simulations of engine-type geometries in Dorschner et al. [12] for flat moving walls. The present contribution aims to fully exploit the advantages of entropic lattice Boltzmann models in terms of stability and accuracy and extends the methodology to three-dimensional cases including two-way coupling between fluid and structure, turbulence and deformable meshes. To cover this wide range of applications, the classical benchmark of a sedimenting sphere is chosen first to validate the general two-way coupling algorithm. Increasing the complexity, we subsequently consider the simulation of a plunging SD7003 airfoil at a Reynolds number of Re = 40000 an...

  17. Exact Lattice Supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Catterall, Simon; Kaplan, David B.; Unsal, Mithat

    2009-03-31

    We provide an introduction to recent lattice formulations of supersymmetric theories which are invariant under one or more real supersymmetries at nonzero lattice spacing. These include the especially interesting case of N = 4 SYM in four dimensions. We discuss approaches based both on twisted supersymmetry and orbifold-deconstruction techniques and show their equivalence in the case of gauge theories. The presence of an exact supersymmetry reduces and in some cases eliminates the need for fine tuning to achieve a continuum limit invariant under the full supersymmetry of the target theory. We discuss open problems.

  18. Belief functions on lattices

    CERN Document Server

    Grabisch, Michel

    2008-01-01

    We extend the notion of belief function to the case where the underlying structure is no longer the Boolean lattice of subsets of some universal set, but any lattice, which we will endow with a minimal set of properties according to our needs. We show that all classical constructions and definitions (e.g., mass allocation, commonality function, plausibility functions, necessity measures with nested focal elements, possibility distributions, Dempster's rule of combination, decomposition w.r.t. simple support functions, etc.) remain valid in this general setting. Moreover, our proof of decomposition of belief functions into simple support functions is much simpler and more general than the original one by Shafer.

  19. TREAT Transient Analysis Benchmarking for the HEU Core

    Energy Technology Data Exchange (ETDEWEB)

    Kontogeorgakos, D. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Connaway, H. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Wright, A. E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-05-01

    This work was performed to support the feasibility study on the potential conversion of the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory from the use of high enriched uranium (HEU) fuel to the use of low enriched uranium (LEU) fuel. The analyses were performed by the GTRI Reactor Conversion staff at the Argonne National Laboratory (ANL). The objective of this study was to benchmark the transient calculations against temperature-limited transients performed in the final operating HEU TREAT core configuration. The MCNP code was used to evaluate steady-state neutronics behavior, and the point kinetics code TREKIN was used to determine core power and energy during transients. The first part of the benchmarking process was to calculate with MCNP all the neutronic parameters required by TREKIN to simulate the transients: the transient rod-bank worth, the prompt neutron generation lifetime, the temperature reactivity feedback as a function of total core energy, and the core-average temperature and peak temperature as functions of total core energy. The results of these calculations were compared against measurements or against reported values as documented in the available TREAT reports. The heating of the fuel was simulated as an adiabatic process. The reported values were extracted from ANL reports, intra-laboratory memos and experiment logsheets and in some cases it was not clear if the values were based on measurements, on calculations or a combination of both. Therefore, it was decided to use the term “reported” values when referring to such data. The methods and results from the HEU core transient analyses will be used for the potential LEU core configurations to predict the converted (LEU) core’s performance.

  20. Lattice of ℤ-module

    Directory of Open Access Journals (Sweden)

    Futa Yuichi

    2016-03-01

    Full Text Available In this article, we formalize the definition of lattice of ℤ-module and its properties in the Mizar system [5]. We formally prove that scalar products in lattices are bilinear forms over the field of real numbers ℝ. We also formalize the definitions of positive definite and integral lattices and their properties. Lattice of ℤ-module is necessary for lattice problems, the LLL (Lenstra, Lenstra and Lovász) base reduction algorithm [14], and cryptographic systems with lattices [15] and coding theory [9].

  1. An Algorithm on Generating Lattice Based on Layered Concept Lattice

    Directory of Open Access Journals (Sweden)

    Zhang Chang-sheng

    2013-08-01

    Full Text Available The concept lattice is an effective tool for data analysis and rule extraction; a bottleneck limiting its applications is how to generate the lattice efficiently. In this paper, an algorithm LCLG for generating a lattice in batch processing based on a layered concept lattice is developed. The lattice is generated downward, layer by layer, through the concept nodes and provisional nodes of the current layer; parent-child relationships between concept nodes are then established upward, layer by layer, and the Hasse diagram of inter-layer connections is generated. During the generation of the lattice nodes in each layer, pruning operations are performed dynamically according to relevant properties and unnecessary nodes are deleted, so that the generation speed is improved greatly. The experimental results demonstrate that the proposed algorithm has good performance.
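
    As a point of reference for what "generating a lattice" involves, the following small sketch computes the formal concepts of a toy context by closing the object intents under intersection. It is a naive baseline for illustration only, not the layered LCLG algorithm described above, and the example context is invented.

        def concepts(context):
            """context: dict mapping each object to a frozenset of its attributes.
            Returns all formal concepts as (extent, intent) pairs."""
            all_attrs = frozenset().union(*context.values()) if context else frozenset()
            intents = {all_attrs}                      # intent of the empty extent
            for obj_intent in context.values():
                # Close the intent set under intersection with each object intent.
                intents |= {intent & obj_intent for intent in intents}
            result = []
            for intent in sorted(intents, key=len):
                extent = {o for o, attrs in context.items() if intent <= attrs}
                result.append((frozenset(extent), intent))
            return result

        ctx = {
            "duck":  frozenset({"flies", "swims"}),
            "eagle": frozenset({"flies", "hunts"}),
            "trout": frozenset({"swims"}),
        }
        for extent, intent in concepts(ctx):
            print(sorted(extent), sorted(intent))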

  2. The JKJ Lattice

    Science.gov (United States)

    Shigaki, Kenta; Noda, Fumiaki; Yamamoto, Kazami; Machida, Shinji; Molodojentsev, Alexander; Ishi, Yoshihiro

    2002-12-01

    The JKJ high-intensity proton accelerator facility consists of a 400-MeV linac, a 3-GeV 1-MW rapid-cycling synchrotron and a 50-GeV 0.75-MW synchrotron. The lattice and beam dynamics design of the two synchrotrons are reported.

  3. Quantum lattice problems

    NARCIS (Netherlands)

    de Raedt, Hans; von der Linden, W.; Binder, K

    1995-01-01

    In this chapter we review methods currently used to perform Monte Carlo calculations for quantum lattice models. A detailed exposition is given of the formalism underlying the construction of the simulation algorithms. We discuss the fundamental and technical difficulties that are encountered and gi

  4. Measuring on Lattices

    CERN Document Server

    Knuth, Kevin H

    2009-01-01

    Previous derivations of the sum and product rules of probability theory relied on the algebraic properties of Boolean logic. Here they are derived within a more general framework based on lattice theory. The result is a new foundation of probability theory that encompasses and generalizes both the Cox and Kolmogorov formulations. In this picture probability is a bi-valuation defined on a lattice of statements that quantifies the degree to which one statement implies another. The sum rule is a constraint equation that ensures that valuations are assigned so as to not violate associativity of the lattice join and meet. The product rule is much more interesting in that there are actually two product rules: one is a constraint equation that arises from associativity of the direct product of lattices, and the other is a constraint equation derived from associativity of changes of context. The generality of this formalism enables one to derive the traditionally assumed condition of additivity in measure theory, as well in...
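
    For orientation only, the sum rule referred to in this abstract is usually written as the valuation (inclusion-exclusion) constraint tying the lattice join and meet together; in the notation commonly used for such bi-valuations it reads

        \[
            v(x \vee y) \,=\, v(x) + v(y) - v(x \wedge y),
        \]

    which reduces to ordinary additivity when x ∧ y is the bottom element. The paper's own derivation, and the second (context) product rule, are not reproduced here.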

  5. Lattice Multiverse Models

    OpenAIRE

    Williamson, S. Gill

    2010-01-01

    Will the cosmological multiverse, when described mathematically, have easily stated properties that are impossible to prove or disprove using mathematical physics? We explore this question by constructing lattice multiverses which exhibit such behavior even though they are much simpler mathematically than any likely cosmological multiverse.

  6. Phenomenology from lattice QCD

    CERN Document Server

    Lellouch, L P

    2003-01-01

    After a short presentation of lattice QCD and some of its current practical limitations, I review recent progress in applications to phenomenology. Emphasis is placed on heavy-quark masses and on hadronic weak matrix elements relevant for constraining the CKM unitarity triangle. The main numerical results are highlighted in boxes.

  7. Noetherian and Artinian Lattices

    Directory of Open Access Journals (Sweden)

    Derya Keskin Tütüncü

    2012-01-01

    Full Text Available It is proved that if L is a complete modular lattice which is compactly generated, then Rad(L)/0 is Artinian if, and only if, for every small element a of L the sublattice a/0 is Artinian, if, and only if, L satisfies DCC on small elements.

  8. VENUS-2 MOX Core Benchmark: Results of ORNL Calculations Using HELIOS-1.4

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, RJ

    2001-02-02

    The Task Force on Reactor-Based Plutonium Disposition, now an Expert Group, was set up through the Organization for Economic Cooperation and Development/Nuclear Energy Agency to facilitate technical assessments of burning weapons-grade plutonium mixed-oxide (MOX) fuel in U.S. pressurized-water reactors and Russian VVER nuclear reactors. More than ten countries participated to advance the work of the Task Force in a major initiative, which was a blind benchmark study to compare code benchmark calculations against experimental data for the VENUS-2 MOX core at SCK-CEN in Mol, Belgium. At the Oak Ridge National Laboratory, the HELIOS-1.4 code was used to perform a comprehensive study of pin-cell and core calculations for the VENUS-2 benchmark.

  9. Solution of the neutronics code dynamic benchmark by finite element method

    Science.gov (United States)

    Avvakumov, A. V.; Vabishchevich, P. N.; Vasilev, A. O.; Strizhov, V. F.

    2016-10-01

    The objective is to analyze the dynamic benchmark developed by Atomic Energy Research for the verification of best-estimate neutronics codes. The benchmark scenario includes asymmetrical ejection of a control rod in a water-type hexagonal reactor at hot zero power. A simple Doppler feedback mechanism assuming adiabatic fuel temperature heating is proposed. The finite element method on triangular calculation grids is used to solve the three-dimensional neutron kinetics problem. The software has been developed using the engineering and scientific calculation library FEniCS. The matrix spectral problem is solved using the scalable and flexible toolkit SLEPc. The solution accuracy of the dynamic benchmark is analyzed by condensing the calculation grid and varying the degree of the finite elements.
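
    For readers unfamiliar with the "matrix spectral problem" mentioned above, the toy sketch below solves the analogous k-effective eigenvalue problem for a one-group, one-dimensional diffusion model by power iteration (plain NumPy). The cross sections are made-up illustrative values; the benchmark itself is solved with finite elements on triangular grids using FEniCS and SLEPc, which is not reproduced here.

        import numpy as np

        n, h = 50, 1.0                              # toy mesh: 50 cells of width 1 cm
        D, sigma_a, nu_sigma_f = 1.0, 0.02, 0.025   # assumed toy cross sections

        # Loss operator A (leakage + absorption) and production operator F.
        main = 2 * D / h**2 + sigma_a
        A = (np.diag(np.full(n, main))
             + np.diag(np.full(n - 1, -D / h**2), 1)
             + np.diag(np.full(n - 1, -D / h**2), -1))
        F = nu_sigma_f * np.eye(n)

        phi, k = np.ones(n), 1.0
        for _ in range(200):                        # power iteration: A phi = (1/k) F phi
            src = F @ phi / k
            phi_new = np.linalg.solve(A, src)
            k *= phi_new.sum() / phi.sum()          # update eigenvalue from source ratio
            phi = phi_new
        print("k-effective (toy problem):", round(k, 5))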

  10. ICSBEP Benchmarks For Nuclear Data Applications

    Science.gov (United States)

    Briggs, J. Blair

    2005-05-01

    The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was initiated in 1992 by the United States Department of Energy. The ICSBEP became an official activity of the Organization for Economic Cooperation and Development (OECD) — Nuclear Energy Agency (NEA) in 1995. Representatives from the United States, United Kingdom, France, Japan, the Russian Federation, Hungary, Republic of Korea, Slovenia, Serbia and Montenegro (formerly Yugoslavia), Kazakhstan, Spain, Israel, Brazil, Poland, and the Czech Republic are now participating. South Africa, India, China, and Germany are considering participation. The purpose of the ICSBEP is to identify, evaluate, verify, and formally document a comprehensive and internationally peer-reviewed set of criticality safety benchmark data. The work of the ICSBEP is published as an OECD handbook entitled "International Handbook of Evaluated Criticality Safety Benchmark Experiments." The 2004 Edition of the Handbook contains benchmark specifications for 3331 critical or subcritical configurations that are intended for use in validation efforts and for testing basic nuclear data. New to the 2004 Edition of the Handbook is a draft criticality alarm / shielding type benchmark that should be finalized in 2005 along with two other similar benchmarks. The Handbook is being used extensively for nuclear data testing and is expected to be a valuable resource for code and data validation and improvement efforts for decades to come. Specific benchmarks that are useful for testing structural materials such as iron, chromium, nickel, and manganese; beryllium; lead; thorium; and 238U are highlighted.

  11. The Isprs Benchmark on Indoor Modelling

    Science.gov (United States)

    Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.

    2017-09-01

    Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.

  12. Plans to update benchmarking tool.

    Science.gov (United States)

    Stokoe, Mark

    2013-02-01

    The use of the current AssetMark system by hospital health facilities managers and engineers (in Australia) has decreased to a point of no activity occurring. A number of reasons have been cited, including cost, the time required, the slowness of the process, and the level of information required. Based on current levels of activity, it would not be of any value to IHEA, or to its members, to continue with this form of AssetMark. For AssetMark to remain viable, it needs to be developed as a tool seen to be of value to healthcare facilities managers, and not just healthcare facility engineers. Benchmarking is still a very important requirement in the industry, and AssetMark can fulfil this need provided that it remains abreast of customer needs. The proposed future direction is to develop an online version of AssetMark with its current capabilities regarding capturing of data (12 Key Performance Indicators), reporting, and user interaction. The system would also provide end-users with access to live reporting features via a user-friendly web interface linked through the IHEA web page.

  13. Academic Benchmarks for Otolaryngology Leaders.

    Science.gov (United States)

    Eloy, Jean Anderson; Blake, Danielle M; D'Aguillo, Christine; Svider, Peter F; Folbe, Adam J; Baredes, Soly

    2015-08-01

    This study aimed to characterize current benchmarks for academic otolaryngologists serving in positions of leadership and identify factors potentially associated with promotion to these positions. Information regarding chairs (or division chiefs), vice chairs, and residency program directors was obtained from faculty listings and organized by degree(s) obtained, academic rank, fellowship training status, sex, and experience. Research productivity was characterized by (a) successful procurement of active grants from the National Institutes of Health and prior grants from the American Academy of Otolaryngology-Head and Neck Surgery Foundation Centralized Otolaryngology Research Efforts program and (b) scholarly impact, as measured by the h-index. Chairs had the greatest amount of experience (32.4 years) and were the least likely to have multiple degrees, with 75.8% having an MD degree only. Program directors were the most likely to be fellowship trained (84.8%). Women represented 16% of program directors, 3% of chairs, and no vice chairs. Chairs had the highest scholarly impact (as measured by the h-index) and the greatest external grant funding. This analysis characterizes the current picture of leadership in academic otolaryngology. Chairs, when compared to their vice chair and program director counterparts, had more experience and greater research impact. Women were poorly represented among all academic leadership positions. © The Author(s) 2015.

  14. Benchmarking Measures of Network Influence

    Science.gov (United States)

    Bramson, Aaron; Vandermarliere, Benjamin

    2016-01-01

    Identifying key agents for the transmission of diseases (ideas, technology, etc.) across social networks has predominantly relied on measures of centrality on a static base network or a temporally flattened graph of agent interactions. Various measures have been proposed as the best trackers of influence, such as degree centrality, betweenness, and k-shell, depending on the structure of the connectivity. We consider SIR and SIS propagation dynamics on a temporally-extruded network of observed interactions and measure the conditional marginal spread as the change in the magnitude of the infection given the removal of each agent at each time: its temporal knockout (TKO) score. We argue that this TKO score is an effective benchmark measure for evaluating the accuracy of other, often more practical, measures of influence. We find that none of the network measures applied to the induced flat graphs are accurate predictors of network propagation influence on the systems studied; however, temporal networks and the TKO measure provide the requisite targets for the search for effective predictive measures. PMID:27670635
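
    A highly simplified sketch of the knockout idea is shown below (Python with networkx): a node's score is estimated as the drop in mean SIR outbreak size when that node is removed. Unlike the paper, this uses a static graph and a discrete-time SIR with one-step recovery, so it should be read only as an illustration of the concept, not as the TKO measure on temporally-extruded networks.

        import random
        import networkx as nx

        def sir_size(G, beta=0.2, rng=random):
            """One stochastic SIR outbreak from a random seed; returns final size."""
            infected = {rng.choice(list(G))}
            recovered = set()
            while infected:
                new = set()
                for u in infected:
                    for v in G.neighbors(u):
                        if v not in infected and v not in recovered and rng.random() < beta:
                            new.add(v)
                recovered |= infected
                infected = new - recovered
            return len(recovered)

        def knockout_score(G, node, trials=200):
            """Drop in mean outbreak size when `node` is removed from the graph."""
            base = sum(sir_size(G) for _ in range(trials)) / trials
            H = G.copy()
            H.remove_node(node)
            reduced = sum(sir_size(H) for _ in range(trials)) / trials
            return base - reduced

        G = nx.barabasi_albert_graph(100, 2, seed=1)
        hub = max(G.degree, key=lambda kv: kv[1])[0]
        print("knockout score of highest-degree node:", knockout_score(G, hub))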

  15. Developing integrated benchmarks for DOE performance measurement

    Energy Technology Data Exchange (ETDEWEB)

    Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.

    1992-09-30

    The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome factors in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which then could become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE, which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.

  16. INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard Strydom; Javier Ortensi; Sonat Sen; Hans Hammer

    2013-09-01

    The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging simulation set of problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III

  17. Basis reduction for layered lattices

    NARCIS (Netherlands)

    Torreão Dassen, Erwin

    2011-01-01

    We develop the theory of layered Euclidean spaces and layered lattices. We present algorithms to compute both Gram-Schmidt and reduced bases in this generalized setting. A layered lattice can be seen as lattices where certain directions have infinite weight. It can also be interpre
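
    As background for the (non-layered) starting point, the classical Gram-Schmidt computation that the thesis generalises can be sketched as follows (plain NumPy); the layered versions described in the thesis are not reproduced here.

        import numpy as np

        def gram_schmidt(basis):
            """Return the Gram-Schmidt orthogonalisation B* and the mu coefficients."""
            B = np.array(basis, dtype=float)
            Bstar = np.zeros_like(B)
            mu = np.zeros((len(B), len(B)))
            for i, b in enumerate(B):
                Bstar[i] = b
                for j in range(i):
                    mu[i, j] = np.dot(b, Bstar[j]) / np.dot(Bstar[j], Bstar[j])
                    Bstar[i] -= mu[i, j] * Bstar[j]
            return Bstar, mu

        Bstar, mu = gram_schmidt([[3, 1], [2, 2]])
        print(Bstar)        # rows are mutually orthogonal
        print(mu)           # coefficients used in size reduction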

  18. Spin qubits in antidot lattices

    DEFF Research Database (Denmark)

    Pedersen, Jesper Goor; Flindt, Christian; Mortensen, Niels Asger;

    2008-01-01

    and density of states for a periodic potential modulation, referred to as an antidot lattice, and find that localized states appear, when designed defects are introduced in the lattice. Such defect states may form the building blocks for quantum computing in a large antidot lattice, allowing for coherent...

  19. SPOC Benchmark Case: SNRE Model

    Energy Technology Data Exchange (ETDEWEB)

    Vishal Patel; Michael Eades; Claude Russel Joyner II

    2016-02-01

    The Small Nuclear Rocket Engine (SNRE) was modeled in the Center for Space Nuclear Research’s (CSNR) Space Propulsion Optimization Code (SPOC). SPOC aims to create nuclear thermal propulsion (NTP) geometries quickly to perform parametric studies on design spaces of historic and new NTP designs. The SNRE geometry was modeled in SPOC and a critical core with a reasonable amount of criticality margin was found. The fuel, tie-tube, reflector, and control drum masses were predicted rather well. These are all very important for neutronics calculations, so the active reactor geometries created with SPOC can continue to be trusted. Thermal calculations of the average and hot fuel channels agreed very well. The specific impulse calculations used historically and those in SPOC disagree, so the mass flow rates and impulses differed. Modeling peripheral and power balance components that do not affect nuclear characteristics of the core is not a feature of SPOC and, as such, these components should continue to be designed using other tools. A full paper detailing the available SNRE data and comparisons with SPOC outputs will be submitted as a follow-up to this abstract.

  20. Benchmark studies of BOUT++ code and TPSMBI code on neutral transport during SMBI

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.H. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); University of Science and Technology of China, Hefei 230026 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Z.H., E-mail: zhwang@swip.ac.cn [Southwestern Institute of Physics, Chengdu 610041 (China); Guo, W., E-mail: wfguo@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Ren, Q.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Sun, A.P.; Xu, M.; Wang, A.K. [Southwestern Institute of Physics, Chengdu 610041 (China); Xiang, N. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China)

    2017-06-09

    SMBI (supersonic molecule beam injection) plays an important role in tokamak plasma fuelling, density control and ELM mitigation in magnetic confinement plasma physics, and has been widely used in many tokamaks. The trans-neut module of the BOUT++ code is the only large-scale parallel 3D fluid code used to simulate the SMBI fuelling process, while the TPSMBI (transport of supersonic molecule beam injection) code is a recently developed 1D fluid code for SMBI. In order to find a method to increase SMBI fuelling efficiency in H-mode plasma, especially for ITER, it is important to first verify the codes. A benchmark study between the trans-neut module of the BOUT++ code and the TPSMBI code on the radial transport dynamics of neutrals during SMBI has been successfully carried out for the first time in both slab and cylindrical coordinates. The simulation results from the trans-neut module of BOUT++ and from TPSMBI agree very well with each other. Different upwind schemes have been compared for code stability in handling the sharp gradient front region during the inward propagation of SMBI. The influence of the WENO3 (weighted essentially non-oscillatory) and third-order upwind schemes on the benchmark results is also discussed. - Highlights: • A 1D model of SMBI has been developed. • Benchmarks of the BOUT++ and TPSMBI codes have been completed for the first time. • The influence of the WENO3 and third-order upwind schemes on the benchmark results is discussed.
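
    To illustrate the difference between the upwind discretisations discussed here, the sketch below compares a first-order and a third-order upwind-biased approximation of a spatial derivative on a periodic grid (plain NumPy). It is a generic numerical illustration, not code from BOUT++ or TPSMBI, and WENO3, which additionally adapts its stencil near sharp fronts, is not implemented.

        import numpy as np

        # Periodic grid and a smooth test profile.
        n = 200
        x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        h = x[1] - x[0]
        u = np.sin(x)
        exact = np.cos(x)

        # First-order upwind derivative (for a positive advection velocity).
        d1 = (u - np.roll(u, 1)) / h

        # Third-order upwind-biased derivative:
        # du/dx ~ (u[i-2] - 6 u[i-1] + 3 u[i] + 2 u[i+1]) / (6 h)
        d3 = (np.roll(u, 2) - 6.0 * np.roll(u, 1) + 3.0 * u + 2.0 * np.roll(u, -1)) / (6.0 * h)

        print("max error, 1st-order upwind:", np.max(np.abs(d1 - exact)))
        print("max error, 3rd-order upwind:", np.max(np.abs(d3 - exact)))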

  1. Benchmarking – A tool for judgment or improvement?

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2010-01-01

    these issues, and describes how effects are closely connected to the perception of benchmarking, the intended users of the system and the application of the benchmarking results. The fundamental basis of this paper is taken from the development of benchmarking in the Danish construction sector. Two distinct perceptions of benchmarking will be presented: public benchmarking and best practice benchmarking. These two types of benchmarking are used to characterize and discuss the Danish benchmarking system and to highlight the effects, possibilities and challenges that follow in the wake of using this kind of benchmarking. In conclusion it is argued that clients and the Danish government are the intended users of the benchmarking system. The benchmarking results are primarily used by the government for monitoring and regulation of the construction sector and by clients for contractor selection. The dominating use...

  2. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  3. Medicare Contracting - Redacted Benchmark Metric Reports

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services has compiled aggregate national benchmark cost and workload metrics using data submitted to CMS by the AB MACs and the...

  4. XWeB: The XML Warehouse Benchmark

    Science.gov (United States)

    Mahboubi, Hadj; Darmont, Jérôme

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.
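
    To give a flavour of the OLAP-style aggregation such a workload exercises, the toy sketch below groups and sums values from a small XML document (Python standard library only). It is not part of XWeB itself, whose workload is expressed in XQuery against a much richer warehouse model; the miniature sales schema here is invented for illustration.

        import xml.etree.ElementTree as ET
        from collections import defaultdict

        # Hypothetical miniature sales document; XWeB's real schema is far richer.
        doc = ET.fromstring("""
        <sales>
          <sale region="EU" amount="120.0"/>
          <sale region="US" amount="80.5"/>
          <sale region="EU" amount="30.0"/>
        </sales>
        """.strip())

        totals = defaultdict(float)
        for sale in doc.findall("sale"):
            totals[sale.get("region")] += float(sale.get("amount"))

        for region, total in sorted(totals.items()):
            print(region, total)   # roll-up of amount by region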

  5. XWeB: the XML Warehouse Benchmark

    CERN Document Server

    Mahboubi, Hadj

    2011-01-01

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.

  6. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    provision to the chief physician of the respective department. Professional performance is publicly disclosed due to regulatory requirements. At the same time, chief physicians typically receive bureaucratic benchmarking information from the administration. We find that more frequent bureaucratic...

  7. Benchmarking of PR Function in Serbian Companies

    National Research Council Canada - National Science Library

    Nikolić, Milan; Sajfert, Zvonko; Vukonjanski, Jelena

    2009-01-01

    The purpose of this paper is to present methodologies for carrying out benchmarking of the PR function in Serbian companies and to test the practical application of the research results and proposed...

  8. Introduction to lattice QCD

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, R.

    1998-12-31

    The goal of the lectures on lattice QCD (LQCD) is to provide an overview of both the technical issues and the progress made so far in obtaining phenomenologically useful numbers. The lectures consist of three parts. The author's charter is to provide an introduction to LQCD and outline the scope of LQCD calculations. In the second set of lectures, Guido Martinelli will discuss the progress they have made so far in obtaining results, and their impact on Standard Model phenomenology. Finally, Martin Luescher will discuss the topical subjects of chiral symmetry, improved formulation of lattice QCD, and the impact these improvements will have on the quality of results expected from the next generation of simulations.

  9. Lattice Quantum Chromodynamics

    Science.gov (United States)

    Sachrajda, C. T.

    2016-10-01

    I review the application of the lattice formulation of QCD and large-scale numerical simulations to the evaluation of non-perturbative hadronic effects in Standard Model Phenomenology. I present an introduction to the elements of the calculations and discuss the limitations both in the range of quantities which can be studied and in the precision of the results. I focus particularly on the extraction of the QCD parameters, i.e. the quark masses and the strong coupling constant, and on important quantities in flavour physics. Lattice QCD is playing a central role in quantifying the hadronic effects necessary for the development of precision flavour physics and its use in exploring the limits of the Standard Model and in searches for inconsistencies which would signal the presence of new physics.

  10. Lattices of dielectric resonators

    CERN Document Server

    Trubin, Alexander

    2016-01-01

    This book provides the analytical theory of complex systems composed of a large number of high-Q dielectric resonators. Spherical and cylindrical dielectric resonators with inferior and also whispering gallery oscillations allocated in various lattices are considered. A new approach to S-matrix parameter calculations based on perturbation theory of Maxwell equations, developed for a number of high-Q dielectric bodies, is introduced. All physical relationships are obtained in analytical form and are suitable for further computations. Essential attention is given to a new unified formalism of the description of scattering processes. The general scattering task for coupled eigen oscillations of the whole system of dielectric resonators is described. The equations for the  expansion coefficients are explained in an applicable way. The temporal Green functions for the dielectric resonator are presented. The scattering process of short pulses in dielectric filter structures, dielectric antennas  and lattices of d...

  11. Fractional lattice charge transport

    Science.gov (United States)

    Flach, Sergej; Khomeriki, Ramaz

    2017-01-01

    We consider the dynamics of noninteracting quantum particles on a square lattice in the presence of a magnetic flux α and a dc electric field E oriented along the lattice diagonal. In general, the adiabatic dynamics will be characterized by Bloch oscillations in the electrical field direction and dispersive ballistic transport in the perpendicular direction. For rational values of α and a corresponding discrete set of values of E(α), vanishing gaps in the spectrum induce a fractionalization of the charge in the perpendicular direction: while left movers still perform dispersive ballistic transport, the complementary fraction of right movers propagates in a dispersionless relativistic manner in the opposite direction. Generalizations and the possible probing of the effect with atomic Bose-Einstein condensates and photonic networks are discussed. The Zak phase of the band associated with the gap-closing regime has been computed and is found to converge to the value π/2. PMID:28102302

  12. Lattice QCD for Cosmology

    CERN Document Server

    Borsanyi, Sz; Kampert, K H; Katz, S D; Kawanai, T; Kovacs, T G; Mages, S W; Pasztor, A; Pittler, F; Redondo, J; Ringwald, A; Szabo, K K

    2016-01-01

    We present a full result for the equation of state (EoS) in 2+1+1 (up/down, strange and charm quarks are present) flavour lattice QCD. We extend this analysis and give the equation of state in 2+1+1+1 flavour QCD. In order to describe the evolution of the universe from temperatures several hundreds of GeV to several tens of MeV we also include the known effects of the electroweak theory and give the effective degrees of freedom. As another application of lattice QCD we calculate the topological susceptibility (chi) up to the few GeV temperature region. These two results, EoS and chi, can be used to predict the dark matter axion's mass in the post-inflation scenario and/or give the relationship between the axion's mass and the universal axionic angle, which acts as an initial condition of our universe.

  13. Solitons in nonlinear lattices

    CERN Document Server

    Kartashov, Yaroslav V; Torner, Lluis

    2010-01-01

    This article offers a comprehensive survey of results obtained for solitons and complex nonlinear wave patterns supported by purely nonlinear lattices (NLs), which represent a spatially periodic modulation of the local strength and sign of the nonlinearity, and their combinations with linear lattices. A majority of the results obtained, thus far, in this field and reviewed in this article are theoretical. Nevertheless, relevant experimental settings are surveyed too, with emphasis on perspectives for implementation of the theoretical predictions in the experiment. Physical systems discussed in the review belong to the realms of nonlinear optics (including artificial optical media, such as photonic crystals, and plasmonics) and Bose-Einstein condensation (BEC). The solitons are considered in one, two, and three dimensions (1D, 2D, and 3D). Basic properties of the solitons presented in the review are their existence, stability, and mobility. Although the field is still far from completion, general conclusions c...

  14. Parametric lattice Boltzmann method

    Science.gov (United States)

    Shim, Jae Wan

    2017-06-01

    The discretized equilibrium distributions of the lattice Boltzmann method are presented by using the coefficients of the Lagrange interpolating polynomials that pass through the points related to discrete velocities and using moments of the Maxwell-Boltzmann distribution. The ranges of flow velocity and temperature providing positive valued distributions vary with regulating discrete velocities as parameters. New isothermal and thermal compressible models are proposed for flows of the level of the isothermal and thermal compressible Navier-Stokes equations. Thermal compressible shock tube flows are simulated by only five on-lattice discrete velocities. Two-dimensional isothermal and thermal vortices provoked by the Kelvin-Helmholtz instability are simulated by the parametric models.
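
    For context, the standard (non-parametric) isothermal D2Q9 equilibrium that such constructions generalise can be written down directly; the sketch below evaluates it for one density and velocity and checks the recovered moments. The paper's Lagrange-interpolation-based parametric discretisation is not reproduced here.

        import numpy as np

        # Standard D2Q9 lattice: discrete velocities and weights.
        c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                      [1, 1], [-1, 1], [-1, -1], [1, -1]])
        w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

        def feq(rho, ux, uy):
            """Second-order isothermal equilibrium distributions."""
            cu = c[:, 0] * ux + c[:, 1] * uy
            usq = ux * ux + uy * uy
            return w * rho * (1.0 + 3.0 * cu + 4.5 * cu * cu - 1.5 * usq)

        f = feq(rho=1.0, ux=0.05, uy=0.0)
        print(f.sum())                      # recovers rho
        print((f * c[:, 0]).sum())          # recovers rho * ux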

  15. Varieties of lattices

    CERN Document Server

    Jipsen, Peter

    1992-01-01

    The study of lattice varieties is a field that has experienced rapid growth in the last 30 years, but many of the interesting and deep results discovered in that period have so far only appeared in research papers. The aim of this monograph is to present the main results about modular and nonmodular varieties, equational bases and the amalgamation property in a uniform way. The first chapter covers preliminaries that make the material accessible to anyone who has had an introductory course in universal algebra. Each subsequent chapter begins with a short historical introduction which cites the original references and then presents the results with complete proofs (in nearly all cases). Numerous diagrams illustrate the beauty of lattice theory and aid in the visualization of many proofs. An extensive index and bibliography also make the monograph a useful reference work.

  16. Lattice Quantum Chromodynamics

    CERN Document Server

    Sachrajda, C T

    2016-01-01

    I review the application of the lattice formulation of QCD and large-scale numerical simulations to the evaluation of non-perturbative hadronic effects in Standard Model Phenomenology. I present an introduction to the elements of the calculations and discuss the limitations both in the range of quantities which can be studied and in the precision of the results. I focus particularly on the extraction of the QCD parameters, i.e. the quark masses and the strong coupling constant, and on important quantities in flavour physics. Lattice QCD is playing a central role in quantifying the hadronic effects necessary for the development of precision flavour physics and its use in exploring the limits of the Standard Model and in searches for inconsistencies which would signal the presence of new physics.

  17. A framework of benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-02-01

    Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.
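
    As a toy illustration of component (3) above, the sketch below combines normalized data-model mismatches into a single weighted score for one model. The variables, weights, and exponential normalization are illustrative assumptions, not the metric prescribed by the framework.

```python
import numpy as np

def variable_score(model, obs):
    """Skill score in (0, 1] for one variable: RMSE normalized by the observed
    standard deviation, mapped so that 1 means a perfect match."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    return float(np.exp(-rmse / (np.std(obs) + 1e-12)))

def benchmark_score(run, benchmarks, weights):
    """Weighted combination of per-variable scores for one land model;
    `run` and `benchmarks` map variable name -> time series."""
    return sum(weights[v] * variable_score(run[v], benchmarks[v])
               for v in weights) / sum(weights.values())

# Illustrative (hypothetical) benchmark series: GPP and latent heat flux.
t = np.linspace(0.0, 2.0 * np.pi, 120)
obs = {"gpp": np.sin(t) + 2.0, "latent_heat": 30.0 * np.cos(t) + 80.0}
run = {"gpp": 1.1 * obs["gpp"], "latent_heat": obs["latent_heat"] + 5.0}
print(round(benchmark_score(run, obs, {"gpp": 0.5, "latent_heat": 0.5}), 3))
```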

  18. A framework of benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-02-01

    Full Text Available Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.

  19. A framework for benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J. T.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J. B.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-10-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models

  20. Benchmarking Attosecond Physics with Atomic Hydrogen

    Science.gov (United States)

    2015-05-25

    Final report (dates covered: 12 Mar 2012 – 11 Mar 2015). Title: Benchmarking attosecond physics with atomic hydrogen. Contract number: FA2386-12-1-4025. PI: David Kielpinski, dave.kielpinski@gmail.com, Griffith University Centre.

  1. Aerodynamic Benchmarking of the Deepwind Design

    DEFF Research Database (Denmark)

    Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge;

    2015-01-01

    The aerodynamic benchmarking for the DeepWind rotor is conducted comparing different rotor geometries and solutions and keeping the comparison as fair as possible. The objective for the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize...... NACA airfoil family. (C) 2015 Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license...

  2. Benchmarking Danish Vocational Education and Training Programmes

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Wittrup, Jesper

    This study paper discusses methods whereby Danish vocational education and training colleges can be benchmarked, and presents results from a number of models. It is conceptually complicated to benchmark vocational colleges, as the various colleges in Denmark offer a wide range of course programmes...... attempt to summarise the various effects that the colleges have in two relevant figures, namely retention rates of students and employment rates among students who have completed training programmes....

  3. Implementation of NAS Parallel Benchmarks in Java

    Science.gov (United States)

    Frumkin, Michael; Schultz, Matthew; Jin, Hao-Qiang; Yan, Jerry

    2000-01-01

    A number of features make Java an attractive but debatable choice for High Performance Computing (HPC). In order to gauge the applicability of Java to Computational Fluid Dynamics (CFD), we have implemented the NAS Parallel Benchmarks in Java. The performance and scalability of the benchmarks point out the areas where improvement in Java compiler technology and in Java thread implementation would move Java closer to Fortran in the competition for CFD applications.

  4. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Full Text Available Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties

  5. The MCNP6 Analytic Criticality Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Codes Group

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
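
    The validation step described above (bias and bias uncertainty derived from a set of benchmark comparisons) can be illustrated with a minimal sketch; this is not MCNP tooling, the k-eff values are hypothetical, and benchmark uncertainties are ignored for brevity.

```python
import numpy as np

def validation_bias(k_calc, k_ref):
    """Bias and its uncertainty from a set of benchmark comparisons:
    bias = mean(k_calc - k_ref); the uncertainty here is just the standard
    error of that mean (benchmark uncertainties are ignored for brevity)."""
    d = np.asarray(k_calc, float) - np.asarray(k_ref, float)
    return d.mean(), d.std(ddof=1) / np.sqrt(len(d))

# Hypothetical calculated vs. reference k_eff values for a benchmark set.
k_calc = [0.9982, 1.0015, 0.9991, 1.0008, 0.9975]
k_ref = [1.0000, 1.0000, 1.0000, 1.0000, 1.0000]
bias, sigma = validation_bias(k_calc, k_ref)
print(f"bias = {bias:+.4f} +/- {sigma:.4f}")
```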

  6. Simple Benchmark Specifications for Space Radiation Protection

    Science.gov (United States)

    Singleterry, Robert C. Jr.; Aghara, Sukesh K.

    2013-01-01

    This report defines space radiation benchmark specifications. This specification starts with simple, monoenergetic, mono-directional particles on slabs and progresses to human models in spacecraft. This report specifies the models and sources needed, as well as what the team performing the benchmark needs to produce in a report. Also included are brief descriptions of how OLTARIS, the NASA Langley website for space radiation analysis, performs its analysis.

  7. International Lattice Data Grid

    CERN Document Server

    Davies, C T H; Kenway, R D; Maynard, C M

    2002-01-01

    We propose the co-ordination of lattice QCD grid developments in different countries to allow transparent exchange of gauge configurations in future, should participants wish to do so. We describe briefly UKQCD's XML schema for labelling and cataloguing the data. A meeting to further develop these ideas will be held in Edinburgh on 19/20 December 2002, and will be available over AccessGrid.

  8. Weakly deformed soliton lattices

    Energy Technology Data Exchange (ETDEWEB)

    Dubrovin, B. (Moskovskij Gosudarstvennyj Univ., Moscow (USSR). Dept. of Mechanics and Mathematics)

    1990-12-01

    In this lecture the author discusses periodic and quasiperiodic solutions of nonlinear evolution equations of phi{sub t}=K (phi, phi{sub x},..., phi{sup (n)}), the so-called soliton lattices. After introducing the theory of integrable systems of hydrodynamic type he discusses their Hamiltonian formalism, i.e. the theory of Poisson brackets of hydrodynamic type. Then he describes the application of algebraic geometry to the effective integration of such equations. (HSI).

  9. Actinides transmutation - a comparison of results for PWR benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Claro, Luiz H. [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)], e-mail: luizhenu@ieav.cta.br

    2009-07-01

    The physical aspects involved in the Partitioning and Transmutation (P and T) of minor actinides (MA) and fission products (FP) generated by PWR reactors are of great interest in the nuclear industry. In addition, the reduction in the storage of radioactive wastes is related to the acceptability of nuclear electric power. Among the several concepts for partitioning and transmutation suggested in the literature, one involves PWR reactors burning fuel containing plutonium and minor actinides reprocessed from the UO{sub 2} used in previous stages. This work presents the results of a P and T benchmark calculation carried out with the WIMSD5B program, using its new cross-section library generated from ENDF-B-VII, and compares them with results published in the literature from other calculations. For the comparison, the benchmark transmutation concept based on a typical PWR cell was used, and the analyzed results were k{infinity} and the atomic densities of the isotopes Np-239, Pu-241, Pu-242 and Am-242m as functions of burnup, considering a discharge of 50 GWd/tHM. (author)

  10. OECD/NEA Burnup Credit Calculational Criticality Benchmark Phase I-B Results

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.

    1993-01-01

    Burnup credit is an ongoing technical concern for many countries that operate commercial nuclear power reactors. In a multinational cooperative effort to resolve burnup credit issues, a Burnup Credit Working Group has been formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development. This working group has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide, and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods are in agreement to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods agree with the average to within 11% for all fission products studied. Furthermore, most deviations are less than 10%, and many are less than 5%. The exceptions are {sup 149}Sm, {sup 151}Sm, and {sup 155}Gd.
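
    A minimal sketch of the kind of intercomparison statistic quoted above (deviation of each submission from the all-participant average for a given nuclide) is shown below; the concentrations are hypothetical.

```python
import numpy as np

def percent_deviation_from_mean(values):
    """Percent deviation of each participant's result from the
    all-participant average for one nuclide concentration."""
    v = np.asarray(values, float)
    return 100.0 * (v - v.mean()) / v.mean()

# Hypothetical spent-fuel concentrations (g/tHM) of one nuclide
# as reported by five participants.
submissions = [512.0, 498.0, 505.0, 530.0, 495.0]
print(np.round(percent_deviation_from_mean(submissions), 2))
```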

  11. Crystallographic Lattice Boltzmann Method

    Science.gov (United States)

    Namburi, Manjusha; Krithivasan, Siddharth; Ansumali, Santosh

    2016-01-01

    Current approaches to Direct Numerical Simulation (DNS) are computationally quite expensive for most realistic scientific and engineering applications of Fluid Dynamics such as automobiles or atmospheric flows. The Lattice Boltzmann Method (LBM), with its simplified kinetic descriptions, has emerged as an important tool for simulating hydrodynamics. In a heterogeneous computing environment, it is often preferred due to its flexibility and better parallel scaling. However, direct simulation of realistic applications, without the use of turbulence models, remains a distant dream even with highly efficient methods such as LBM. In LBM, a fictitious lattice with suitable isotropy in the velocity space is considered to recover Navier-Stokes hydrodynamics in the macroscopic limit. The same lattice is mapped onto a Cartesian grid for spatial discretization of the kinetic equation. In this paper, we present an inverted argument of the LBM, by making spatial discretization the central theme. We argue that the optimal spatial discretization for LBM is a Body Centered Cubic (BCC) arrangement of grid points. We illustrate an order-of-magnitude gain in efficiency for LBM and thus a significant progress towards feasibility of DNS for realistic flows. PMID:27251098

  12. Topological Lattice Actions

    CERN Document Server

    Bietenholz, W; Pepe, M; Wiese, U -J

    2010-01-01

    We consider lattice field theories with topological actions, which are invariant against small deformations of the fields. Some of these actions have infinite barriers separating different topological sectors. Topological actions do not have the correct classical continuum limit and they cannot be treated using perturbation theory, but they still yield the correct quantum continuum limit. To show this, we present analytic studies of the 1-d O(2) and O(3) model, as well as Monte Carlo simulations of the 2-d O(3) model using topological lattice actions. Some topological actions obey and others violate a lattice Schwarz inequality between the action and the topological charge $Q$. Irrespective of this, in the 2-d O(3) model the topological susceptibility $\chi_t = \langle Q^2 \rangle / V$ is logarithmically divergent in the continuum limit. Still, at non-zero distance the correlator of the topological charge density has a finite continuum limit which is consistent with analytic predictions. Our study shows explicitly that some cla...

  13. Robots and lattice automata

    CERN Document Server

    Adamatzky, Andrew

    2015-01-01

    The book gives a comprehensive overview of the state-of-the-art research and engineering in the theory and application of Lattice Automata in the design and control of autonomous Robots. Automata and robots share the same notional meaning. Automata (the term originates from the Latinization of the Greek word “αυτόματον”), self-operating autonomous machines invented in ancient times, can be considered the first steps of robotic-like efforts. Automata are mathematical models of Robots and they are also integral parts of robotic control systems. A Lattice Automaton is a regular array or a collective of finite state machines, or automata. The automata update their states by the same rules, depending on the states of their immediate neighbours. In the context of this book, Lattice Automata are used in developing modular reconfigurable robotic systems, path planning and map exploration for robots, robot controllers, synchronisation of robot collectives, robot vision, and parallel robotic actuators. All chapters are...

  14. Hadroquarkonium from lattice QCD

    Science.gov (United States)

    Alberti, Maurizio; Bali, Gunnar S.; Collins, Sara; Knechtli, Francesco; Moir, Graham; Söldner, Wolfgang

    2017-04-01

    The hadroquarkonium picture [S. Dubynskiy and M. B. Voloshin, Phys. Lett. B 666, 344 (2008), 10.1016/j.physletb.2008.07.086] provides one possible interpretation for the pentaquark candidates with hidden charm, recently reported by the LHCb Collaboration, as well as for some of the charmoniumlike "X, Y, Z" states. In this picture, a heavy quarkonium core resides within a light hadron giving rise to four- or five-quark/antiquark bound states. We test this scenario in the heavy quark limit by investigating the modification of the potential between a static quark-antiquark pair induced by the presence of a hadron. Our lattice QCD simulations are performed on a Coordinated Lattice Simulations (CLS) ensemble with N_f = 2+1 flavors of nonperturbatively improved Wilson quarks at a pion mass of about 223 MeV and a lattice spacing of about a = 0.0854 fm. We study the static potential in the presence of a variety of light mesons as well as of octet and decuplet baryons. In all these cases, the resulting configurations are favored energetically. The associated binding energies between the quarkonium in the heavy quark limit and the light hadron are found to be smaller than a few MeV, similar in strength to deuterium binding. It remains to be seen whether the small attraction survives in the infinite volume limit and supports bound states or resonances.

  15. Digital lattice gauge theories

    Science.gov (United States)

    Zohar, Erez; Farace, Alessandro; Reznik, Benni; Cirac, J. Ignacio

    2017-02-01

    We propose a general scheme for a digital construction of lattice gauge theories with dynamical fermions. In this method, the four-body interactions arising in models with 2+1 dimensions and higher are obtained stroboscopically, through a sequence of two-body interactions with ancillary degrees of freedom. This yields stronger interactions than the ones obtained through perturbative methods, as typically done in previous proposals, and removes an important bottleneck in the road towards experimental realizations. The scheme applies to generic gauge theories with Lie or finite symmetry groups, both Abelian and non-Abelian. As a concrete example, we present the construction of a digital quantum simulator for a Z3 lattice gauge theory with dynamical fermionic matter in 2+1 dimensions, using ultracold atoms in optical lattices, involving three atomic species, representing the matter, gauge, and auxiliary degrees of freedom, that are separated in three different layers. By moving the ancilla atoms with a proper sequence of steps, we show how we can obtain the desired evolution in a clean, controlled way.

  16. Benchmarking for Cost Improvement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The US Department of Energy's (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: Pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.

  17. Benchmarking infrastructure for mutation text mining

    Science.gov (United States)

    2014-01-01

    Background Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. Results We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. Conclusion We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption. PMID:24568600
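
    The infrastructure computes its metrics with SPARQL over RDF annotations; as a rough plain-Python equivalent (an assumption for illustration, not the project's actual queries), the sketch below scores exact-match precision, recall, and F1 over gold versus predicted mutation mentions. The document identifiers and mutation strings are hypothetical.

```python
def precision_recall_f1(gold, predicted):
    """Exact-match precision, recall, and F1 over sets of annotations,
    e.g. (document id, normalized mutation) pairs."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical gold-standard and system annotations.
gold = {("doc1", "p.V600E"), ("doc1", "c.35G>A"), ("doc2", "p.R273H")}
pred = {("doc1", "p.V600E"), ("doc2", "p.R273H"), ("doc2", "p.G12D")}
print(precision_recall_f1(gold, pred))
```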

  18. A Mechanical Lattice Aid for Crystallography Teaching.

    Science.gov (United States)

    Amezcua-Lopez, J.; Cordero-Borboa, A. E.

    1988-01-01

    Introduces a 3-dimensional mechanical lattice with adjustable telescoping mechanisms. Discusses the crystalline state, the 14 Bravais lattices, operational principles of the mechanical lattice, construction methods, and demonstrations in classroom. Provides lattice diagrams, schemes of the lattice, and various pictures of the lattice. (YP)

  19. Kenneth Wilson and lattice QCD

    CERN Document Server

    Ukawa, Akira

    2015-01-01

    We discuss the physics and computation of lattice QCD, a space-time lattice formulation of quantum chromodynamics, and Kenneth Wilson's seminal role in its development. We start with the fundamental issue of confinement of quarks in the theory of the strong interactions, and discuss how lattice QCD provides a framework for understanding this phenomenon. A conceptual issue with lattice QCD is a conflict of space-time lattice with chiral symmetry of quarks. We discuss how this problem is resolved. Since lattice QCD is a non-linear quantum dynamical system with infinite degrees of freedom, quantities which are analytically calculable are limited. On the other hand, it provides an ideal case of massively parallel numerical computations. We review the long and distinguished history of parallel-architecture supercomputers designed and built for lattice QCD. We discuss algorithmic developments, in particular the difficulties posed by the fermionic nature of quarks, and their resolution. The triad of efforts toward b...

  20. NODAL3 Sensitivity Analysis for NEACRP 3D LWR Core Transient Benchmark (PWR)

    Directory of Open Access Journals (Sweden)

    Surian Pinem

    2016-01-01

    Full Text Available This paper reports the results of sensitivity analysis of the multidimension, multigroup neutron diffusion NODAL3 code for the NEACRP 3D LWR core transient benchmarks (PWR). The code input parameters covered in the sensitivity analysis are the radial and axial node sizes (the number of radial nodes per fuel assembly and the number of axial layers), the heat conduction node size in the fuel pellet and cladding, and the maximum time step. The output parameters considered in this analysis followed the above-mentioned core transient benchmarks, that is, power peak, time of power peak, power, averaged Doppler temperature, maximum fuel centerline temperature, and coolant outlet temperature at the end of simulation (5 s). The sensitivity analysis results showed that the radial node size and maximum time step have a significant effect on the transient parameters, especially the time of power peak, for the HZP and HFP conditions. The number of ring divisions for fuel pellet and cladding has a negligible effect on the transient solutions. For productive work of the PWR transient analysis, based on the present sensitivity analysis results, we recommend that NODAL3 users use 2×2 radial nodes per assembly, 1×18 axial layers per assembly, a maximum time step of 10 ms, and 9 and 1 ring divisions for fuel pellet and cladding, respectively.

  1. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  2. Benchmarking of hospital information systems – a comparative analysis of benchmarking clusters in German-speaking countries

    Directory of Open Access Journals (Sweden)

    Jahn, Franziska

    2015-08-01

    Full Text Available Benchmarking is a method of strategic information management used by many hospitals today. During the last years, several benchmarking clusters have been established within the German-speaking countries. They support hospitals in comparing and positioning their information system's and information management's costs, performance and efficiency against other hospitals. In order to differentiate between these benchmarking clusters and to provide decision support in selecting an appropriate benchmarking cluster, a classification scheme is developed. The classification scheme observes both general conditions and examined contents of the benchmarking clusters. It is applied to seven benchmarking clusters which have been active in the German-speaking countries within the last years. Currently, performance benchmarking is the most frequent benchmarking type, whereas the observed benchmarking clusters differ in the number of benchmarking partners and their cooperation forms. The benchmarking clusters also deal with different benchmarking subjects. Assessing the costs and quality of application systems, physical data processing systems, organizational structures of information management, and IT service processes are the most frequent benchmarking subjects. There is still potential for further activities within the benchmarking clusters to measure strategic and tactical information management, IT governance and the quality of data and data-processing processes. Based on the classification scheme and the comparison of the benchmarking clusters, we derive general recommendations for benchmarking of hospital information systems.

  3. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of a software-only implementation to a GPU-accelerated one. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows

  4. Application of fully ceramic microencapsulated fuels in light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Gentry, C.; George, N.; Maldonado, I. [Dept. of Nuclear Engineering, Univ. of Tennessee-Knoxville, Knoxville, TN 37996-2300 (United States); Godfrey, A.; Terrani, K.; Gehin, J. [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States)

    2012-07-01

    This study performs a preliminary evaluation of the feasibility of incorporation of Fully Ceramic Microencapsulated (FCM) fuels in light water reactors (LWRs). In particular, pin cell, lattice, and full core analyses are carried out on FCM fuel in a pressurized water reactor (PWR). Using uranium-based fuel and Pu/Np-based fuel in TRistructural isotropic (TRISO) particle form, each fuel design was examined using the SCALE 6.1 analytical suite. In regards to the uranium-based fuel, pin cell calculations were used to determine which fuel material performed best when implemented in the fuel kernel as well as the size of the kernel and surrounding particle layers. The higher fissile material density of uranium mononitride (UN) proved to be favorable, while the parametric studies showed that the FCM particle fuel design with 19.75% enrichment would need roughly 12% additional fissile material in comparison to that of a standard UO{sub 2} rod in order to match the lifetime of an 18-month PWR cycle. As part of the fuel assembly design evaluations, fresh feed lattices were modeled to analyze the within-assembly pin power peaking. Also, a 'color-set' array of assemblies was constructed to evaluate power peaking and power sharing between a once-burned and a fresh feed assembly. In regards to the Pu/Np-based fuel, lattice calculations were performed to determine an optimal lattice design based on reactivity behavior, pin power peaking, and isotopic content. After obtaining a satisfactory lattice design, the feasibility of core designs fully loaded with Pu/Np FCM lattices was demonstrated using the NESTLE three-dimensional core simulator. (authors)

  5. Application of Fully Ceramic Microencapsulated Fuels in Light Water Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Gentry, Cole A [ORNL; George, Nathan M [ORNL; Maldonado, G Ivan [ORNL; Godfrey, Andrew T [ORNL; Terrani, Kurt A [ORNL; Gehin, Jess C [ORNL

    2012-01-01

    This study aims to perform a preliminary evaluation of the feasibility of incorporation of Fully Ceramic Microencapsulated (FCM) fuels in Light Water Reactors (LWRs). In particular pin cell, lattice, and full core analyses are carried out on FCM fuel in a pressurized water reactor. Using uranium-based fuel and transuranic (TRU) based fuel in TRistructural ISOtropic (TRISO) particle form, each fuel design was examined using the SCALE 6.1 analytical suite. In regards to the uranium-based fuel, pin cell calculations were used to determine which fuel material performed best when implemented in the fuel kernel as well as the size of the kernel and surrounding particle layers. The higher physical density of uranium mononitride (UN) proved to be favorable, while the parametric studies showed that the FCM particle fuel design would need roughly 12% additional fissile material in comparison to that of a standard UO2 rod in order to match the lifetime of an 18-month PWR cycle. As part of the fuel assembly design evaluations, fresh feed lattices were modeled to analyze the within-assembly pin power peaking. Also, a color-set array of assemblies was constructed to evaluate power peaking and power sharing between a once-burned and a fresh feed assembly. In regards to the TRU based fuel, lattice calculations were performed to determine an optimal lattice design based on reactivity behavior, pin power peaking, and isotopic content. After obtaining a satisfactory lattice design, feasibility of core designs fully loaded with TRU FCM lattices was demonstrated using the NESTLE three-dimensional core simulator.

  6. Lattice topology dictates photon statistics.

    Science.gov (United States)

    Kondakci, H Esat; Abouraddy, Ayman F; Saleh, Bahaa E A

    2017-08-21

    Propagation of coherent light through a disordered network is accompanied by randomization and possible conversion into thermal light. Here, we show that network topology plays a decisive role in determining the statistics of the emerging field if the underlying lattice is endowed with chiral symmetry. In such lattices, eigenmodes come in skew-symmetric pairs with oppositely signed eigenvalues. By examining one-dimensional arrays of randomly coupled waveguides arranged on linear and ring topologies, we are led to a remarkable prediction: the field circularity and the photon statistics in ring lattices are dictated by the parity of the lattice, while the same quantities are insensitive to the parity of a linear lattice. For a ring lattice, adding or subtracting a single lattice site can switch the photon statistics from super-thermal to sub-thermal, or vice versa. This behavior is understood by examining the real and imaginary fields on a lattice exhibiting chiral symmetry, which form two strands that interleave along the lattice sites. These strands can be fully braided around an even-sited ring lattice thereby producing super-thermal photon statistics, while an odd-sited lattice is incommensurate with such an arrangement and the statistics become sub-thermal.
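
    A minimal numerical check of the chiral-symmetry statement above: for a nearest-neighbour ring of randomly coupled sites, the coupling matrix is bipartite, and hence its eigenvalues come in oppositely signed pairs, only when the number of sites is even. The coupling strengths below are arbitrary, and the sketch only probes the spectral pairing, not the photon statistics themselves.

```python
import numpy as np

def ring_coupling_matrix(n, rng):
    """Nearest-neighbour ring of n sites with random real couplings
    (zero on-site terms, as in a chirally symmetric waveguide array)."""
    h = np.zeros((n, n))
    for i in range(n):
        c = rng.uniform(0.5, 1.5)
        h[i, (i + 1) % n] = h[(i + 1) % n, i] = c
    return h

def paired_spectrum(h, tol=1e-9):
    """True if the eigenvalues come in oppositely signed (+e, -e) pairs,
    the spectral signature of chiral (sublattice) symmetry."""
    e = np.sort(np.linalg.eigvalsh(h))
    return bool(np.allclose(e, -e[::-1], atol=tol))

rng = np.random.default_rng(0)
for n in (6, 7):                       # even-sited vs. odd-sited ring
    print(n, "sites -> paired spectrum:", paired_spectrum(ring_coupling_matrix(n, rng)))
```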

  7. Full sphere hydrodynamic and dynamo benchmarks

    KAUST Repository

    Marti, P.

    2014-01-26

    Convection in planetary cores can generate fluid flow and magnetic fields, and a number of sophisticated codes exist to simulate the dynamic behaviour of such systems. We report on the first community activity to compare numerical results of computer codes designed to calculate fluid flow within a whole sphere. The flows are incompressible and rapidly rotating and the forcing of the flow is either due to thermal convection or due to moving boundaries. All problems defined have solutions that allow easy comparison, since they are either steady, slowly drifting or perfectly periodic. The first two benchmarks are defined based on uniform internal heating within the sphere under the Boussinesq approximation with boundary conditions that are uniform in temperature and stress-free for the flow. Benchmark 1 is purely hydrodynamic, and has a drifting solution. Benchmark 2 is a magnetohydrodynamic benchmark that can generate oscillatory, purely periodic, flows and magnetic fields. In contrast, Benchmark 3 is a hydrodynamic rotating bubble benchmark using no slip boundary conditions that has a stationary solution. Results from a variety of types of code are reported, including codes that are fully spectral (based on spherical harmonic expansions in angular coordinates and polynomial expansions in radius), mixed spectral and finite difference, finite volume, finite element and also a mixed Fourier-finite element code. There is good agreement between codes. It is found that in Benchmarks 1 and 2, the approximation of a whole sphere problem by a domain that is a spherical shell (a sphere possessing an inner core) does not represent an adequate approximation to the system, since the results differ from whole sphere results. © The Authors 2014. Published by Oxford University Press on behalf of The Royal Astronomical Society.

  8. Criteria of benchmark selection for efficient flexible multibody system formalisms

    Directory of Open Access Journals (Sweden)

    Valášek M.

    2007-10-01

    Full Text Available The paper deals with the selection process of benchmarks for testing and comparing efficient flexible multibody formalisms. The existing benchmarks are briefly summarized. The purposes for benchmark selection are investigated. The result of this analysis is the formulation of the criteria of benchmark selection for flexible multibody formalisms. Based on them, the initial set of suitable benchmarks is described. Besides that, the evaluation measures are revised and extended.

  9. Test Nationally, Benchmark Locally: Using Local DIBELS Benchmarks to Predict Performance on the Pssa

    Science.gov (United States)

    Ferchalk, Matthew R.

    2013-01-01

    The Dynamic Indicators of Basic Early Literacy Skills (DIBELS) benchmarks are frequently used to make important decisions regarding student performance. More information, however, is needed to understand if the nationally-derived benchmarks created by the DIBELS system provide the most accurate criterion for evaluating reading proficiency. The…

  10. Benchmarking local healthcare-associated infections: available benchmarks and interpretation challenges.

    Science.gov (United States)

    El-Saed, Aiman; Balkhy, Hanan H; Weber, David J

    2013-10-01

    Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI), which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude) HAI surveillance metrics without accounting or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restrictions. The characteristics of recognized benchmarks worldwide, including their advantages and limitations, are described. The choice of the right benchmark for the data from the Gulf Cooperation Council (GCC) states is challenging. The chosen benchmark should have similar data collection and presentation methods. Additionally, differences in surveillance environments including regulations should be taken into consideration when considering such a benchmark. The GCC center for infection control took some steps to unify HAI surveillance systems in the region. GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable health care workers and researchers to obtain more accurate and realistic comparisons.
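
    As a small sketch of one of the risk-adjustment methods listed above (indirect standardization), the code below computes a standardized infection ratio as observed over expected infections, where the expected count applies benchmark stratum-specific rates to local device-days. All rates, counts, and stratum names are hypothetical.

```python
def standardized_infection_ratio(local_strata, benchmark_rates):
    """Indirect standardization: SIR = observed / expected, where the expected
    count applies benchmark stratum-specific rates (infections per 1000
    device-days) to the local device-days in each stratum."""
    observed = sum(s["infections"] for s in local_strata.values())
    expected = sum(s["device_days"] / 1000.0 * benchmark_rates[name]
                   for name, s in local_strata.items())
    return observed / expected

# Hypothetical local surveillance data and benchmark rates by ICU type.
local = {"medical_icu": {"infections": 6, "device_days": 3200},
         "surgical_icu": {"infections": 4, "device_days": 2100}}
benchmark = {"medical_icu": 1.8, "surgical_icu": 1.5}   # per 1000 device-days
print(round(standardized_infection_ratio(local, benchmark), 2))
```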

  11. The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example

    Science.gov (United States)

    Steyn, H. J.

    2015-01-01

    Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…

  12. Ordered sets and lattices

    CERN Document Server

    Drashkovicheva, Kh; Igoshin, V I; Katrinyak, T; Kolibiar, M

    1989-01-01

    This book is another publication in the recent surveys of ordered sets and lattices. The papers, which might be characterized as "reviews of reviews," are based on articles reviewed in the Referativnyĭ Zhurnal: Matematika from 1978 to 1982. For the sake of completeness, the authors also attempted to integrate information from other relevant articles from that period. The bibliography of each paper provides references to the reviews in RZhMat and Mathematical Reviews where one can seek more detailed information. Specifically excluded from consideration in this volume were such topics as al

  13. Lattice Vibrations in Chlorobenzenes:

    DEFF Research Database (Denmark)

    Reynolds, P. A.; Kjems, Jørgen; White, J. W.

    1974-01-01

    Lattice vibrational dispersion curves for the ``intermolecular'' modes in the triclinic, one molecule per unit cell β phase of p‐C6D4Cl2 and p‐C6H4Cl2 have been obtained by inelastic neutron scattering. The deuterated sample was investigated at 295 and at 90°K and a linear extrapolation to 0°K...... by consideration of electrostatic forces or by further anisotropy in the dispersion forces not described in the atom‐atom model. Anharmonic effects are shown to be large, but the dominant features in the temperature variation of frequencies are describable by a quasiharmonic model....

  14. Features and technology of enterprise internal benchmarking

    Directory of Open Access Journals (Sweden)

    A.V. Dubodelova

    2013-06-01

    Full Text Available The aim of the article. The aim of the article is to generalize the characteristics, objectives, and advantages of internal benchmarking. A sequence of stages for internal benchmarking technology is formed, focused on continuous improvement of enterprise processes by implementing existing best practices. The results of the analysis. The business activity of domestic enterprises in a crisis business environment has to focus on the best success factors of their structural units, using standard research assessment of their performance and their innovative experience in practice. A modern method of satisfying those needs is internal benchmarking. According to Bain & Co, internal benchmarking is one of the three most common methods of business management. The features and benefits of benchmarking are defined in the article. The sequence and methodology of implementation of the individual stages of benchmarking technology projects are formulated. The authors define benchmarking as a strategic orientation towards the best achievement by comparing performance and working methods with a standard. It covers the processes of research, organization of production and distribution, and management and marketing methods applied to reference objects, in order to identify innovative practices and implement them in a particular business. The development of benchmarking at domestic enterprises requires analysis of its theoretical bases and practical experience. Choosing the best experience helps to develop recommendations for its application in practice. It is also essential to classify its types, identify its characteristics, study appropriate areas of use, and develop a methodology of implementation. The structure of internal benchmarking objectives includes: promoting research and establishment of minimum acceptable levels of efficiency of processes and activities which are available at the enterprise; identification of current problems and areas that need improvement without involvement of foreign experience

  15. Toxicological benchmarks for wildlife: 1994 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Opresko, D.M.; Sample, B.E.; Suter, G.W. II

    1994-09-01

    The process by which ecological risks of environmental contaminants are evaluated is two-tiered. The first tier is a screening assessment where concentrations of contaminants in the environment are compared to toxicological benchmarks which represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) that are presumed to be nonhazardous to the surrounding biota. The second tier is a baseline ecological risk assessment where toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. The report presents toxicological benchmarks for assessment of effects of 76 chemicals on 8 representative mammalian wildlife species and 31 chemicals on 9 avian wildlife species. The chemicals are some of those that occur at United States Department of Energy waste sites; the wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. Further descriptions of the chosen wildlife species and chemicals are provided in the report. The benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species. These benchmarks only consider contaminant exposure through oral ingestion of contaminated media; exposure through inhalation or direct dermal exposure are not considered in this report.

  16. Variational tensor network renormalization in imaginary time: Benchmark results in the Hubbard model at finite temperature

    Science.gov (United States)

    Czarnik, Piotr; Rams, Marek M.; Dziarmaga, Jacek

    2016-12-01

    A Gibbs operator e^{-βH} for a two-dimensional (2D) lattice system with a Hamiltonian H can be represented by a 3D tensor network, with the third dimension being the imaginary time (inverse temperature) β. Coarse graining the network along β results in a 2D projected entangled-pair operator (PEPO) with a finite bond dimension. The coarse graining is performed by a tree tensor network of isometries. They are optimized variationally to maximize the accuracy of the PEPO as a representation of the 2D thermal state e^{-βH}. The algorithm is applied to the two-dimensional Hubbard model on an infinite square lattice. Benchmark results at finite temperature are obtained that are consistent with the best cluster dynamical mean-field theory and power-series expansion in the regime of parameters where they yield mutually consistent results.
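
    The third (imaginary-time) dimension of the tensor network comes from slicing e^{-βH} into many thin factors. The sketch below is only a dense-matrix illustration of that slicing for a tiny two-spin toy Hamiltonian (a first-order Trotter split of two non-commuting terms), not the variational tensor-network algorithm of the paper; the model and parameters are arbitrary.

```python
import numpy as np
from scipy.linalg import expm

# Two non-commuting pieces of a tiny 2-spin toy Hamiltonian (illustrative only).
sx = 0.5 * np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.diag([0.5, -0.5])
H1 = np.kron(sx, sx)                 # "hopping-like" term
H2 = np.kron(sz, sz)                 # "interaction-like" term
H = H1 + H2

beta = 2.0
exact = expm(-beta * H)
for n_slices in (4, 16, 64, 256):
    dt = beta / n_slices
    slice_op = expm(-dt * H1) @ expm(-dt * H2)       # one imaginary-time slice
    approx = np.linalg.matrix_power(slice_op, n_slices)
    print(f"{n_slices:4d} slices: max deviation {np.abs(approx - exact).max():.2e}")
```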

  17. Lattice harmonics expansion revisited

    Science.gov (United States)

    Kontrym-Sznajd, G.; Holas, A.

    2017-04-01

    The main subject of the work is to provide the most effective way of determining the expansion of some quantities into orthogonal polynomials, when these quantities are known only along some limited number of sampling directions. By comparing the commonly used Houston method with the method based on the orthogonality relation, some relationships, which define the applicability and correctness of these methods, are demonstrated. They are verified for various sets of sampling directions applicable for expanding quantities having the full symmetry of the Brillouin zone of cubic and non-cubic lattices. All results clearly show that the Houston method is always better than the orthogonality-relation one. For the cubic symmetry we present a few sets of special directions (SDs) showing how their construction and, next, a proper application depend on the choice of various sets of lattice harmonics. SDs are important mainly for experimentalists who want to reconstruct anisotropic quantities from their measurements, performed at a limited number of sampling directions.

  18. Extreme lattices: symmetries and decorrelation

    Science.gov (United States)

    Andreanov, A.; Scardicchio, A.; Torquato, S.

    2016-11-01

    We study statistical and structural properties of extreme lattices, which are the local minima in the density landscape of lattice sphere packings in d-dimensional Euclidean space R^d. Specifically, we ascertain statistics of the densities and kissing numbers as well as the numbers of distinct symmetries of the packings for dimensions 8 through 13 using the stochastic Voronoi algorithm. The extreme lattices in a fixed dimension of space d (d ≥ 8) are dominated by typical lattices that have similar packing properties, such as packing densities and kissing numbers, while the best and the worst packers are in the long tails of the distribution of the extreme lattices. We also study the validity of the recently proposed decorrelation principle, which has important implications for sphere packings in general. The degree to which extreme-lattice packings decorrelate as well as how decorrelation is related to the packing density and symmetry of the lattices as the space dimension increases is also investigated. We find that the extreme lattices decorrelate with increasing dimension, while the least symmetric lattices decorrelate faster.

  19. A LATTICE BOLTZMANN SUBGRID MODEL FOR LID-DRIVEN CAVITY FLOW

    Institute of Scientific and Technical Information of China (English)

    YANG Fan; LIU Shu-hong; WU Yu-lin; TANG Xue-lin

    2005-01-01

    In recent years, the Lattice Boltzmann Method (LBM) has developed into an alternative and promising numerical scheme for simulating fluid flows and modeling physics in fluids. In order to apply LBM to high Reynolds number fluid flow applications, a subgrid turbulence model for LBM was introduced based on the standard Smagorinsky subgrid model and the Lattice Bhatnagar-Gross-Krook (LBGK) model. The subgrid LBGK model was subsequently used to simulate the two-dimensional driven cavity flow at high Reynolds numbers. The simulation results, including the distribution of streamlines, dimensionless velocity distributions, values of the stream function, as well as the location of the vortex center, were compared with benchmark solutions, with satisfactory agreement.
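
    The coupling between the Smagorinsky eddy viscosity and the LBGK relaxation time can be sketched as below, in lattice units with c_s^2 = 1/3 so that ν = (τ − 1/2)/3. Here the strain-rate magnitude is estimated from the resolved velocity field by finite differences, whereas LBM implementations often evaluate it locally from non-equilibrium moments; the Smagorinsky constant and the test velocity field are arbitrary assumptions for illustration.

```python
import numpy as np

def smagorinsky_tau(u, v, nu0, dx, c_smag=0.1):
    """Effective LBGK relaxation time tau_eff = 3*(nu0 + nu_t) + 0.5 in lattice
    units (c_s^2 = 1/3), with eddy viscosity nu_t = (C_s*dx)^2 * |S| and
    |S| = sqrt(2 S_ij S_ij). Here |S| is estimated from the resolved velocity
    field by central differences; many LBM codes evaluate it locally from the
    non-equilibrium moments instead."""
    dudx, dudy = np.gradient(u, dx)
    dvdx, dvdy = np.gradient(v, dx)
    sxx, syy, sxy = dudx, dvdy, 0.5 * (dudy + dvdx)
    s_mag = np.sqrt(2.0 * (sxx ** 2 + syy ** 2 + 2.0 * sxy ** 2))
    nu_t = (c_smag * dx) ** 2 * s_mag
    return 3.0 * (nu0 + nu_t) + 0.5

# Hypothetical resolved velocity field (Taylor-Green-like) on a small grid.
x = np.linspace(0.0, 2.0 * np.pi, 32)
dx = float(x[1] - x[0])
X, Y = np.meshgrid(x, x, indexing="ij")
u, v = np.sin(X) * np.cos(Y), -np.cos(X) * np.sin(Y)
tau = smagorinsky_tau(u, v, nu0=0.01, dx=dx)
print("tau_eff range:", float(tau.min()), "-", float(tau.max()))
```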

  20. A lattice Boltzmann coupled to finite volumes method for solving phase change problems

    Directory of Open Access Journals (Sweden)

    El Ganaoui Mohammed

    2009-01-01

    A numerical scheme coupling lattice Boltzmann and finite volume approaches has been developed and qualified on test cases of phase change problems. In this work, the coupled partial differential equations of momentum conservation are solved with a non-uniform lattice Boltzmann method, while the energy equation is discretized using a finite volume method. Simulations show the ability of this hybrid method to model the effects of convection and to predict heat transfer. Benchmarking is carried out for both conduction- and convection-dominated solid/liquid transitions, and comparisons are made against available analytical solutions and experimental results.
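
    To make the finite-volume half of such a hybrid scheme concrete, the sketch below advances a 1D enthalpy-method energy equation through a conduction-driven melting problem; the lattice Boltzmann momentum solver is omitted and all material data are illustrative.

        import numpy as np

        nx, dx, dt, steps = 100, 0.01, 1e-3, 20000
        alpha, latent, cp, t_melt = 1e-3, 2.0, 1.0, 0.0      # assumed material data

        temp = np.full(nx, -0.5)          # initially solid and slightly subcooled
        enthalpy = cp * temp.copy()       # sensible enthalpy only at the start
        temp[0] = 1.0                     # hot boundary drives the melting front

        for _ in range(steps):
            # Finite-volume flux balance for the energy equation (interior cells).
            lap = (temp[2:] - 2 * temp[1:-1] + temp[:-2]) / dx ** 2
            enthalpy[1:-1] += dt * alpha * cp * lap
            # Recover temperature and liquid fraction from the enthalpy field.
            liquid_frac = np.clip(enthalpy / latent, 0.0, 1.0)
            temp[1:-1] = np.where(enthalpy[1:-1] < 0.0, enthalpy[1:-1] / cp,
                                  np.where(enthalpy[1:-1] > latent,
                                           (enthalpy[1:-1] - latent) / cp, t_melt))

        print("cells fully melted:", int(np.sum(liquid_frac[1:-1] >= 1.0)))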

  1. Coral benchmarks in the center of biodiversity.

    Science.gov (United States)

    Licuanan, W Y; Robles, R; Dygico, M; Songco, A; van Woesik, R

    2017-01-30

    There is an urgent need to quantify coral reef benchmarks that assess changes and recovery rates through time and serve as goals for management. Yet, few studies have identified benchmarks for hard coral cover and diversity in the center of marine diversity. In this study, we estimated coral cover and generic diversity benchmarks on the Tubbataha reefs, the largest and best-enforced no-take marine protected area in the Philippines. The shallow (2-6m) reef slopes of Tubbataha were monitored annually, from 2012 to 2015, using hierarchical sampling. Mean coral cover was 34% (σ±1.7) and generic diversity was 18 (σ±0.9) per 75m by 25m station. The southeastern leeward slopes supported on average 56% coral cover, whereas the northeastern windward slopes supported 30%, and the western slopes supported 18% coral cover. Generic diversity was more spatially homogeneous than coral cover.

  2. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    Prior research documents positive effects of benchmarking information provision on performance and attributes this to social comparisons. However, the effects on professional recipients are unclear. Studies of professional control indicate that professional recipients often resist bureaucratic...... controls because of organizational-professional conflicts. We therefore analyze the association between bureaucratic benchmarking information provision and professional performance and suggest that the association is more positive if prior professional performance was low. We test our hypotheses based...... and professional performance but only if prior professional performance was low. Supplemental analyses support the robustness of our results. Findings indicate conditions under which bureaucratic benchmarking information may affect professional performance and advance research on professional control and social...

  3. The national hydrologic bench-mark network

    Science.gov (United States)

    Cobb, Ernest D.; Biesecker, J.E.

    1971-01-01

    The United States is undergoing a dramatic growth of population and demands on its natural resources. The effects are widespread and often produce significant alterations of the environment. The hydrologic bench-mark network was established to provide data on stream basins which are little affected by these changes. The network is made up of selected stream basins which are not expected to be significantly altered by man. Data obtained from these basins can be used to document natural changes in hydrologic characteristics with time, to provide a better understanding of the hydrologic structure of natural basins, and to provide a comparative base for studying the effects of man on the hydrologic environment. There are 57 bench-mark basins in 37 States. These basins are in areas having a wide variety of climate and topography. The bench-mark basins and the types of data collected in the basins are described.

  4. DWEB: A Data Warehouse Engineering Benchmark

    CERN Document Server

    Darmont, Jérôme; Boussaïd, Omar

    2005-01-01

    Data warehouse architectural choices and optimization techniques are critical to decision support query performance. To facilitate these choices, the performance of the designed data warehouse must be assessed. This is usually done with the help of benchmarks, which can either help system users compare the performance of different systems, or help system engineers test the effect of various design choices. While the TPC standard decision support benchmarks address the first point, they are not tuneable enough to address the second one and fail to model different data warehouse schemas. By contrast, our Data Warehouse Engineering Benchmark (DWEB) allows various ad-hoc synthetic data warehouses and workloads to be generated. DWEB is fully parameterized to fulfill data warehouse design needs. However, two levels of parameterization keep it relatively easy to tune. Finally, DWEB is implemented as free Java software that can be interfaced with most existing relational database management systems. A sample usag...

  5. Benchmarking optimization solvers for structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    The purpose of this article is to benchmark different optimization solvers when applied to various finite element based structural topology optimization problems. An extensive and representative library of minimum compliance, minimum volume, and mechanism design problem instances for different...... sizes is developed for this benchmarking. The problems are based on a material interpolation scheme combined with a density filter. Different optimization solvers including Optimality Criteria (OC), the Method of Moving Asymptotes (MMA) and its globally convergent version GCMMA, the interior point...... profiles conclude that general solvers are as efficient and reliable as classical structural topology optimization solvers. Moreover, the use of the exact Hessians in SAND formulations, generally produce designs with better objective function values. However, with the benchmarked implementations solving...

  6. Energy benchmarking of South Australian WWTPs.

    Science.gov (United States)

    Krampe, J

    2013-01-01

    Optimising the energy consumption and energy generation of wastewater treatment plants (WWTPs) is a topic with increasing importance for water utilities in times of rising energy costs and pressures to reduce greenhouse gas (GHG) emissions. Assessing the energy efficiency and energy optimisation of a WWTP are difficult tasks as most plants vary greatly in size, process layout and other influencing factors. To overcome these limits it is necessary to compare energy efficiency with a statistically relevant base to identify shortfalls and optimisation potential. Such energy benchmarks have been successfully developed and used in central Europe over the last two decades. This paper demonstrates how the latest available energy benchmarks from Germany have been applied to 24 WWTPs in South Australia. It shows how energy benchmarking can be used to identify shortfalls in current performance, prioritise detailed energy assessments and help inform decisions on capital investment.

  7. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt;

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks......' and the consultancy house's data stays confidential, the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much...... debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping...

  8. FGK Benchmark Stars: A new metallicity scale

    CERN Document Server

    Jofre, Paula; Soubiran, C; Blanco-Cuaresma, S; Pancino, E; Bergemann, M; Cantat-Gaudin, T; Hernandez, J I Gonzalez; Hill, V; Lardo, C; de Laverny, P; Lind, K; Magrini, L; Masseron, T; Montes, D; Mucciarelli, A; Nordlander, T; Recio-Blanco, A; Sobeck, J; Sordo, R; Sousa, S G; Tabernero, H; Vallenari, A; Van Eck, S; Worley, C C

    2013-01-01

    In the era of large spectroscopic surveys of stars of the Milky Way, atmospheric parameter pipelines require reference stars to evaluate and homogenize their values. We provide a new metallicity scale for the FGK benchmark stars based on their corresponding fundamental effective temperature and surface gravity. This was done by analyzing homogeneously with up to seven different methods a spectral library of benchmark stars. Although our direct aim was to provide a reference metallicity to be used by the Gaia-ESO Survey, the fundamental effective temperatures and surface gravities of benchmark stars of Heiter et al. 2013 (in prep) and their metallicities obtained in this work can also be used as reference parameters for other ongoing surveys, such as Gaia, HERMES, RAVE, APOGEE and LAMOST.

  9. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    Professionals are often expected to be reluctant with regard to bureaucratic controls because of assumed conflicting values and goals of the organization vis-à-vis the profession. We suggest however, that the provision of bureaucratic benchmarking information is positively associated with profess...... for 191 orthopaedics departments of German hospitals matched with survey data on bureaucratic benchmarking information provision to the chief physician of the respective department. Professional performance is publicly disclosed due to regulatory requirements. At the same time, chief physicians typically...

  10. Shielding Integral Benchmark Archive and Database (SINBAD)

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [ORNL]; Grove, Robert E [ORNL]; Kodeli, I. [International Atomic Energy Agency (IAEA)]; Sartori, Enrico [ORNL]; Gulliford, J. [OECD Nuclear Energy Agency]

    2011-01-01

    The Shielding Integral Benchmark Archive and Database (SINBAD) collection of benchmarks was initiated in the early 1990s. SINBAD is an international collaboration between the Organization for Economic Cooperation and Development's Nuclear Energy Agency Data Bank (OECD/NEADB) and the Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL). SINBAD is a major attempt to compile experiments and corresponding computational models with the goal of preserving institutional knowledge and expertise that need to be handed down to future scientists. SINBAD is also a learning tool for university students and scientists who need to design experiments or gain expertise in modeling and simulation. The SINBAD database is currently divided into three categories: fission, fusion, and accelerator benchmarks. Where possible, each experiment is described and analyzed using deterministic or probabilistic (Monte Carlo) radiation transport software.

  11. A Benchmarking System for Domestic Water Use

    Directory of Open Access Journals (Sweden)

    Dexter V. L. Hunt

    2014-05-01

    The national demand for water in the UK is predicted to increase, exacerbated by a growing UK population and home-grown demands for energy and food. When set against the context of overstretched existing supply sources vulnerable to droughts, particularly in increasingly dense city centres, the delicate balance of matching minimal demands with resource-secure supplies becomes critical. When making changes to "internal" demands, the role of technological efficiency and user behaviour cannot be ignored, yet existing benchmarking systems traditionally do not consider the latter. This paper investigates the practicalities of adopting a domestic benchmarking system (using a band rating) that allows individual users to assess their current water use performance against what is possible. The benchmarking system allows users to achieve higher benchmarks through any approach that reduces water consumption. The sensitivity of water use benchmarks is investigated by making changes to user behaviour and technology. The impacts of adopting localised supplies (i.e., rainwater harvesting (RWH) and grey water (GW)) and of including "external" gardening demands are investigated. This includes the impacts (in isolation and combination) of the following: occupancy rates (1 to 4); roof size (12.5 m2 to 100 m2); garden size (25 m2 to 100 m2); and geographical location (North West, Midlands and South East, UK) with yearly temporal effects (i.e., rainfall and temperature). Lessons learnt from analysis of the proposed benchmarking system are presented throughout this paper, in particular its compatibility with the existing Code for Sustainable Homes (CSH) accreditation system. Conclusions are subsequently drawn on the robustness of the proposed system.
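
    A hypothetical sketch of the kind of band-rating calculation described above: household demand is reduced by any locally supplied water (RWH/GW) and the per-capita result is mapped onto bands. The band limits and figures are placeholders, not those of the paper or of the Code for Sustainable Homes.

        def water_band(occupants, litres_per_day, rwh_yield=0.0, gw_yield=0.0):
            """Return per-capita demand (litres/person/day) and a hypothetical band."""
            net = max(litres_per_day - rwh_yield - gw_yield, 0.0)
            per_capita = net / occupants
            for limit, band in [(80, "A"), (100, "B"), (120, "C"), (140, "D")]:
                if per_capita <= limit:
                    return per_capita, band
            return per_capita, "E"

        print(water_band(occupants=3, litres_per_day=420, rwh_yield=60, gw_yield=30))
        # -> (110.0, 'C'); meeting more demand locally moves the dwelling up a band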

  12. Elimination of spurious lattice fermion solutions and noncompact lattice QCD

    Energy Technology Data Exchange (ETDEWEB)

    Lee, T.D.

    1997-09-22

    It is well known that the Dirac equation on a discrete hyper-cubic lattice in D dimensions has 2^D degenerate solutions. The usual method of removing these spurious solutions encounters difficulties with chiral symmetry when the lattice spacing l ≠ 0, as exemplified by the persistent problem of the pion mass. On the other hand, we recall that in any crystal in nature, all the electrons do move in a lattice and satisfy the Dirac equation; yet there is not a single physical result that has ever been entangled with a spurious fermion solution. Therefore it should not be difficult to eliminate these unphysical elements. On a discrete lattice, particles hop from point to point, whereas in a real crystal the lattice structure is embedded in a continuum and electrons move continuously from lattice cell to lattice cell. In a discrete system, the lattice functions are defined only on individual points (or links as in the case of gauge fields). However, in a crystal the electron state vector is represented by the Bloch wave functions, which are continuous functions in γ⃗, and herein lies one of the essential differences.

  13. Benchmarking Danish Vocational Education and Training Programmes

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Wittrup, Jesper

    This study paper discusses methods whereby Danish vocational education and training colleges can be benchmarked, and presents results from a number of models. It is conceptually complicated to benchmark vocational colleges, as the various colleges in Denmark offer a wide range of course programmes....... This makes it difficult to compare the resources used, since some programmes by their nature require more classroom time and equipment than others. It is also far from straightforward to compare college effects with respect to grades, since the various programmes apply very different forms of assessment...

  14. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks......' and the consultancy house's data stays confidential, the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much...

  15. Benchmarking af kommunernes førtidspensionspraksis

    DEFF Research Database (Denmark)

    Gregersen, Ole

    Each year, the National Social Appeals Board (Den Sociale Ankestyrelse) publishes statistics on decisions in disability pension cases. In connection with the annual statistics, results are published from a benchmarking model in which the number of awards in an individual municipality is compared with the expected number of awards had the municipality applied...... the same decision practice as the "average municipality", after correcting for the social structure of the municipality. The benchmarking model used to date is documented in Ole Gregersen (1994): Kommunernes Pensionspraksis, Servicerapport, Socialforskningsinstituttet. This note documents a...

  16. Benchmarking of Heavy Ion Transport Codes

    Energy Technology Data Exchange (ETDEWEB)

    Remec, Igor [ORNL]; Ronningen, Reginald M. [Michigan State University, East Lansing]; Heilbronn, Lawrence [University of Tennessee, Knoxville (UTK)]

    2011-01-01

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in designing and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  17. Toxicological benchmarks for wildlife: 1996 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W., II

    1996-06-01

    The purpose of this report is to present toxicological benchmarks for assessment of effects of certain chemicals on mammalian and avian wildlife species. Publication of this document meets a milestone for the Environmental Restoration (ER) Risk Assessment Program. This document provides the ER Program with toxicological benchmarks that may be used as comparative tools in screening assessments as well as lines of evidence to support or refute the presence of ecological effects in ecological risk assessments. The chemicals considered in this report are some that occur at US DOE waste sites, and the wildlife species evaluated herein were chosen because they represent a range of body sizes and diets.

  18. Lattice Boltzmann Model for Compressible Fluid on a Square Lattice

    Institute of Scientific and Technical Information of China (English)

    SUN Cheng-Hai

    2000-01-01

    A two-level four-direction lattice Boltzmann model is formulated on a square lattice to simulate compressible flows with a high Mach number. The particle velocities are adaptive to the mean velocity and internal energy. Therefore, the mean flow can have a high Mach number. Due to the simple form of the equilibrium distribution, the 4th order velocity tensors are not involved in the calculations. Unlike the standard lattice Boltzmann model, no special treatment is needed for the homogeneity of 4th order velocity tensors on square lattices. The Navier-Stokes equations were derived by the Chapman-Enskog method from the BGK Boltzmann equation. The model can be easily extended to three-dimensional cubic lattices. Two-dimensional shock-wave propagation was simulated.

  19. Entangling gates in even Euclidean lattices such as Leech lattice

    CERN Document Server

    Planat, Michel

    2010-01-01

    We point out an organic relationship between real entangling n-qubit gates of quantum computation and the group of automorphisms of even Euclidean lattices of the corresponding dimension 2n. The type of entanglement that is found in the gates/generators of Aut() depends on the lattice. In particular, we investigate Zn lattices, Barnes-Wall lattices D4, E8, 16 (associated to n = 2, 3 and 4 qubits), and the Leech lattices h24 and 24 (associated to a 3-qubit/qutrit system). Balanced tripartite entanglement is found to be a basic feature of Aut(), a finding that bears out our recent work related to the Weyl group of E8 [1, 2].

  20. Nuclear Fuel Design Technology Development for the Future Fuel

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Yang Hyun; Lee, Byung Ho; Cheon, Jin Sik; Oh, Je Yong; Yim, Jeong Sik; Sohn, Dong Seong; Lee, Byung Uk; Ko, Han Suk; So, Dong Sup; Koo, Dae Seo

    2006-04-15

    The test MOX fuels have been irradiated in the Halden reactor, and their burnup reached 40 GWd/t as of October 2005. The fuel temperature and internal pressure were measured by sensors installed in the fuels and test rig. The COSMOS code, which was developed by KAERI, predicted the in-reactor behavior of MOX fuel well. The COSMOS code was verified against OECD-NEA benchmarks, and the results confirmed the superiority of the COSMOS code. The MOX in-pile database from Halden (IFA-629.3, IFA-610.2 and 4) was also used for verification of the code. The COSMOS code was improved by introducing a Graphical User Interface (GUI) and a batch mode. A PCMI analysis module was developed and introduced along with the new fission gas behavior model. Irradiation tests performed under arbitrary rod internal pressure could also be analyzed with the COSMOS code. Several presentations were made in preparation for transferring the MOX fuel performance analysis code to industry, and the transfer of the COSMOS code to industry is being discussed. The user manual and the COSMOS program (executable file) were provided for industry to test the performance of the COSMOS code. To chart the direction of future research, MOX fuel research trends in other countries, especially the USA's GNEP policy, were analyzed.

  1. Introduction to lattice gauge theory

    Science.gov (United States)

    Gupta, R.

    The lattice formulation of Quantum Field Theory (QFT) can be exploited in many ways. We can derive the lattice Feynman rules and carry out weak coupling perturbation expansions. The lattice then serves as a manifestly gauge invariant regularization scheme, albeit one that is more complicated than standard continuum schemes. Strong coupling expansions: these give us useful qualitative information, but unfortunately no hard numbers. The lattice theory is amenable to numerical simulations by which one calculates the long distance properties of a strongly interacting theory from first principles. The observables are measured as a function of the bare coupling g and a gauge invariant cut-off ≈ 1/α, where α is the lattice spacing. The continuum (physical) behavior is recovered in the limit α → 0, at which point the lattice artifacts go to zero. This is the more powerful use of lattice formulation, so in these lectures the author focuses on setting up the theory for the purpose of numerical simulations to get hard numbers. The numerical techniques used in Lattice Gauge Theories have their roots in statistical mechanics, so it is important to develop an intuition for the interconnection between quantum mechanics and statistical mechanics.

  2. Dark matter on the lattice

    OpenAIRE

    Lewis, Randy

    2014-01-01

    Several collaborations have recently performed lattice calculations aimed specifically at dark matter, including work with SU(2), SU(3), SU(4) and SO(4) gauge theories to represent the dark sector. Highlights of these studies are presented here, after a reminder of how lattice calculations in QCD itself are helping with the hunt for dark matter.

  3. Fast simulation of lattice systems

    DEFF Research Database (Denmark)

    Bohr, H.; Kaznelson, E.; Hansen, Frank;

    1983-01-01

    A new computer system with an entirely new processor design is described and demonstrated on a very small trial lattice. The new computer simulates systems of differential equations of the order of 10^4 times faster than present day computers and we describe how the machine can be applied to lattice...

  4. Branes and integrable lattice models

    CERN Document Server

    Yagi, Junya

    2016-01-01

    This is a brief review of my work on the correspondence between four-dimensional $\mathcal{N} = 1$ supersymmetric field theories realized by brane tilings and two-dimensional integrable lattice models. I explain how to construct integrable lattice models from extended operators in partially topological quantum field theories, and elucidate the correspondence as an application of this construction.

  5. Charmed baryons on the lattice

    CERN Document Server

    Padmanath, M

    2015-01-01

    We discuss the significance of charm baryon spectroscopy in hadron physics and review the recent developments of the spectra of charmed baryons in lattice calculations. Special emphasis is given on the recent studies of highly excited charm baryon states. Recent precision lattice measurements of the low lying charm and bottom baryons are also reviewed.

  6. Quantum phases in optical lattices

    NARCIS (Netherlands)

    Dickerscheid, Dennis Brian Martin

    2006-01-01

    An important new development in the field of ultracold atomic gases is the study of the properties of these gases in a so-called optical lattice. An optical lattice is a periodic trapping potential for the atoms that is formed by the interference pattern of a few laser beams. A reason for the

  7. Lattice Induced Transparency in Metasurfaces

    CERN Document Server

    Manjappa, Manukumara; Singh, Ranjan

    2016-01-01

    Lattice modes are intrinsic to the periodic structures and their occurrence can be easily tuned and controlled by changing the lattice constant of the structural array. Previous studies have revealed excitation of sharp absorption resonances due to lattice mode coupling with the plasmonic resonances. Here, we report the first experimental observation of a lattice induced transparency (LIT) by coupling the first order lattice mode (FOLM) to the structural resonance of a metamaterial resonator at terahertz frequencies. The observed sharp transparency is a result of the destructive interference between the bright mode and the FOLM mediated dark mode. As the FOLM is swept across the metamaterial resonance, the transparency band undergoes large change in its bandwidth and resonance position. Besides controlling the transparency behaviour, LIT also shows a huge enhancement in the Q-factor and record high group delay of 28 ps, which could be pivotal in ultrasensitive sensing and slow light device applications.

  8. Lattice models of ionic systems

    Science.gov (United States)

    Kobelev, Vladimir; Kolomeisky, Anatoly B.; Fisher, Michael E.

    2002-05-01

    A theoretical analysis of Coulomb systems on lattices in general dimensions is presented. The thermodynamics is developed using Debye-Hückel theory with ion-pairing and dipole-ion solvation, specific calculations being performed for three-dimensional lattices. As for continuum electrolytes, low-density results for simple cubic (sc), body-centered cubic (bcc), and face-centered cubic (fcc) lattices indicate the existence of gas-liquid phase separation. The predicted critical densities have values comparable to those of continuum ionic systems, while the critical temperatures are 60%-70% higher. However, when the possibility of sublattice ordering as well as Debye screening is taken into account systematically, order-disorder transitions and a tricritical point are found on sc and bcc lattices, and gas-liquid coexistence is suppressed. Our results agree with recent Monte Carlo simulations of lattice electrolytes.

  9. Lattice quantum chromodynamics practical essentials

    CERN Document Server

    Knechtli, Francesco; Peardon, Michael

    2017-01-01

    This book provides an overview of the techniques central to lattice quantum chromodynamics, including modern developments. The book has four chapters. The first chapter explains the formulation of quarks and gluons on a Euclidean lattice. The second chapter introduces Monte Carlo methods and details the numerical algorithms to simulate lattice gauge fields. Chapter three explains the mathematical and numerical techniques needed to study quark fields and the computation of quark propagators. The fourth chapter is devoted to the physical observables constructed from lattice fields and explains how to measure them in simulations. The book is aimed at enabling graduate students who are new to the field to carry out explicitly the first steps and prepare them for research in lattice QCD.

  10. Algorithm and Architecture Independent Benchmarking with SEAK

    Energy Technology Data Exchange (ETDEWEB)

    Tallent, Nathan R.; Manzano Franco, Joseph B.; Gawande, Nitin A.; Kang, Seung-Hwa; Kerbyson, Darren J.; Hoisie, Adolfy; Cross, Joseph

    2016-05-23

    Many applications of high performance embedded computing are limited by performance or power bottlenecks. We have designed the Suite for Embedded Applications & Kernels (SEAK), a new benchmark suite, (a) to capture these bottlenecks in a way that encourages creative solutions; and (b) to facilitate rigorous, objective, end-user evaluation for their solutions. To avoid biasing solutions toward existing algorithms, SEAK benchmarks use a mission-centric (abstracted from a particular algorithm) and goal-oriented (functional) specification. To encourage solutions that are any combination of software or hardware, we use an end-user black-box evaluation that can capture tradeoffs between performance, power, accuracy, size, and weight. The tradeoffs are especially informative for procurement decisions. We call our benchmarks future proof because each mission-centric interface and evaluation remains useful despite shifting algorithmic preferences. It is challenging to create both concise and precise goal-oriented specifications for mission-centric problems. This paper describes the SEAK benchmark suite and presents an evaluation of sample solutions that highlights power and performance tradeoffs.

  11. A human benchmark for language recognition

    NARCIS (Netherlands)

    Orr, R.; Leeuwen, D.A. van

    2009-01-01

    In this study, we explore a human benchmark in language recognition, for the purpose of comparing human performance to machine performance in the context of the NIST LRE 2007. Humans are categorised in terms of language proficiency, and performance is presented per proficiency. The main challenge in

  12. Benchmarking Year Five Students' Reading Abilities

    Science.gov (United States)

    Lim, Chang Kuan; Eng, Lin Siew; Mohamed, Abdul Rashid

    2014-01-01

    Reading and understanding a written text is one of the most important skills in English learning. This study attempts to benchmark Year Five students' reading abilities in fifteen rural schools in a district in Malaysia. The objectives of this study are to develop a set of standardised written reading comprehension and a set of indicators to inform…

  13. Benchmark Generation and Simulation at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Lagadapati, Mahesh [North Carolina State University (NCSU), Raleigh]; Mueller, Frank [North Carolina State University (NCSU), Raleigh]; Engelmann, Christian [ORNL]

    2016-01-01

    The path to extreme scale high-performance computing (HPC) poses several challenges related to power, performance, resilience, productivity, programmability, data movement, and data management. Investigating the performance of parallel applications at scale on future architectures and the performance impact of different architectural choices is an important component of HPC hardware/software co-design. Simulations using models of future HPC systems and communication traces from applications running on existing HPC systems can offer an insight into the performance of future architectures. This work targets technology developed for scalable application tracing of communication events. It focuses on extreme-scale simulation of HPC applications and their communication behavior via lightweight parallel discrete event simulation for performance estimation and evaluation. Instead of simply replaying a trace within a simulator, this work promotes the generation of a benchmark from traces. This benchmark is subsequently exposed to simulation using models to reflect the performance characteristics of future-generation HPC systems. This technique provides a number of benefits, such as eliminating the data intensive trace replay and enabling simulations at different scales. The presented work features novel software co-design aspects, combining the ScalaTrace tool to generate scalable trace files, the ScalaBenchGen tool to generate the benchmark, and the xSim tool to assess the benchmark characteristics within a simulator.

  14. A Benchmark and Simulator for UAV Tracking

    KAUST Repository

    Mueller, Matthias

    2016-09-16

    In this paper, we propose a new aerial video dataset and benchmark for low altitude UAV target tracking, as well as, a photorealistic UAV simulator that can be coupled with tracking methods. Our benchmark provides the first evaluation of many state-of-the-art and popular trackers on 123 new and fully annotated HD video sequences captured from a low-altitude aerial perspective. Among the compared trackers, we determine which ones are the most suitable for UAV tracking both in terms of tracking accuracy and run-time. The simulator can be used to evaluate tracking algorithms in real-time scenarios before they are deployed on a UAV “in the field”, as well as, generate synthetic but photo-realistic tracking datasets with automatic ground truth annotations to easily extend existing real-world datasets. Both the benchmark and simulator are made publicly available to the vision community on our website to further research in the area of object tracking from UAVs. (https://ivul.kaust.edu.sa/Pages/pub-benchmark-simulator-uav.aspx.). © Springer International Publishing AG 2016.

  15. Thermodynamic benchmark study using Biacore technology

    NARCIS (Netherlands)

    Navratilova, I.; Papalia, G.A.; Rich, R.L.; Bedinger, D.; Brophy, S.; Condon, B.; Deng, T.; Emerick, A.W.; Guan, H.W.; Hayden, T.; Heutmekers, T.; Hoorelbeke, B.; McCroskey, M.C.; Murphy, M.M.; Nakagawa, T.; Parmeggiani, F.; Xiaochun, Q.; Rebe, S.; Nenad, T.; Tsang, T.; Waddell, M.B.; Zhang, F.F.; Leavitt, S.; Myszka, D.G.

    2007-01-01

    A total of 22 individuals participated in this benchmark study to characterize the thermodynamics of small-molecule inhibitor-enzyme interactions using Biacore instruments. Participants were provided with reagents (the enzyme carbonic anhydrase II, which was immobilized onto the sensor surface, and

  16. Benchmarking European Gas Transmission System Operators

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter; Trinkner, Urs

    This is the final report for the pan-European efficiency benchmarking of gas transmission system operations commissioned by the Netherlands Authority for Consumers and Markets (ACM), Den Haag, on behalf of the Council of European Energy Regulators (CEER) under the supervision of the authors....

  17. Alberta K-12 ESL Proficiency Benchmarks

    Science.gov (United States)

    Salmon, Kathy; Ettrich, Mike

    2012-01-01

    The Alberta K-12 ESL Proficiency Benchmarks are organized by division: kindergarten, grades 1-3, grades 4-6, grades 7-9, and grades 10-12. They are descriptors of language proficiency in listening, speaking, reading, and writing. The descriptors are arranged in a continuum of seven language competences across five proficiency levels. Several…

  18. Seven Benchmarks for Information Technology Investment.

    Science.gov (United States)

    Smallen, David; Leach, Karen

    2002-01-01

    Offers benchmarks to help campuses evaluate their efforts in supplying information technology (IT) services. The first three help understand the IT budget, the next three provide insight into staffing levels and emphases, and the seventh relates to the pervasiveness of institutional infrastructure. (EV)

  19. Benchmarking Peer Production Mechanisms, Processes & Practices

    Science.gov (United States)

    Fischer, Thomas; Kretschmer, Thomas

    2008-01-01

    This deliverable identifies key approaches for quality management in peer production by benchmarking peer production practices and processes in other areas. (Contains 29 footnotes, 13 figures and 2 tables.)[This report has been authored with contributions of: Kaisa Honkonen-Ratinen, Matti Auvinen, David Riley, Jose Pinzon, Thomas Fischer, Thomas…

  20. Operational benchmarking of Japanese and Danish hospitals

    DEFF Research Database (Denmark)

    Traberg, Andreas; Itoh, Kenji; Jacobsen, Peter

    2010-01-01

    This benchmarking model is designed as an integration of three organizational dimensions suited for the healthcare sector. The model incorporates posterior operational indicators, and evaluates upon aggregation of performance. The model is tested upon seven cases from Japan and Denmark. Japanese...

  1. Simple benchmark for complex dose finding studies.

    Science.gov (United States)

    Cheung, Ying Kuen

    2014-06-01

    While a general goal of early phase clinical studies is to identify an acceptable dose for further investigation, modern dose finding studies and designs are highly specific to individual clinical settings. In addition, as outcome-adaptive dose finding methods often involve complex algorithms, it is crucial to have diagnostic tools to evaluate the plausibility of a method's simulated performance and the adequacy of the algorithm. In this article, we propose a simple technique that provides an upper limit, or a benchmark, of accuracy for dose finding methods for a given design objective. The proposed benchmark is nonparametric optimal in the sense of O'Quigley et al. (2002, Biostatistics 3, 51-56), and is demonstrated by examples to be a practical accuracy upper bound for model-based dose finding methods. We illustrate the implementation of the technique in the context of phase I trials that consider multiple toxicities and phase I/II trials where dosing decisions are based on both toxicity and efficacy, and apply the benchmark to several clinical examples considered in the literature. By comparing the operating characteristics of a dose finding method to that of the benchmark, we can form quick initial assessments of whether the method is adequately calibrated and evaluate its sensitivity to the dose-outcome relationships.
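
    A minimal sketch of the nonparametric benchmark idea for a single toxicity endpoint: each simulated patient carries a latent tolerance, so their outcome at every dose is known, and the benchmark selects the dose whose complete-data toxicity estimate is closest to the target. The dose-toxicity scenario below is illustrative only.

        import numpy as np

        rng = np.random.default_rng(0)
        true_tox = np.array([0.05, 0.12, 0.25, 0.40, 0.55])    # assumed scenario
        target, n_patients, n_sims, correct = 0.25, 30, 2000, 0
        best_dose = int(np.argmin(np.abs(true_tox - target)))  # the true target dose here

        for _ in range(n_sims):
            u = rng.uniform(size=n_patients)                   # latent tolerances
            tox_at_dose = u[:, None] < true_tox[None, :]       # outcome at every dose
            p_hat = tox_at_dose.mean(axis=0)                   # complete-data estimates
            if int(np.argmin(np.abs(p_hat - target))) == best_dose:
                correct += 1

        print("benchmark accuracy:", correct / n_sims)         # upper bound on correct selection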

  2. Benchmarking 2010: Trends in Education Philanthropy

    Science.gov (United States)

    Bearman, Jessica

    2010-01-01

    "Benchmarking 2010" offers insights into the current priorities, practices and concerns of education grantmakers. The report is divided into five sections: (1) Mapping the Education Grantmaking Landscape; (2) 2010 Funding Priorities; (3) Strategies for Leveraging Greater Impact; (4) Identifying Significant Trends in Education Funding; and (5)…

  3. Benchmark Experiment for Beryllium Slab Samples

    Institute of Scientific and Technical Information of China (English)

    NIE Yang-bo; BAO Jie; HAN Rui; RUAN Xi-chao; REN Jie; HUANG Han-xiong; ZHOU Zu-ying

    2015-01-01

    In order to validate the evaluated nuclear data on beryllium, a benchmark experiment has been performed at the China Institute of Atomic Energy (CIAE). Neutron leakage spectra from pure beryllium slab samples (10 cm × 10 cm × 11 cm) were measured at 61° and 121° using time-of-

  4. Benchmarking 2011: Trends in Education Philanthropy

    Science.gov (United States)

    Grantmakers for Education, 2011

    2011-01-01

    The analysis in "Benchmarking 2011" is based on data from an unduplicated sample of 184 education grantmaking organizations--approximately two-thirds of Grantmakers for Education's (GFE's) network of grantmakers--who responded to an online survey consisting of fixed-choice and open-ended questions. Because a different subset of funders elects to…

  5. Cleanroom Energy Efficiency: Metrics and Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    International SEMATECH Manufacturing Initiative; Mathew, Paul A.; Tschudi, William; Sartor, Dale; Beasley, James

    2010-07-07

    Cleanrooms are among the most energy-intensive types of facilities. This is primarily due to the cleanliness requirements that result in high airflow rates and system static pressures, as well as process requirements that result in high cooling loads. Various studies have shown that there is a wide range of cleanroom energy efficiencies and that facility managers may not be aware of how energy efficient their cleanroom facility can be relative to other cleanroom facilities with the same cleanliness requirements. Metrics and benchmarks are an effective way to compare one facility to another and to track the performance of a given facility over time. This article presents the key metrics and benchmarks that facility managers can use to assess, track, and manage their cleanroom energy efficiency or to set energy efficiency targets for new construction. These include system-level metrics such as air change rates, air handling W/cfm, and filter pressure drops. Operational data are presented from over 20 different cleanrooms that were benchmarked with these metrics and that are part of the cleanroom benchmark dataset maintained by Lawrence Berkeley National Laboratory (LBNL). Overall production efficiency metrics for cleanrooms in 28 semiconductor manufacturing facilities in the United States and recorded in the Fabs21 database are also presented.
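
    Two of the system-level metrics mentioned above reduce to simple arithmetic; the sketch below uses illustrative numbers, not values from the LBNL benchmark dataset.

        def air_change_rate(supply_cfm, room_volume_ft3):
            """Air changes per hour from supply airflow and room volume."""
            return supply_cfm * 60.0 / room_volume_ft3

        def watts_per_cfm(fan_power_kw, supply_cfm):
            """Fan energy intensity of the air handling system."""
            return fan_power_kw * 1000.0 / supply_cfm

        print(air_change_rate(supply_cfm=50000, room_volume_ft3=60000))   # 50.0 ACH
        print(watts_per_cfm(fan_power_kw=30, supply_cfm=50000))           # 0.6 W/cfm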

  6. Issues in Benchmarking and Assessing Institutional Engagement

    Science.gov (United States)

    Furco, Andrew; Miller, William

    2009-01-01

    The process of assessing and benchmarking community engagement can take many forms. To date, more than two dozen assessment tools for measuring community engagement institutionalization have been published. These tools vary substantially in purpose, level of complexity, scope, process, structure, and focus. While some instruments are designed to…

  7. Benchmarking European Gas Transmission System Operators

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter; Trinkner, Urs

    This is the final report for the pan-European efficiency benchmarking of gas transmission system operations commissioned by the Netherlands Authority for Consumers and Markets (ACM), Den Haag, on behalf of the Council of European Energy Regulators (CEER) under the supervision of the authors....

  8. EA-MC Neutronic Calculations on IAEA ADS Benchmark 3.2

    Energy Technology Data Exchange (ETDEWEB)

    Dahlfors, Marcus [Uppsala Univ. (Sweden). Dept. of Radiation Sciences]; Kadi, Yacine [CERN, Geneva (Switzerland). Emerging Energy Technologies]

    2006-01-15

    The neutronics and the transmutation properties of the IAEA ADS benchmark 3.2 setup, the 'Yalina' experiment or ISTC project B-70, have been studied through an extensive amount of 3-D Monte Carlo calculations at CERN. The simulations were performed with the state-of-the-art computer code package EA-MC, developed at CERN. The calculational approach is outlined and the results are presented in accordance with the guidelines given in the benchmark description. A variety of experimental conditions and parameters are examined; three different fuel rod configurations and three types of neutron sources are applied to the system. Reactivity change effects introduced by removal of fuel rods in both central and peripheral positions are also computed. Irradiation samples located in a total of 8 geometrical positions are examined. Calculations of capture reaction rates in ¹²⁹I, ²³⁷Np and ²⁴³Am samples and of fission reaction rates in ²³⁵U, ²³⁷Np and ²⁴³Am samples are presented. Simulated neutron flux densities and energy spectra as well as spectral indices inside experimental channels are also given according to benchmark specifications. Two different nuclear data libraries, JAR-95 and JENDL-3.2, are applied for the calculations.

  9. Development and validation of burnup dependent computational schemes for the analysis of assemblies with advanced lattice codes

    Science.gov (United States)

    Ramamoorthy, Karthikeyan

    The main aim of this research is the development and validation of computational schemes for advanced lattice codes. The advanced lattice code which forms the primary part of this research is "DRAGON Version4". The code has unique features like self shielding calculation with capabilities to represent distributed and mutual resonance shielding effects, leakage models with space-dependent isotropic or anisotropic streaming effect, availability of the method of characteristics (MOC), burnup calculation with reaction-detailed energy production etc. Qualified reactor physics codes are essential for the study of all existing and envisaged designs of nuclear reactors. Any new design would require a thorough analysis of all the safety parameters and burnup dependent behaviour. Any reactor physics calculation requires the estimation of neutron fluxes in various regions of the problem domain. The calculation goes through several levels before the desired solution is obtained. Each level of the lattice calculation has its own significance and any compromise at any step will lead to poor final result. The various levels include choice of nuclear data library and energy group boundaries into which the multigroup library is cast; self shielding of nuclear data depending on the heterogeneous geometry and composition; tracking of geometry, keeping error in volume and surface to an acceptable minimum; generation of regionwise and groupwise collision probabilities or MOC-related information and their subsequent normalization thereof, solution of transport equation using the previously generated groupwise information and obtaining the fluxes and reaction rates in various regions of the lattice; depletion of fuel and of other materials based on normalization with constant power or constant flux. Of the above mentioned levels, the present research will mainly focus on two aspects, namely self shielding and depletion. The behaviour of the system is determined by composition of resonant
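
    As a toy illustration of the depletion level discussed above, the sketch below advances a three-nuclide capture chain over one burnup step at constant one-group flux by exponentiating the Bateman matrix; the cross sections, flux and chain itself are assumed for illustration.

        import numpy as np
        from scipy.linalg import expm

        phi = 3e14                                        # assumed one-group flux, n/cm2/s
        barn = 1e-24
        sig_a = np.array([600.0, 100.0, 50.0]) * barn     # absorption cross sections, cm2
        sig_c = np.array([100.0, 20.0, 0.0]) * barn       # capture feeding the next nuclide

        # dN/dt = A N: diagonal loss by absorption, off-diagonal gain by capture.
        A = np.diag(-sig_a * phi)
        A[1, 0] = sig_c[0] * phi
        A[2, 1] = sig_c[1] * phi

        n0 = np.array([1.0e21, 0.0, 0.0])                 # initial number densities, 1/cm3
        dt = 30 * 24 * 3600.0                             # one 30-day burnup step, s
        n1 = expm(A * dt) @ n0
        print(n1)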

  10. Fossil Fuels.

    Science.gov (United States)

    Crank, Ron

    This instructional unit is one of 10 developed by students on various energy-related areas that deals specifically with fossil fuels. Some topics covered are historic facts, development of fuels, history of oil production, current and future trends of the oil industry, refining fossil fuels, and environmental problems. Material in each unit may…

  11. Fossil Fuels.

    Science.gov (United States)

    Crank, Ron

    This instructional unit is one of 10 developed by students on various energy-related areas that deals specifically with fossil fuels. Some topics covered are historic facts, development of fuels, history of oil production, current and future trends of the oil industry, refining fossil fuels, and environmental problems. Material in each unit may…

  12. Critical experiments supporting close proximity water storage of power reactor fuel. Technical progress report, October 1, 1977-December 31, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, M.N.; Hoovler, G.S.

    1978-03-01

    Experiments are being conducted on critical configurations of clusters of fuel rods, mocking up LWR-type fuel elements in close proximity water storage. Spacings between fuel clusters and the intervening material are being varied to provide a variety of benchmark loadings. (DLC)

  13. Benchmarking transaction and analytical processing systems the creation of a mixed workload benchmark and its application

    CERN Document Server

    Bog, Anja

    2014-01-01

    This book introduces a new benchmark for hybrid database systems, gauging the effect of adding OLAP to an OLTP workload and analyzing the impact of commonly used optimizations in historically separate OLTP and OLAP domains in mixed-workload scenarios.

  14. Electricity consumption in school buildings - benchmark and web tools; Elforbrug i skoler - benchmark og webvaerktoej

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    The aim of this project has been to produce benchmarks for electricity consumption in Danish schools in order to encourage electricity conservation. An internet programme has been developed with the aim of facilitating schools' access to benchmarks and to evaluate energy consumption. The overall purpose is to create increased attention to the electricity consumption of each separate school by publishing benchmarks which take the schools' age and number of pupils as well as after school activities into account. Benchmarks can be used to make green accounts and work as markers in e.g. energy conservation campaigns, energy management and for educational purposes. The internet tool can be found on www.energiguiden.dk. (BA)
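
    A hypothetical sketch of the comparison such a web tool performs, reducing a school's consumption to kWh per pupil and setting it against a benchmark figure; the benchmark value here is a placeholder, not one published by the project.

        def school_benchmark(kwh_per_year, pupils, benchmark_kwh_per_pupil=350.0):
            """Return the school's intensity (kWh/pupil/year) and a verdict."""
            intensity = kwh_per_year / pupils
            verdict = ("below benchmark" if intensity <= benchmark_kwh_per_pupil
                       else "above benchmark")
            return intensity, verdict

        print(school_benchmark(kwh_per_year=180000, pupils=450))
        # -> (400.0, 'above benchmark'): a candidate for conservation measures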

  15. The ACRV Picking Benchmark (APB): A Robotic Shelf Picking Benchmark to Foster Reproducible Research

    OpenAIRE

    Leitner, Jürgen; Tow, Adam W.; Dean, Jake E.; Suenderhauf, Niko; Durham, Joseph W.; Cooper, Matthew; Eich, Markus; Lehnert, Christopher; Mangels, Ruben; McCool, Christopher; Kujala, Peter; Nicholson, Lachlan; Van Pham, Trung; Sergeant, James; Wu, Liao

    2016-01-01

    Robotic challenges like the Amazon Picking Challenge (APC) or the DARPA Challenges are an established and important way to drive scientific progress. They make research comparable on a well-defined benchmark with equal test conditions for all participants. However, such challenge events occur only occasionally, are limited to a small number of contestants, and the test conditions are very difficult to replicate after the main event. We present a new physical benchmark challenge for robotic pi...

  16. Benchmark 1 - Failure Prediction after Cup Drawing, Reverse Redrawing and Expansion Part A: Benchmark Description

    Science.gov (United States)

    Watson, Martin; Dick, Robert; Huang, Y. Helen; Lockley, Andrew; Cardoso, Rui; Santos, Abel

    2016-08-01

    This Benchmark is designed to predict the fracture of a food can after drawing, reverse redrawing and expansion. The aim is to assess different sheet metal forming difficulties such as plastic anisotropic earing and failure models (strain and stress based Forming Limit Diagrams) under complex nonlinear strain paths. To study these effects, two distinct materials, TH330 steel (unstoved) and AA5352 aluminum alloy are considered in this Benchmark. Problem description, material properties, and simulation reports with experimental data are summarized.
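
    A minimal sketch of a strain-based forming-limit check of the kind participants are asked to apply: a computed strain state is compared against an interpolated forming limit curve. The curve below is illustrative, not the TH330 or AA5352 data of the benchmark.

        import numpy as np

        # Forming limit curve: limiting major strain as a function of minor strain.
        flc_minor = np.array([-0.2, -0.1, 0.0, 0.1, 0.2])      # assumed FLC points
        flc_major = np.array([0.45, 0.38, 0.32, 0.36, 0.42])

        def fails(minor_strain, major_strain, safety_margin=0.0):
            limit = np.interp(minor_strain, flc_minor, flc_major)
            return major_strain > limit - safety_margin

        print(fails(minor_strain=-0.05, major_strain=0.30))    # False: below the FLC
        print(fails(minor_strain=0.05, major_strain=0.40))     # True: above the FLC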

  17. Revaluering benchmarking - A topical theme for the construction industry

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2011-01-01

    Over the past decade, benchmarking has increasingly gained foothold in the construction industry. The predominant research, perceptions and uses of benchmarking are valued so strongly and uniformly, that what may seem valuable, is actually abstaining researchers and practitioners from studying an...... organizational relations, behaviors and actions. In closing it is briefly considered how to study the calculative practices of benchmarking....... and questioning the concept objectively. This paper addresses the underlying nature of benchmarking, and accounts for the importance of focusing attention on the sociological impacts benchmarking has in organizations. To understand these sociological impacts, benchmarking research needs to transcend...... this perspective develops more thorough knowledge about benchmarking and challenges the current dominating rationales. Hereby, it is argued that benchmarking is not a neutral practice. On the contrary it is highly influenced by organizational ambitions and strategies, with the potentials to transform...

  18. Effects of Exposure Imprecision on Estimation of the Benchmark Dose

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe

    Environmental epidemiology; exposure measurement error; effect of prenatal mercury exposure; exposure standards; benchmark dose
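
    For readers unfamiliar with the benchmark dose concept referenced above, the sketch below computes a BMD as the dose producing a chosen extra risk under an assumed logistic dose-response model; the model and parameters are illustrative only.

        import numpy as np
        from scipy.optimize import brentq

        def extra_risk(dose, background=0.05, beta=0.8):
            """Extra risk over background for an assumed logistic dose-response."""
            logit0 = np.log(background / (1.0 - background))
            p = 1.0 / (1.0 + np.exp(-(logit0 + beta * dose)))
            return (p - background) / (1.0 - background)

        bmr = 0.10                                # benchmark response: 10% extra risk
        bmd = brentq(lambda d: extra_risk(d) - bmr, 1e-6, 10.0)
        print(round(bmd, 3))                      # dose at which extra risk reaches the BMR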

  19. Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators

    Directory of Open Access Journals (Sweden)

    Zaharchenko Lolita A.

    2013-12-01

    The article analyses the evolution of development and the possibilities of application of benchmarking in the telecommunication sphere. It studies the essence of benchmarking on the basis of a generalisation of the approaches of different scientists to the definition of this notion. In order to improve the activity of telecommunication operators, the article identifies the benchmarking technology and the main factors that determine the success of an operator in the modern market economy, as well as the mechanism and the component stages of benchmarking carried out by a telecommunication operator. It analyses the telecommunication market and identifies the dynamics of its development and tendencies in the changing composition of telecommunication operators and providers. Having generalised the existing experience of benchmarking application, the article identifies the main types of benchmarking of telecommunication operators by the following features: by the level at which it is conducted (branch, inter-branch and international benchmarking); by participation in the conduct (competitive and joint); and with respect to the enterprise environment (internal and external).

  20. Fuel distribution

    Energy Technology Data Exchange (ETDEWEB)

    Tison, R.R.; Baker, N.R.; Blazek, C.F.

    1979-07-01

    Distribution of fuel is considered from a supply point to the secondary conversion sites and ultimate end users. All distribution is intracity with the maximum distance between the supply point and end-use site generally considered to be 15 mi. The fuels discussed are: coal or coal-like solids, methanol, No. 2 fuel oil, No. 6 fuel oil, high-Btu gas, medium-Btu gas, and low-Btu gas. Although the fuel state, i.e., gas, liquid, etc., can have a major impact on the distribution system, the source of these fuels (e.g., naturally-occurring or coal-derived) does not. Single-source, single-termination point and single-source, multi-termination point systems for liquid, gaseous, and solid fuel distribution are considered. Transport modes and the fuels associated with each mode are: by truck - coal, methanol, No. 2 fuel oil, and No. 6 fuel oil; and by pipeline - coal, methane, No. 2 fuel oil, No. 6 oil, high-Btu gas, medium-Btu gas, and low-Btu gas. Data provided for each distribution system include component makeup and initial costs.

  1. Irreversible stochastic processes on lattices

    Energy Technology Data Exchange (ETDEWEB)

    Nord, R.S.

    1986-01-01

    Models for irreversible random or cooperative filling of lattices are required to describe many processes in chemistry and physics. Since the filling is assumed to be irreversible, even the stationary, saturation state is not in equilibrium. The kinetics and statistics of these processes are described by recasting the master equations in infinite hierarchical form. Solutions can be obtained by implementing various techniques: refinements in these solution techniques are presented. Programs considered include random dimer, trimer, and tetramer filling of 2D lattices, random dimer filling of a cubic lattice, competitive filling of two or more species, and the effect of a random distribution of inactive sites on the filling. Also considered is monomer filling of a linear lattice with nearest-neighbor cooperative effects, for which the exact cluster-size distribution is solved for cluster sizes up to the asymptotic regime. Additionally, a technique is developed to directly determine the asymptotic properties of the cluster-size distribution. Finally, cluster growth is considered via irreversible aggregation involving random walkers. In particular, explicit results are provided for the large-lattice-size asymptotic behavior of trapping probabilities and average walk lengths for a single walker on a lattice with multiple traps. Procedures for exact calculation of these quantities on finite lattices are also developed.
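
    The simplest member of this family, random (irreversible) dimer filling of a linear lattice, can be simulated directly; its saturation coverage is known analytically to be 1 - e^(-2) ≈ 0.8647, which the sketch below reproduces.

        import math
        import random

        def dimer_saturation(n_sites=100000, seed=1):
            """Jamming coverage of random sequential dimer filling of a 1D lattice."""
            random.seed(seed)
            occupied = [False] * n_sites
            bonds = list(range(n_sites - 1))
            random.shuffle(bonds)                 # random order of deposition attempts
            for i in bonds:                       # try to place a dimer on sites (i, i+1)
                if not occupied[i] and not occupied[i + 1]:
                    occupied[i] = occupied[i + 1] = True
            return sum(occupied) / n_sites

        print(dimer_saturation(), 1 - math.exp(-2))    # ~0.8647 vs 0.864665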

  2. Lattice topology dictates photon statistics

    CERN Document Server

    Kondakci, H Esat; Saleh, Bahaa E A

    2016-01-01

    Propagation of coherent light through a disordered network is accompanied by randomization and possible conversion into thermal light. Here, we show that network topology plays a decisive role in determining the statistics of the emerging field if the underlying lattice satisfies chiral symmetry. By examining one-dimensional arrays of randomly coupled waveguides arranged on linear and ring topologies, we are led to a remarkable prediction: the field circularity and the photon statistics in ring lattices are dictated by their parity -- whether the number of sites is even or odd, while the same quantities are insensitive to the parity of a linear lattice. Adding or subtracting a single lattice site can switch the photon statistics from super-thermal to sub-thermal, or vice versa. This behavior is understood by examining the real and imaginary fields on a chiral-symmetric lattice, which form two strands that interleave along the lattice sites. These strands can be fully braided around an even-sited ring lattice th...
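
    The parity effect can be glimpsed with a small numerical check (an illustration of the chiral-symmetry argument only, not the photon-statistics calculation of the paper): a nearest-neighbour ring with random real couplings has a spectrum symmetric about zero, as chiral symmetry requires, only when the number of sites is even, since an odd ring is not bipartite.

      import numpy as np

      def ring_coupling_matrix(n_sites, rng):
          """Nearest-neighbour ring with random real couplings and no on-site terms."""
          c = rng.uniform(0.5, 1.5, size=n_sites)      # couplings between sites j and j+1
          h = np.zeros((n_sites, n_sites))
          for j in range(n_sites):
              k = (j + 1) % n_sites
              h[j, k] = h[k, j] = c[j]
          return h

      rng = np.random.default_rng(0)
      for n in (6, 7):
          ev = np.sort(np.linalg.eigvalsh(ring_coupling_matrix(n, rng)))
          print(f"N = {n}: spectrum symmetric about zero? {np.allclose(ev, -ev[::-1])}")
      # Even N -> True (bipartite ring, chiral symmetry holds); odd N -> False.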

  3. Status of benchmark calculations of the neutron characteristics of the cascade molten salt ADS for the nuclear waste incineration

    Energy Technology Data Exchange (ETDEWEB)

    Dudnikov, A.A.; Alekseev, P.N.; Subbotin, S.A.; Vasiliev, A.V.; Abagyan, L.P.; Alexeyev, N.I.; Gomin, E.A.; Ponomarev, L.I.; Kolyaskin, O.E.; Men' shikov, L.I. [Russian Research Centre Kurchatov Inst., Moscow (Russian Federation); Kolesov, V.F.; Ivanin, I.A.; Zavialov, N.V. [Russian Federal Nuclear Center, RFNC-VNIIEF, Nizhnii Novgorod region (Russian Federation)

    2001-07-01

    The facility for incineration of long-lived minor actinides and some dangerous fission products should be an important feature of the future nuclear power (NP). For many reasons the liquid-fuel reactor driven by an accelerator can be considered a promising burner reactor for radioactive waste. The fuel of such a reactor is a fluoride molten salt composition with minor actinides (Np, Cm, Am) and some fission products ({sup 99}Tc, {sup 129}I, etc.). Preliminary analysis shows that the values of keff calculated with different codes and nuclear data differ by up to several percent for such fuel compositions. Reliable critical and subcritical benchmark experiments with molten salt fuel compositions containing significant quantities of minor actinides are absent. One of the main tasks for the numerical study of this problem is the estimation of nuclear data for such fuel compositions and the verification of the different numerical codes used for the calculation of keff, neutron spectra and reaction rates. This is especially important for the resonance region, where experimental data are poor or absent. A calculational benchmark of the cascade subcritical molten salt reactor has been developed. For the chosen nuclear fuel composition, a comparison of the results obtained by three different Monte-Carlo codes (MCNP4A, MCU, and C95) using three different nuclear data libraries is presented. This report concerns the investigation of the main peculiarities of the subcritical molten salt reactor unit carried out at the beginning of ISTC project 1486. (author)

  4. Role of (n, xn) reactions in ADS, IAEA-benchmark and the Dubna Cascade Code

    Indian Academy of Sciences (India)

    V Kumar; Harphool Kumawat; Manish Sharma

    2007-02-01

    The Dubna Cascade Code (version 2004) has been used for the Monte Carlo simulation of the 1500 MW accelerator-driven sub-critical system (ADS) with 233U + 232Th fuel using the IAEA benchmark. The neutron spectrum, cross-sections of (n, xn) reactions, isotopic yields, heat spectra, etc. are simulated. Many of these results that help in understanding the IAEA benchmark are presented. It is revealed that the code predicts the proton beam current required for the 1500 MW ADS at k_eff = 0.98 to be 11.6 mA. The radial distribution of heat is in fair agreement with other codes such as EA-MC, and the code needs nearly 1% less enrichment than given by other codes. This may be because the code accounts for the role of higher-order (n, xn) reactions. It is emphasized that there is a strong need to study (n, xn) reactions both theoretically and experimentally for better design.
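
    The quoted beam current can be checked for order of magnitude with the standard source-multiplication relation for an ADS; the spallation yield, energy per fission and neutrons per fission used below are assumed round numbers, not values from the benchmark, so the sketch only shows that ~10 mA is the right scale.

      # Order-of-magnitude ADS beam-current estimate (physics constants assumed, illustrative only).
      e_charge  = 1.602e-19                 # C
      P_thermal = 1.5e9                     # W, 1500 MW(th)
      k_eff     = 0.98
      nu        = 2.5                       # assumed neutrons per fission
      E_fission = 200e6 * e_charge          # ~200 MeV per fission, in joules
      Z_spall   = 30.0                      # assumed spallation neutrons per ~1 GeV proton

      fissions_per_source_neutron = k_eff / (nu * (1.0 - k_eff))
      energy_per_proton = Z_spall * fissions_per_source_neutron * E_fission
      beam_current_mA = P_thermal / energy_per_proton * e_charge * 1e3
      print(f"estimated beam current: {beam_current_mA:.1f} mA")   # ~13 mA, same order as 11.6 mA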

  5. Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators

    OpenAIRE

    Zaharchenko Lolita A.; Kolesnyk Oksana A.

    2013-01-01

    The article analyses evolution of development and possibilities of application of benchmarking in the telecommunication sphere. It studies essence of benchmarking on the basis of generalisation of approaches of different scientists to definition of this notion. In order to improve activity of telecommunication operators, the article identifies the benchmarking technology and main factors that determine success of the operator in the modern market economy, and the mechanism of benchmarking an...

  6. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    2005-01-01

    The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking on a real software project and conclude with a glimpse at the challenges for the fu...

  7. Benchmarking of corporate social responsibility: Methodological problems and robustness

    OpenAIRE

    2004-01-01

    This paper investigates the possibilities and problems of benchmarking Corporate Social Responsibility (CSR). After a methodological analysis of the advantages and problems of benchmarking, we develop a benchmark method that includes economic, social and environmental aspects as well as national and international aspects of CSR. The overall benchmark is based on a weighted average of these aspects. The weights are based on the opinions of companies and NGOs. Using different me...

  8. Lattice Boltzmann model for nanofluids

    Energy Technology Data Exchange (ETDEWEB)

    Xuan Yimin; Yao Zhengping [Nanjing University of Science and Technology, School of Power Engineering, Nanjing (China)

    2005-01-01

    A nanofluid is a particle suspension that consists of base liquids and nanoparticles and has great potential for heat transfer enhancement. By accounting for the external and internal forces acting on the suspended nanoparticles and interactions among the nanoparticles and fluid particles, a lattice Boltzmann model is proposed for simulating flow and energy transport processes inside the nanofluids. First, we briefly introduce the conventional lattice Boltzmann model for multicomponent systems. Then, we discuss the irregular motion of the nanoparticles and inherent dynamic behavior of nanofluids and describe a lattice Boltzmann model for simulating nanofluids. Finally, we conduct some calculations for the distribution of the suspended nanoparticles. (orig.)
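
    For orientation, the collide-and-stream cycle that such models build on can be sketched as a plain single-component D2Q9 BGK scheme; the nanoparticle forces and multicomponent interactions described in the paper are deliberately left out, and the relaxation time, grid size and initial condition below are arbitrary choices for illustration.

      import numpy as np

      # D2Q9 lattice: discrete velocities and weights.
      e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                    [1, 1], [-1, 1], [-1, -1], [1, -1]])
      w = np.array([4/9] + [1/9]*4 + [1/36]*4)
      tau = 0.8                                    # BGK relaxation time (arbitrary)

      def equilibrium(rho, ux, uy):
          feq = np.empty((9,) + rho.shape)
          usq = ux*ux + uy*uy
          for k in range(9):
              eu = e[k, 0]*ux + e[k, 1]*uy
              feq[k] = w[k] * rho * (1 + 3*eu + 4.5*eu*eu - 1.5*usq)
          return feq

      nx, ny = 64, 64
      rho = np.ones((nx, ny))
      rho[28:36, 28:36] = 1.05                     # small initial density bump
      mass0 = rho.sum()
      f = equilibrium(rho, np.zeros_like(rho), np.zeros_like(rho))

      for step in range(500):
          rho = f.sum(axis=0)
          ux = (f * e[:, 0, None, None]).sum(axis=0) / rho
          uy = (f * e[:, 1, None, None]).sum(axis=0) / rho
          f += -(f - equilibrium(rho, ux, uy)) / tau          # collision
          for k in range(9):                                  # streaming on a periodic box
              f[k] = np.roll(np.roll(f[k], e[k, 0], axis=0), e[k, 1], axis=1)

      print("mass conserved:", np.isclose(f.sum(), mass0))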

  9. Localized structures in Kagome lattices

    Energy Technology Data Exchange (ETDEWEB)

    Saxena, Avadh B [Los Alamos National Laboratory; Bishop, Alan R [Los Alamos National Laboratory; Law, K J H [UNIV OF MASSACHUSETTS; Kevrekidis, P G [UNIV OF MASSACHUSETTS

    2009-01-01

    We investigate the existence and stability of gap vortices and multi-pole gap solitons in a Kagome lattice with a defocusing nonlinearity both in a discrete case and in a continuum one with periodic external modulation. In particular, predictions are made based on expansion around a simple and analytically tractable anti-continuum (zero coupling) limit. These predictions are then confirmed for a continuum model of an optically-induced Kagome lattice in a photorefractive crystal obtained by a continuous transformation of a honeycomb lattice.

  10. Lattice sums then and now

    CERN Document Server

    Borwein, J M; McPhedran, R C

    2013-01-01

    The study of lattice sums began when early investigators wanted to go from mechanical properties of crystals to the properties of the atoms and ions from which they were built (the literature of Madelung's constant). A parallel literature was built around the optical properties of regular lattices of atoms (initiated by Lord Rayleigh, Lorentz and Lorenz). For over a century many famous scientists and mathematicians have delved into the properties of lattices, sometimes unwittingly duplicating the work of their predecessors. Here, at last, is a comprehensive overview of the substantial body of
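
    As a one-line taste of the subject (an illustration, not an excerpt from the book), the simplest Madelung-type lattice sum, an alternating chain of unit charges, is 2·Σ_{n≥1} (-1)^{n+1}/n = 2 ln 2; the slow convergence of the direct sum is exactly what motivates the acceleration techniques this literature developed.

      import math

      def madelung_1d(n_terms=100_000):
          """Alternating-charge chain: M = 2 * sum_{n>=1} (-1)^(n+1) / n."""
          return 2 * sum((-1)**(n + 1) / n for n in range(1, n_terms + 1))

      print(madelung_1d(), "vs exact", 2 * math.log(2))   # direct summation converges slowly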

  11. Lattice Trace Operators

    Directory of Open Access Journals (Sweden)

    Brian Jefferies

    2014-01-01

    Full Text Available A bounded linear operator T on a Hilbert space ℋ is trace class if its singular values are summable. The trace class operators on ℋ form an operator ideal and, in the case that ℋ is finite-dimensional, the trace tr(T) of T is given by ∑_j a_jj for any matrix representation {a_ij} of T. In applications of trace class operators to scattering theory and representation theory, the subject is complicated by the fact that if k is an integral kernel of the operator T on the Hilbert space L2(μ), with μ a σ-finite measure, then k(x,x) may not be defined, because the diagonal {(x,x)} may be a set of (μ⊗μ)-measure zero. The present note describes a class of linear operators acting on a Banach function space X which forms a lattice ideal of operators on X, rather than an operator ideal, but coincides with the collection of hermitian positive trace class operators in the case of X=L2(μ).
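
    The finite-dimensional statements in the abstract are easy to check numerically; the matrix below is a random stand-in used only to illustrate that the singular values are (trivially) summable and that tr(T) = ∑_j a_jj is independent of the matrix representation.

      import numpy as np

      rng = np.random.default_rng(0)
      T = rng.normal(size=(5, 5))

      singular_values = np.linalg.svd(T, compute_uv=False)   # trace norm ||T||_1 is their sum
      trace_diag = np.trace(T)                                # sum_j a_jj in this representation
      trace_eig = np.linalg.eigvals(T).sum().real             # basis-independent value

      print("||T||_1 =", singular_values.sum())
      print("tr(T):", trace_diag, "=", trace_eig)

      # Basis independence: tr(U^-1 T U) = tr(T) for any invertible U.
      U = rng.normal(size=(5, 5))
      print(np.isclose(np.trace(np.linalg.inv(U) @ T @ U), trace_diag))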

  12. NMR Express-analyser for quality monitoring of motor fuel

    Science.gov (United States)

    Protasov, E. A.; Protasov, D. E.

    2016-09-01

    A method for the rapid analysis of motor fuel quality was developed for the case where the octane number is artificially increased by dissolving ferrocene (C10H10Fe) in low-octane gasoline. Measurements of the nuclear magnetic resonance spin-lattice relaxation time are used to determine the presence of ferrocene in standardized and real fuel from gas stations. The results of measurements of the relaxation characteristics of certain grades of motor fuel with ferrocene dissolved in them are presented.
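
    For context, spin-lattice relaxation times are commonly extracted from inversion-recovery data via M(t) = M0·(1 - 2·exp(-t/T1)); the sketch below fits that model to synthetic data with an assumed T1 of 120 ms, not measurements from the article's instrument.

      import numpy as np
      from scipy.optimize import curve_fit

      def inversion_recovery(t, m0, t1):
          """Magnetization recovery after a 180-degree inversion pulse."""
          return m0 * (1.0 - 2.0 * np.exp(-t / t1))

      # Synthetic data: "true" T1 = 120 ms plus a little noise (illustration only).
      rng = np.random.default_rng(3)
      t = np.linspace(0.005, 0.6, 25)                      # s
      m = inversion_recovery(t, 1.0, 0.120) + rng.normal(0, 0.01, t.size)

      (m0_fit, t1_fit), _ = curve_fit(inversion_recovery, t, m, p0=(1.0, 0.05))
      print(f"fitted T1 = {t1_fit*1e3:.1f} ms")            # ~120 ms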

  13. Benchmarking a signpost to excellence in quality and productivity

    CERN Document Server

    Karlof, Bengt

    1993-01-01

    According to the authors, benchmarking exerts a powerful leverage effect on an organization, and they consider some of the factors which justify their claim. Describes how to implement benchmarking and exactly what to benchmark. Explains benchlearning, which integrates education, leadership development and organizational dynamics with the actual work being done, and how to make it work more efficiently in terms of quality and productivity.

  14. Taking Stock of Corporate Benchmarking Practices: Panacea or Pandora's Box?

    Science.gov (United States)

    Fleisher, Craig S.; Burton, Sara

    1995-01-01

    Discusses why corporate communications/public relations (cc/pr) should be benchmarked (an approach used by cc/pr managers to demonstrate the value of their activities to skeptical organizational executives). Discusses myths about cc/pr benchmarking; types, targets, and focus of cc/pr benchmarking; a process model; and critical decisions about…

  15. 47 CFR 69.108 - Transport rate benchmark.

    Science.gov (United States)

    2010-10-01

    ... with this subpart, the DS3-to-DS1 benchmark ratio shall be calculated as follows: the telephone company... benchmark ratio of 9.6 to 1 or higher. (c) If a telephone company's initial transport rates are based on... 47 Telecommunication 3 2010-10-01 2010-10-01 false Transport rate benchmark. 69.108 Section...

  16. Discovering and Implementing Best Practices to Strengthen SEAs: Collaborative Benchmarking

    Science.gov (United States)

    Building State Capacity and Productivity Center, 2013

    2013-01-01

    This paper is written for state educational agency (SEA) leaders who are considering the benefits of collaborative benchmarking, and it addresses the following questions: (1) What does benchmarking of best practices entail?; (2) How does "collaborative benchmarking" enhance the process?; (3) How do SEAs control the process so that "their" needs…

  17. 29 CFR 1952.323 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.323 Section 1952.323... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  18. 29 CFR 1952.343 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.343 Section 1952.343... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, Compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  19. 29 CFR 1952.213 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.213 Section 1952.213... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  20. 29 CFR 1952.373 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.373 Section 1952.373... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  1. 29 CFR 1952.163 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.163 Section 1952.163... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  2. 29 CFR 1952.203 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.203 Section 1952.203... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  3. 29 CFR 1952.293 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.293 Section 1952.293... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  4. 29 CFR 1952.223 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.223 Section 1952.223... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  5. 29 CFR 1952.233 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.233 Section 1952.233... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  6. 29 CFR 1952.113 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.113 Section 1952.113... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  7. 29 CFR 1952.93 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.93 Section 1952.93....93 Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were...

  8. 29 CFR 1952.353 - Compliance staffing benchmarks.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.353 Section 1952.353... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...

  9. Advanced Fuel Cell System Thermal Management for NASA Exploration Missions

    Science.gov (United States)

    Burke, Kenneth A.

    2009-01-01

    The NASA Glenn Research Center is developing advanced passive thermal management technology to reduce the mass and improve the reliability of space fuel cell systems for the NASA exploration program. An analysis of state-of-the-art fuel cell cooling systems was done to benchmark the portion of a fuel cell system's mass that is dedicated to thermal management. Additional analysis was done to determine the key performance targets of the advanced passive thermal management technology that would substantially reduce fuel cell system mass.

  10. Extended lattice Boltzmann scheme for droplet combustion

    Science.gov (United States)

    Ashna, Mostafa; Rahimian, Mohammad Hassan; Fakhari, Abbas

    2017-05-01

    The available lattice Boltzmann (LB) models for combustion or phase change are focused on either single-phase flow combustion or two-phase flow with evaporation assuming a constant density for both liquid and gas phases. To pave the way towards simulation of spray combustion, we propose a two-phase LB method for modeling combustion of liquid fuel droplets. We develop an LB scheme to model phase change and combustion by taking into account the density variation in the gas phase and accounting for the chemical reaction based on the Cahn-Hilliard free-energy approach. Evaporation of liquid fuel is modeled by adding a source term, which is due to the divergence of the velocity field being nontrivial, in the continuity equation. The low-Mach-number approximation in the governing Navier-Stokes and energy equations is used to incorporate source terms due to heat release from chemical reactions, density variation, and nonluminous radiative heat loss. Additionally, the conservation equation for chemical species is formulated by including a source term due to chemical reaction. To validate the model, we consider the combustion of n-heptane and n-butanol droplets in stagnant air using overall single-step reactions. The diameter history and flame standoff ratio obtained from the proposed LB method are found to be in good agreement with available numerical and experimental data. The present LB scheme is believed to be a promising approach for modeling spray combustion.
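
    The "diameter history" used for validation is conventionally summarized by the classical d²-law, d²(t) = d0² - K·t; in the sketch below the evaporation constant K and the initial diameter are assumed illustrative values, not numbers taken from the paper.

      import numpy as np

      def d2_law_diameter(t, d0=50e-6, K=8e-8):
          """Droplet diameter from the classical d^2-law (d0 in m, K in m^2/s, assumed values)."""
          d_squared = np.maximum(d0**2 - K * t, 0.0)
          return np.sqrt(d_squared)

      t = np.linspace(0.0, 0.04, 5)                     # s
      for ti, di in zip(t, d2_law_diameter(t)):
          print(f"t = {ti*1e3:5.1f} ms   d = {di*1e6:5.1f} um")
      # Burnout occurs at t = d0^2 / K (~31 ms for these assumed numbers).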

  11. Extended lattice Boltzmann scheme for droplet combustion.

    Science.gov (United States)

    Ashna, Mostafa; Rahimian, Mohammad Hassan; Fakhari, Abbas

    2017-05-01

    The available lattice Boltzmann (LB) models for combustion or phase change are focused on either single-phase flow combustion or two-phase flow with evaporation assuming a constant density for both liquid and gas phases. To pave the way towards simulation of spray combustion, we propose a two-phase LB method for modeling combustion of liquid fuel droplets. We develop an LB scheme to model phase change and combustion by taking into account the density variation in the gas phase and accounting for the chemical reaction based on the Cahn-Hilliard free-energy approach. Evaporation of liquid fuel is modeled by adding a source term, which is due to the divergence of the velocity field being nontrivial, in the continuity equation. The low-Mach-number approximation in the governing Navier-Stokes and energy equations is used to incorporate source terms due to heat release from chemical reactions, density variation, and nonluminous radiative heat loss. Additionally, the conservation equation for chemical species is formulated by including a source term due to chemical reaction. To validate the model, we consider the combustion of n-heptane and n-butanol droplets in stagnant air using overall single-step reactions. The diameter history and flame standoff ratio obtained from the proposed LB method are found to be in good agreement with available numerical and experimental data. The present LB scheme is believed to be a promising approach for modeling spray combustion.

  12. Characterization of addressability by simultaneous randomized benchmarking

    CERN Document Server

    Gambetta, Jay M; Merkel, S T; Johnson, B R; Smolin, John A; Chow, Jerry M; Ryan, Colm A; Rigetti, Chad; Poletto, S; Ohki, Thomas A; Ketchen, Mark B; Steffen, M

    2012-01-01

    The control and handling of errors arising from cross-talk and unwanted interactions in multi-qubit systems is an important issue in quantum information processing architectures. We introduce a benchmarking protocol that provides information about the amount of addressability present in the system and implement it on coupled superconducting qubits. The protocol consists of randomized benchmarking each qubit individually and then simultaneously, and the amount of addressability is related to the difference of the average gate fidelities of those experiments. We present the results on two similar samples with different amounts of cross-talk and unwanted interactions, which agree with predictions based on simple models for the amount of residual coupling.
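
    A hedged sketch of the bookkeeping behind such a protocol: each randomized-benchmarking experiment is fit to the standard decay F(m) = A·p^m + B, the depolarizing parameter p is converted to an average gate fidelity, and the addressability gap is the difference between the individually and simultaneously benchmarked fidelities. The decay parameters below are invented for illustration, not data from the paper.

      import numpy as np
      from scipy.optimize import curve_fit

      def rb_decay(m, a, p, b):
          """Standard randomized-benchmarking decay model F(m) = A p^m + B."""
          return a * p**m + b

      def avg_gate_fidelity(p, d=2):
          """Average gate fidelity from the RB depolarizing parameter p (single qubit: d = 2)."""
          return 1.0 - (1.0 - p) * (d - 1) / d

      m = np.arange(1, 200, 10)
      rng = np.random.default_rng(7)

      # Invented decays: qubit benchmarked alone (p = 0.995) and simultaneously (p = 0.990).
      fidelities = {}
      for label, p_true in [("individual", 0.995), ("simultaneous", 0.990)]:
          data = rb_decay(m, 0.5, p_true, 0.5) + rng.normal(0, 0.005, m.size)
          (_, p_fit, _), _ = curve_fit(rb_decay, m, data, p0=(0.5, 0.99, 0.5))
          fidelities[label] = avg_gate_fidelity(p_fit)

      addressability_gap = fidelities["individual"] - fidelities["simultaneous"]
      print(fidelities, "gap:", addressability_gap)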

  13. Specification for the VERA Depletion Benchmark Suite

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-17

    The CASL neutronics simulator MPACT is under development for the neutronics and T-H coupled simulation of pressurized water reactors. MPACT includes the ORIGEN-API and an internal depletion module to perform depletion calculations based upon neutron-material reactions and radioactive decay. Validating the depletion capability is a challenge because of insufficient measured data. One indirect way to validate it is to perform a code-to-code comparison for benchmark problems. In this study a depletion benchmark suite has been developed, and a detailed guideline is provided for obtaining meaningful computational outcomes which can be used in the validation of the MPACT depletion capability.
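
    Depletion calculations of the kind such a suite exercises solve coupled production, decay and transmutation balances; the sketch below integrates a minimal two-nuclide chain under an assumed constant one-group flux, with all cross sections, flux and half-life chosen purely for illustration (they are not values from the VERA suite).

      import numpy as np
      from scipy.integrate import solve_ivp

      # Illustrative two-nuclide chain: parent captures a neutron to form a decaying daughter.
      phi     = 3e14 * 1e4                      # assumed one-group flux, n/m^2/s (3e14 n/cm^2/s)
      sigma_c = 50e-28                          # assumed capture cross section, 50 barns in m^2
      lam     = np.log(2) / (5 * 24 * 3600)     # assumed daughter half-life of 5 days

      def bateman(t, n):
          n1, n2 = n
          return [-sigma_c * phi * n1,
                  sigma_c * phi * n1 - lam * n2]

      t_end = 90 * 24 * 3600                    # 90 days of irradiation
      sol = solve_ivp(bateman, (0, t_end), [1.0, 0.0], rtol=1e-8,
                      t_eval=np.linspace(0, t_end, 4))
      for t, n1, n2 in zip(sol.t, *sol.y):
          print(f"t = {t/86400:4.0f} d   N1 = {n1:.3f}   N2 = {n2:.4f}")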

  14. HPL and STREAM Benchmarks on SANAM Supercomputer

    KAUST Repository

    Bin Sulaiman, Riman A.

    2017-03-13

    SANAM supercomputer was jointly built by KACST and FIAS in 2012 ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5-2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of HPL and STREAM benchmarks.

  15. The PROOF benchmark suite measuring PROOF performance

    Science.gov (United States)

    Ryu, S.; Ganis, G.

    2012-06-01

    The PROOF benchmark suite is a new utility suite of PROOF to measure performance and scalability. The primary goal of the benchmark suite is to determine optimal configuration parameters for a set of machines to be used as a PROOF cluster. The suite measures the performance of the cluster for a set of standard tasks as a function of the number of effective processes. Cluster administrators can use the suite to measure the performance of the cluster and find optimal configuration parameters. PROOF developers can also utilize the suite to help them measure performance, identify problems and improve their software. In this paper, the new tool is explained in detail and use cases are presented to illustrate its application.

  16. Measuring NUMA effects with the STREAM benchmark

    CERN Document Server

    Bergstrom, Lars

    2011-01-01

    Modern high-end machines feature multiple processor packages, each of which contains multiple independent cores and integrated memory controllers connected directly to dedicated physical RAM. These packages are connected via a shared bus, creating a system with a heterogeneous memory hierarchy. Since this shared bus has less bandwidth than the sum of the links to memory, aggregate memory bandwidth is higher when parallel threads all access memory local to their processor package than when they access memory attached to a remote package. But, the impact of this heterogeneous memory architecture is not easily understood from vendor benchmarks. Even where these measurements are available, they provide only best-case memory throughput. This work presents a series of modifications to the well-known STREAM benchmark to measure the effects of NUMA on both a 48-core AMD Opteron machine and a 32-core Intel Xeon machine.
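
    For reference, the kernel at the heart of STREAM's triad test is simply a(i) = b(i) + q·c(i); the numpy sketch below is illustrative only — it is neither the official benchmark nor NUMA-aware like the modified version described here — but it shows how a sustained-bandwidth figure is derived from the bytes each iteration must move.

      import time
      import numpy as np

      n, q = 20_000_000, 3.0
      b, c = np.random.rand(n), np.random.rand(n)
      a = np.empty_like(b)

      best = float("inf")
      for _ in range(5):                         # keep the best of several trials, as STREAM does
          t0 = time.perf_counter()
          np.multiply(c, q, out=a)               # a = q*c
          np.add(a, b, out=a)                    # a = b + q*c (two passes instead of one fused loop)
          best = min(best, time.perf_counter() - t0)

      bytes_moved = 3 * n * a.itemsize           # what a fused triad moves: read b, read c, write a
      # The two-pass numpy version actually touches more data than a fused C triad,
      # so the printed figure is a conservative estimate of hardware bandwidth.
      print(f"triad bandwidth ~ {bytes_moved / best / 1e9:.1f} GB/s")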

  17. Argonne Code Center: benchmark problem book

    Energy Technology Data Exchange (ETDEWEB)

    1977-06-01

    This report is a supplement to the original report, published in 1968, as revised. The Benchmark Problem Book is intended to serve as a source book of solutions to mathematically well-defined problems for which either analytical or very accurate approximate solutions are known. This supplement contains problems in eight new areas: two-dimensional (R-z) reactor model; multidimensional (Hex-z) HTGR model; PWR thermal hydraulics--flow between two channels with different heat fluxes; multidimensional (x-y-z) LWR model; neutron transport in a cylindrical 'black' rod; neutron transport in a BWR rod bundle; multidimensional (x-y-z) BWR model; and neutronic depletion benchmark problems. This supplement contains only the additional pages and those requiring modification. (RWR)

  18. Assessing and benchmarking multiphoton microscopes for biologists.

    Science.gov (United States)

    Corbin, Kaitlin; Pinkard, Henry; Peck, Sebastian; Beemiller, Peter; Krummel, Matthew F

    2014-01-01

    Multiphoton microscopy has become a staple tool for tracking cells within tissues and organs due to its superior depth of penetration, low excitation volumes, and reduced phototoxicity. Many factors, ranging from laser pulse width to relay optics, detectors, and electronics, contribute to the overall ability of these microscopes to excite and detect fluorescence deep within tissues. However, we have found that there are few standard ways already described in the literature to distinguish between microscopes or to benchmark existing microscopes to measure the overall quality and efficiency of these instruments. Here, we discuss some simple parameters and methods that can either be used within a multiphoton facility or by a prospective purchaser to benchmark performance. This can both assist in identifying decay in microscope performance and in choosing features of a scope that are suited to experimental needs.

  19. ASBench: benchmarking sets for allosteric discovery.

    Science.gov (United States)

    Huang, Wenkang; Wang, Guanqiao; Shen, Qiancheng; Liu, Xinyi; Lu, Shaoyong; Geng, Lv; Huang, Zhimin; Zhang, Jian

    2015-08-01

    Allostery allows for the fine-tuning of protein function. Targeting allosteric sites is gaining increasing recognition as a novel strategy in drug design. The key challenge in the discovery of allosteric sites has strongly motivated the development of computational methods and thus high-quality, publicly accessible standard data have become indispensable. Here, we report benchmarking data for experimentally determined allosteric sites through a complex process, including a 'Core set' with 235 unique allosteric sites and a 'Core-Diversity set' with 147 structurally diverse allosteric sites. These benchmarking sets can be exploited to develop efficient computational methods to predict unknown allosteric sites in proteins and reveal unique allosteric ligand-protein interactions to guide allosteric drug design.

  20. Active vibration control of nonlinear benchmark buildings

    Institute of Scientific and Technical Information of China (English)

    ZHOU Xing-de; CHEN Dao-zheng

    2007-01-01

    Existing nonlinear model reduction methods are not suited to the nonlinear benchmark buildings because their vibration equations constitute a non-affine system. Meanwhile, controllers designed directly by nonlinear control strategies have a high order and are difficult to apply in practice. Therefore, a new active vibration control approach suited to nonlinear buildings is proposed. The idea of the proposed approach is based on model identification and structural model linearization, exerting the control force on the built model according to the force action principle. The proposed approach has better practicability because the built model can be reduced by the balance reduction method based on the empirical Gramian matrix. A three-story benchmark structure is presented, and the simulation results illustrate that the proposed method is viable for civil engineering structures.
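
    A minimal sketch of the balance-reduction step this relies on, under the assumption of a linear(ized) state-space model: solve the Lyapunov equations for the controllability and observability Gramians (an empirical Gramian would instead be estimated from simulation or response data), balance them, and truncate the states with small Hankel singular values. The three-degree-of-freedom structure below is a stand-in with assumed mass, stiffness and damping values, not the benchmark building.

      import numpy as np
      from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

      def balanced_truncation(A, B, C, order):
          """Reduce a stable LTI system (A, B, C) to the given order via balanced truncation."""
          Wc = solve_continuous_lyapunov(A, -B @ B.T)        # controllability Gramian
          Wo = solve_continuous_lyapunov(A.T, -C.T @ C)      # observability Gramian
          Lc = cholesky(Wc, lower=True)
          U, s, _ = svd(Lc.T @ Wo @ Lc)                      # s = Hankel singular values squared
          hsv = np.sqrt(s)
          T = Lc @ U / np.sqrt(hsv)                          # balancing transformation
          Tinv = np.linalg.inv(T)
          r = slice(0, order)
          return Tinv[r, :] @ A @ T[:, r], Tinv[r, :] @ B, C @ T[:, r], hsv

      # Stand-in 3-DOF shear structure in state-space form (assumed M, K and damping).
      M = np.diag([1.0, 1.0, 1.0])
      K = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]]) * 100.0
      Cd = 0.02 * K                                          # stiffness-proportional damping
      A = np.block([[np.zeros((3, 3)), np.eye(3)],
                    [-np.linalg.solve(M, K), -np.linalg.solve(M, Cd)]])
      B = np.vstack([np.zeros((3, 1)), np.linalg.solve(M, np.array([[0.0], [0.0], [1.0]]))])
      C = np.hstack([np.eye(3), np.zeros((3, 3))])           # outputs: floor displacements

      Ar, Br, Cr, hsv = balanced_truncation(A, B, C, order=2)
      print("Hankel singular values:", np.round(hsv, 4))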